918 results for least squares method
Abstract:
The need for reliable predictions of the solar activity cycle motivates the development of dynamo models incorporating a representation of surface processes sufficiently detailed to allow assimilation of magnetographic data. In this series of papers we present one such dynamo model, and document its behavior and properties. This first paper focuses on one of the model's key components, namely surface magnetic flux evolution. Using a genetic algorithm, we obtain best-fit parameters of the transport model by least-squares minimization of the differences between the associated synthetic synoptic magnetogram and real magnetographic data for activity cycle 21. Our fitting procedure also returns Monte Carlo-like error estimates. We show that the range of acceptable surface meridional flow profiles is in good agreement with Doppler measurements, even though the latter are not used in the fitting process. Using a synthetic database of bipolar magnetic region (BMR) emergences reproducing the statistical properties of observed emergences, we also ascertain the sensitivity of global cycle properties, such as the strength of the dipole moment and timing of polarity reversal, to distinct realizations of BMR emergence, and on this basis argue that this stochasticity represents a primary source of uncertainty for predicting solar cycle characteristics.
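As a rough illustration of the fitting strategy described above, the sketch below minimizes a least-squares objective with SciPy's differential evolution optimizer, a genetic-algorithm-like global minimizer. The toy forward model, parameter names and bounds are assumptions for illustration only, not the authors' surface flux transport model.

```python
# Minimal sketch (not the authors' code): a genetic-style global optimizer
# minimizing the sum of squared differences between a synthetic signal and
# observations. The toy forward model and parameters are illustrative only.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(42)
t = np.linspace(0.0, 11.0, 200)                      # "time" axis (arbitrary units)

def forward_model(params, t):
    """Stand-in for the transport model: amplitude, decay time, period."""
    amp, tau, period = params
    return amp * np.exp(-t / tau) * np.cos(2.0 * np.pi * t / period)

# Synthetic "observations" generated from known parameters plus noise
true_params = (1.0, 6.0, 3.5)
obs = forward_model(true_params, t) + 0.05 * rng.standard_normal(t.size)

def chi2(params):
    """Least-squares objective: sum of squared residuals."""
    return np.sum((forward_model(params, t) - obs) ** 2)

# Differential evolution acts as the genetic-algorithm-like global minimizer
result = differential_evolution(chi2,
                                bounds=[(0.1, 5.0), (1.0, 20.0), (1.0, 10.0)],
                                seed=1, tol=1e-8)
print("best-fit parameters:", result.x)
```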
Abstract:
Adolescent idiopathic scoliosis (AIS) is a deformity of the spine manifested by asymmetry and deformities of the external surface of the trunk. Classification of scoliosis deformities according to curve type is used to plan the management of scoliosis patients. Currently, the scoliosis curve type is determined based on X-ray examination. However, cumulative exposure to X-ray radiation significantly increases the risk of certain cancers. In this paper, we propose a robust system that can classify the scoliosis curve type from a non-invasive acquisition of the 3D trunk surface of the patients. The 3D image of the trunk is divided into patches, and local geometric descriptors characterizing the surface of the back are computed from each patch to form the features. We reduce the dimensionality using Principal Component Analysis, retaining 53 components. In this work, a multi-class classifier is built with a least-squares support vector machine (LS-SVM), which is a kernel classifier. For this study, a new kernel was designed in order to achieve a classifier more robust than those based on the polynomial and Gaussian kernels. The proposed system was validated using data from 103 patients with different scoliosis curve types diagnosed and classified by an orthopedic surgeon from the X-ray images. The average rate of successful classification was 93.3%, with a better rate of prediction for the major thoracic and lumbar/thoracolumbar types.
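The following sketch illustrates, on synthetic data, the kind of pipeline described above: PCA for dimensionality reduction followed by a binary least-squares SVM trained by solving a single linear system. The RBF kernel, the number of retained components and the regularization value are illustrative assumptions; the paper uses a purpose-built kernel and a multi-class scheme.

```python
# Illustrative sketch (not the authors' implementation): PCA followed by a
# minimal binary LS-SVM with an RBF kernel, solved as one linear system.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=120, n_features=30, random_state=0)
y = 2 * y - 1                                   # LS-SVM uses labels in {-1, +1}

X_red = PCA(n_components=10).fit_transform(X)   # the paper retains 53 components

def rbf_kernel(A, B, sigma=2.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

gamma = 10.0                                    # regularization parameter (assumed)
K = rbf_kernel(X_red, X_red)
n = K.shape[0]

# LS-SVM training: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
rhs = np.concatenate(([0.0], y.astype(float)))
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

# Decision function evaluated on the training set (for illustration only)
pred = np.sign(K @ alpha + b)
print("training accuracy:", (pred == y).mean())
```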
Abstract:
One of the major concerns of scoliosis patients undergoing surgical treatment is the aesthetic aspect of the surgery outcome. It would be useful to predict the postoperative appearance of the patient trunk in the course of a surgery planning process in order to take into account the expectations of the patient. In this paper, we propose to use least squares support vector regression for the prediction of the postoperative trunk 3D shape after spine surgery for adolescent idiopathic scoliosis. Five dimensionality reduction techniques used in conjunction with the support vector machine are compared. The methods are evaluated in terms of their accuracy, based on the leave-one-out cross-validation performed on a database of 141 cases. The results indicate that the 3D shape predictions using a dimensionality reduction obtained by simultaneous decomposition of the predictors and response variables have the best accuracy.
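A minimal sketch of such a prediction pipeline is given below on synthetic data: a PLS decomposition (which reduces the predictors using components that also explain the responses) followed by kernel ridge regression, a close relative of least-squares support vector regression, evaluated by leave-one-out cross-validation. All shapes and hyperparameters are assumptions for illustration.

```python
# Sketch only: PLS dimensionality reduction + kernel ridge regression (a stand-in
# for LS-SVR) with leave-one-out cross-validation on synthetic multi-output data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.standard_normal((141, 20))                      # preoperative descriptors (toy)
Y = X @ rng.standard_normal((20, 6)) + 0.1 * rng.standard_normal((141, 6))  # "3D shape" targets (toy)

sq_errors = []
for i in range(len(X)):                                 # leave-one-out loop
    train = np.delete(np.arange(len(X)), i)
    # PLS components are built from predictors and responses simultaneously
    pls = PLSRegression(n_components=5).fit(X[train], Y[train])
    reg = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)
    reg.fit(pls.transform(X[train]), Y[train])
    pred = reg.predict(pls.transform(X[[i]]))
    sq_errors.append(np.mean((pred - Y[[i]]) ** 2))

print("leave-one-out RMSE:", np.sqrt(np.mean(sq_errors)))
```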
Abstract:
Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onwards, and primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics. The history of econometrics can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also that of Haavelmo from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of laws of error through the development of least squares by Legendre, Laplace, and Gauss, the discipline of econometrics later witnessed the applied works of Edgeworth and Mitchell. A very significant milestone in its evolution has been the work of Tinbergen, Frisch, and Haavelmo in their development of multiple regression and correlation analysis. They used these techniques to test different economic theories using time series data. In spite of the fact that some predictions based on econometric methodology might have gone wrong, the sound scientific nature of the discipline cannot be ignored. This is reflected in the economic rationale underlying any econometric model and in the statistical and mathematical reasoning behind the various inferences drawn. The relevance of econometrics as an academic discipline assumes high significance in this context. Because of the interdisciplinary nature of econometrics (a unification of Economics, Statistics and Mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that most often only Economics students are offered the subject, as those of other disciplines might not have an adequate Economics background to understand it. In fact, econometrics is also quite relevant for technical courses (such as Engineering), business management courses (such as the MBA), and professional accountancy courses, and even more so for research students in the various social sciences, commerce and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to the academic discipline of econometrics in higher education, across the various social science streams, commerce, management, professional accountancy, etc. In this way, the analytical ability of students can be sharpened and their ability to look into socio-economic problems with a mathematical approach can be improved, enabling them to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the postgraduate and research levels, must also be pointed out: mere learning of econometric methodology or the underlying theories alone would not have much practical utility for students in their future careers, whether in academia, industry, or in practice. This paper seeks to trace the historical development of econometrics and to study the current status of econometrics as an academic discipline in higher education. In addition, the paper looks into the problems faced by teachers in teaching econometrics and by students in learning the subject, including the effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.
Abstract:
When variables in a time series context are non-negative, such as volatility, survival time or wave heights, a multiplicative autoregressive model of the type X_t = X_{t-1}^α V_t, 0 ≤ α < 1, t = 1, 2, . . ., may give the preferred dependence structure. In this paper, we study the properties of such models and propose methods for parameter estimation. Explicit solutions of the model are obtained in the case of a gamma marginal distribution.
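A hedged sketch of how such a model can be simulated and fitted is shown below: taking logarithms turns the recursion into log X_t = α log X_{t-1} + log V_t, so α can be estimated by ordinary least squares. The gamma innovation parameters are illustrative assumptions, not the paper's estimators.

```python
# Illustrative sketch: simulate X_t = X_{t-1}^alpha * V_t with gamma innovations
# and estimate alpha by OLS on the log-linearised recursion.
import numpy as np

rng = np.random.default_rng(0)
alpha_true = 0.6
n = 2000

# V_t taken as i.i.d. gamma innovations (assumed choice for illustration)
V = rng.gamma(shape=2.0, scale=0.5, size=n)

X = np.empty(n)
X[0] = 1.0
for t in range(1, n):
    X[t] = X[t - 1] ** alpha_true * V[t]

# OLS on log X_t = const + alpha * log X_{t-1} + noise
y = np.log(X[1:])
x = np.log(X[:-1])
A = np.column_stack([np.ones(n - 1), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("estimated alpha:", coef[1])
```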
Abstract:
The classical methods of analysing time series by the Box-Jenkins approach assume that the observed series fluctuates around changing levels with constant variance. That is, the time series is assumed to be of a homoscedastic nature. However, financial time series exhibit the presence of heteroscedasticity in the sense that they possess a non-constant conditional variance given the past observations. So the analysis of financial time series requires the modelling of such variances, which may depend on some time-dependent factors or on their own past values. This led to the introduction of several classes of models to study the behaviour of financial time series; see Taylor (1986), Tsay (2005), Rachev et al. (2007). The class of models used to describe the evolution of conditional variances is referred to as stochastic volatility models. The stochastic models available to analyse the conditional variances are based on either normal or log-normal distributions. One of the objectives of the present study is to explore the possibility of employing some non-Gaussian distributions to model the volatility sequences and then study the behaviour of the resulting return series. This led us to work on the related problem of statistical inference, which is the main contribution of the thesis.
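For concreteness, the sketch below simulates a standard log-normal stochastic volatility model, in which the log-variance follows an AR(1) process; the thesis investigates non-Gaussian alternatives to this baseline. All parameter values are assumptions for illustration.

```python
# Minimal sketch of a log-normal stochastic volatility model: the log-variance
# h_t is an AR(1) process and the return r_t has conditional variance exp(h_t).
import numpy as np

rng = np.random.default_rng(1)
n = 1000
mu, phi, sigma_eta = -1.0, 0.95, 0.2        # level, persistence, vol-of-vol (assumed)

h = np.empty(n)
r = np.empty(n)
h[0] = mu
for t in range(n):
    if t > 0:
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    r[t] = np.exp(h[t] / 2.0) * rng.standard_normal()   # heteroscedastic return

# Excess kurtosis above 3 reflects the heavy tails induced by stochastic volatility
print("sample kurtosis of returns:",
      np.mean((r - r.mean()) ** 4) / np.var(r) ** 2)
```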
Abstract:
Collective action has been used as a strategy to improve the benefits of smallholder producers of kola nuts in Cameroon. Despite demonstrated benefits, not all producers are involved in collective action. The present study used a modified Technology Acceptance Model (TAM), namely the Collective Action Behaviour model (CAB model), to analyse kola producers' motivation for collective action activities. Five hypotheses are formulated and tested using data obtained from 185 farmers who are involved in kola production and marketing in the Western highlands of Cameroon. Results generated using the Partial Least Squares (PLS) approach to Structural Equation Modelling (SEM) showed that farmers' intrinsic motivators and ease of use influenced their behavioural intent to join group marketing activities. Perceived usefulness, which was mainly related to the economic benefits of group activities, did not influence farmers' behavioural intent. It is therefore concluded that extension messages and promotional activities targeting collective action need to emphasise the perceived ease of use of involvement and the social benefits associated with group activities in order to increase farmers' participation.
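As a highly simplified stand-in for the PLS-SEM analysis (which estimates several latent constructs and paths simultaneously), the sketch below relates a synthetic block of motivation/ease-of-use indicator items to a block of behavioural-intention items with two-block partial least squares. The loadings and sample data are invented for illustration.

```python
# Highly simplified sketch: two-block PLS relating indicator blocks, not the
# study's full PLS path model. All data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
n = 185                                        # sample size matching the survey
latent = rng.standard_normal(n)

# Indicator blocks loading on a common latent factor (toy measurement model)
X = latent[:, None] * [0.8, 0.7, 0.6, 0.5] + 0.4 * rng.standard_normal((n, 4))
Y = latent[:, None] * [0.9, 0.8] + 0.4 * rng.standard_normal((n, 2))

pls = PLSRegression(n_components=1).fit(X, Y)
print("X-block loadings:", pls.x_loadings_.ravel())
print("variance in Y explained (R^2):", pls.score(X, Y))
```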
Abstract:
This study uses data from a sample survey of 200 households drawn from a mountainous commune in Vietnam’s North Central Coast region to measure and explain relative poverty. Principal components analysis is used to construct a multidimensional index of poverty outcomes from variables measuring household income and the value of domestic assets. This index of poverty is then regressed on likely causes of poverty including different forms of resource endowment and social exclusion defined by gender and ethnicity. The ordinary least squares estimates indicate that poverty is indeed influenced by ethnicity, partly through its interaction with social capital. However, poverty is most strongly affected by differences in human and social capital. Differences in the amount of livestock and high quality farmland owned also matter. Thai households are poorer than their Kinh counterparts even when endowed with the same levels of human, social, physical and natural capital considered in the study. This empirical result provides a rationale for further research on the causal relationship between ethnicity and poverty outcomes.
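The sketch below illustrates the two-step approach on synthetic data: the first principal component of income and asset variables serves as the well-being (poverty) index, which is then regressed by OLS on household endowment variables. The variable names are assumptions, not the survey's actual variables.

```python
# Illustrative sketch of the two-step approach: PCA index of poverty outcomes,
# then OLS on likely causes of poverty. Synthetic data and assumed variable names.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 200                                           # households in the sample
df = pd.DataFrame({
    "income": rng.lognormal(3.0, 0.5, n),
    "assets": rng.lognormal(2.0, 0.7, n),
    "education_years": rng.integers(0, 13, n),
    "livestock": rng.poisson(2.0, n),
    "ethnic_minority": rng.integers(0, 2, n),
})

# Step 1: first principal component of income and asset values as the index
outcomes = StandardScaler().fit_transform(df[["income", "assets"]])
df["wellbeing_index"] = PCA(n_components=1).fit_transform(outcomes).ravel()

# Step 2: OLS regression of the index on endowment and exclusion variables
X = sm.add_constant(df[["education_years", "livestock", "ethnic_minority"]])
ols = sm.OLS(df["wellbeing_index"], X).fit()
print(ols.summary())
```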
Abstract:
In the accounting literature, interaction or moderating effects are usually assessed by means of OLS regression, and summated rating scales are constructed to reduce measurement error bias. Structural equation models and two-stage least squares regression could be used to eliminate this bias completely, but large samples are needed. Partial least squares is appropriate for small samples but does not correct measurement error bias. In this article, disattenuated regression is discussed as a small-sample alternative and is illustrated on the data of Bisbe and Otley (in press), who examine the interaction effect of innovation and the style of use of budgets on performance. Sizeable differences emerge between OLS and disattenuated regression.
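The core idea of disattenuated regression can be sketched as follows: the observed correlation (and slope) between two error-prone summated scales is corrected for measurement error using their reliabilities, e.g. Cronbach's alpha. The reliabilities and data below are toy assumptions, not the article's figures.

```python
# Small sketch of correction for attenuation on a deliberately small sample.
import numpy as np

rng = np.random.default_rng(5)
n = 40

true_x = rng.standard_normal(n)
true_y = 0.6 * true_x + 0.8 * rng.standard_normal(n)
x_obs = true_x + 0.7 * rng.standard_normal(n)   # scale scores with measurement error
y_obs = true_y + 0.7 * rng.standard_normal(n)

rel_x, rel_y = 0.75, 0.75                       # assumed scale reliabilities

r_obs = np.corrcoef(x_obs, y_obs)[0, 1]
r_corrected = r_obs / np.sqrt(rel_x * rel_y)    # disattenuated correlation

# Corresponding slopes of y on x: only the predictor's reliability matters
slope_obs = r_obs * y_obs.std() / x_obs.std()
slope_corrected = slope_obs / rel_x
print("observed r:", r_obs, "corrected r:", r_corrected)
print("observed slope:", slope_obs, "corrected slope:", slope_corrected)
```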
Abstract:
LiDAR (Light Detection and Ranging) technology, based on scanning the territory with an airborne laser rangefinder, allows the construction of Digital Surface Models (DSM) through simple interpolation, as well as Digital Terrain Models (DTM) through the identification and removal of objects present on the terrain (buildings, bridges or trees). The Geomatics Laboratory of the Politecnico di Milano (Como Campus) developed a LiDAR data filtering algorithm based on interpolation with bilinear and bicubic splines with Tychonov regularization in a least-squares approach. However, in many cases more refined and complex models are still needed, in which distinguishing between buildings and vegetation becomes mandatory. This may be the case for some hydrological risk prevention models, where vegetation is not needed, or for the three-dimensional modelling of urban centres, where vegetation is a problematic factor. (...)
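The sketch below shows the generic least-squares-with-Tychonov-regularization idea underlying such filtering, reduced to one dimension for brevity: grid values c are obtained from scattered observations z by solving (AᵀA + λDᵀD)c = Aᵀz, where A interpolates the grid at the observation points and D penalizes roughness. It is not the laboratory's algorithm, which uses bilinear and bicubic splines on surfaces.

```python
# Generic sketch of Tychonov-regularized least-squares interpolation in 1D.
import numpy as np

rng = np.random.default_rng(2)
x_obs = np.sort(rng.uniform(0.0, 10.0, 200))          # scattered "LiDAR" abscissae
z_obs = np.sin(x_obs) + 0.1 * rng.standard_normal(200)

grid = np.linspace(0.0, 10.0, 40)                      # grid nodes of the model
A = np.zeros((x_obs.size, grid.size))
idx = np.clip(np.searchsorted(grid, x_obs) - 1, 0, grid.size - 2)
w = (x_obs - grid[idx]) / (grid[1] - grid[0])          # linear (degree-1 spline) weights
A[np.arange(x_obs.size), idx] = 1.0 - w
A[np.arange(x_obs.size), idx + 1] = w

# Second-difference operator as the regularizing (smoothness) term
D = np.diff(np.eye(grid.size), n=2, axis=0)
lam = 1.0
c = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ z_obs)
print("first estimated grid values:", c[:5])
```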
Abstract:
The objective of this study was to describe the trend in the results of the technical quality and risk management indicators (hospital-acquired infections, in-hospital mortality and hospital readmission) from 2006 to 2010, and to establish whether there were differences between health institutions with and without teaching-care agreements, both public and private, in 11 Colombian cities. The study found that, after the enactment of Law 30 of 1992, the number of medical programmes increased considerably in the last decade. This situation led to two considerations: first, that the number of practice institutions in the country may be insufficient given the large number of new medical students, with some degree of overcrowding at practice sites; and second, that this situation may have some effect on the quality of care and on the results of the measured indicators. Important technical deficiencies were evident in the mandatory reporting of information by hospitals: only 10% of them complied with complete reporting of the indicators since 2006, and only 65% of the total information that should have been published was actually recorded. Regarding the statistical analysis of the data, the chi-squared test for trend was used; it did not show statistically significant differences in the indicators over the analysed periods between institutions with and without teaching agreements, but it did show differences between the least-squares sums of two of the indicators.
Abstract:
The crisis that broke out in the United States mortgage market in 2008 and spread throughout the entire financial system exposed the degree of interconnection that currently exists among the sector's institutions and their relationships with the productive sector, making clear the need to identify and characterise the systemic risk inherent in the system, so that regulators can pursue stability both for individual institutions and for the system as a whole. This paper shows, through a model that combines the informative power of networks with a spatial autoregressive (panel-type) model, the importance of adding to the micro-prudential approach (proposed in Basel II) a variable that captures the effect of being connected to other institutions, thereby carrying out a macro-prudential analysis (proposed in Basel III).
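A simplified, cross-sectional sketch of the spatial autoregressive specification (the paper uses a panel version) is given below: y = ρWy + Xβ + ε, estimated by two-stage least squares with instruments [X, WX, W²X] in the spirit of Kelejian and Prucha, with W standing in for the network matrix. All data are synthetic.

```python
# Cross-sectional SAR sketch on synthetic data: y = rho*W*y + X*beta + e,
# estimated by 2SLS with instruments [X, WX, W^2 X].
import numpy as np

rng = np.random.default_rng(8)
n = 100

# Row-normalized "network" weight matrix linking the n institutions
W = rng.random((n, n)) * (rng.random((n, n)) < 0.05)
np.fill_diagonal(W, 0.0)
W = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)

X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
beta_true, rho_true = np.array([1.0, 0.5, -0.3]), 0.4
y = np.linalg.solve(np.eye(n) - rho_true * W, X @ beta_true + 0.3 * rng.standard_normal(n))

# 2SLS: regressors Z = [X, Wy], instruments H = [X, WX, W^2 X]
Z = np.column_stack([X, W @ y])
H = np.column_stack([X, W @ X[:, 1:], W @ W @ X[:, 1:]])
P = H @ np.linalg.solve(H.T @ H, H.T)          # projection onto instrument space
theta = np.linalg.solve(Z.T @ P @ Z, Z.T @ P @ y)
print("beta_hat:", theta[:3], "rho_hat:", theta[3])
```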
Abstract:
Even though antenatal care is universally regarded as important, determinants of demand for antenatal care have not been widely studied. Evidence concerning which socioeconomic conditions influence whether or not a pregnant woman attends at least one antenatal consultation, and how these factors affect absences from antenatal consultations, is very limited. In order to generate this evidence, a two-stage analysis was performed with data from the Demographic and Health Survey carried out by Profamilia in Colombia during 2005. The first stage was run as a logit model showing the marginal effects on the probability of attending the first visit, and an ordinary least squares model was estimated for the second stage. It was found that mothers living in the Pacific region, as well as young mothers, seem to have a lower probability of attending the first visit, but these factors are not related to the number of absences from antenatal consultations once the first visit has been achieved. The effect of health insurance was surprising because of the differing effects that the health insurers showed. Some family and personal conditions, such as willingness to have the last child and the number of previous children, proved to be important in the determination of demand. The effect of the mother's educational attainment proved to be important, whereas the father's educational achievement did not. This paper provides some elements for policy making aimed at increasing the inducement of demand for antenatal care, as well as stimulating research on the demand for specific health issues.
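A schematic version of this two-stage setup on synthetic data is sketched below: a logit model for the probability of attending the first visit (with average marginal effects), followed by an OLS model for the number of absences among attenders. The variable names and coefficients are illustrative assumptions.

```python
# Schematic two-stage sketch on synthetic data: logit for attendance, then OLS
# for the number of missed consultations among attenders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 1000
df = pd.DataFrame({
    "mother_age": rng.integers(15, 45, n),
    "education_years": rng.integers(0, 16, n),
    "insured": rng.integers(0, 2, n),
})
util = -2.0 + 0.05 * df["mother_age"] + 0.1 * df["education_years"] + 0.5 * df["insured"]
df["attended_first_visit"] = (rng.random(n) < 1 / (1 + np.exp(-util))).astype(int)
df["absences"] = np.where(df["attended_first_visit"] == 1,
                          rng.poisson(np.exp(1.0 - 0.05 * df["education_years"])),
                          np.nan)

X = sm.add_constant(df[["mother_age", "education_years", "insured"]])

# Stage 1: logit for attendance, reported as average marginal effects
logit_fit = sm.Logit(df["attended_first_visit"], X).fit(disp=0)
print(logit_fit.get_margeff().summary())

# Stage 2: OLS for the number of absences, conditional on having attended
attenders = df["attended_first_visit"] == 1
ols_fit = sm.OLS(df.loc[attenders, "absences"], X[attenders]).fit()
print(ols_fit.summary())
```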
Abstract:
Introduction: Depression frequently occurs in patients with schizophrenia, in its different phases, and, given its similarity to various symptoms of schizophrenia itself, its identification is difficult. The aim of the present study is to determine the effectiveness and safety of quetiapine in reducing depressive symptoms compared with other second-generation antipsychotics in patients with schizophrenia. Methods: a systematic review of the literature was carried out in 6 databases following the PRISMA methodology, including studies of the effectiveness and safety of quetiapine (Seroquel®) in patients with schizophrenia and depression. Outcomes were measured using depression scales, suicide rates and adverse effects. Results: Two studies were included. The study by Di Fiorino in 2014 reported the effectiveness of quetiapine in the treatment of psychotic patients with depression, with a difference in least-squares mean reductions on the CDSS scale of 2.2 (95% CI 0.8-3.7) versus risperidone, and of 3.3 (p<0.0001) on the HAM-D scale versus risperidone. The adverse effects reported with quetiapine were somnolence, dry mouth and hypotension. No deaths attributable to quetiapine were reported. Discussion: Quetiapine proved to be effective and safe in the management of depressive symptoms in patients with schizophrenia compared with risperidone when the population was assessed with the CDSS scale. Suicide outcomes were not reported in either study. The quality of the evidence is moderate, with important methodological biases. Further randomised, blinded clinical trials and meta-analyses in homogeneous populations are needed.
Abstract:
We propose and estimate a financial distress model that explicitly accounts for the interactions or spill-over effects between financial institutions, through the use of a spatial contiguity matrix that is built from financial network data on interbank transactions. Such a setup of the financial distress model allows for the empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of such a specification is that it simultaneously incorporates micro-prudential factors (Basel 2) as well as macro-prudential and systemic factors (Basel 3) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter that measures the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
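As a toy illustration of the network ingredient of such a model, the sketch below turns a small table of interbank exposures into a row-normalized weights matrix W and computes each institution's exposure-weighted "network lag" of a distress indicator, i.e. the spill-over regressor. Institution names and amounts are made up.

```python
# Toy sketch: building a row-normalized network weights matrix from interbank
# exposures and computing the network lag of a distress indicator.
import numpy as np
import pandas as pd

exposures = pd.DataFrame({
    "lender":   ["A", "A", "B", "C", "C", "D"],
    "borrower": ["B", "C", "C", "A", "D", "A"],
    "amount":   [10.0, 5.0, 8.0, 3.0, 6.0, 4.0],
})

banks = sorted(set(exposures["lender"]) | set(exposures["borrower"]))
idx = {b: i for i, b in enumerate(banks)}

W = np.zeros((len(banks), len(banks)))
for _, row in exposures.iterrows():
    W[idx[row["lender"]], idx[row["borrower"]]] = row["amount"]

W = W / W.sum(axis=1, keepdims=True)            # row-normalize exposures

distress = np.array([0.2, 0.8, 0.1, 0.5])       # toy distress scores per bank
network_lag = W @ distress                      # exposure-weighted neighbours' distress
print(dict(zip(banks, network_lag)))
```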