981 results for Log cabins.


Relevance: 20.00%

Abstract:

One working day: Reflection on practice: facilitating an occupational therapy student.

Relevance: 20.00%

Abstract:

The aim is to analyse the log-linear model and its possible applications in educational research, and at the same time to study one teaching resource, the textbook, examining its relationship with the teacher's instructional role. The sample comprised 308 EGB teachers from the two Canary Island provinces, obtained by convenience sampling, in an ex post facto design. A pilot version of a questionnaire was administered to a reduced sample and reviewed by a panel of experts; after the appropriate revisions, the final version was administered with the help of several fieldworkers who distributed and collected it. The main variables were as follows. Referring to the teacher: years of experience, degree of dependence on the textbook, and teaching cycle. Referring to teaching resources: frequency of use, reasons for use, instructional purpose, and the dimensions most valued for teaching. The instrument was the questionnaire 'Uso de medios en la enseñanza' ('Use of media in teaching'). Among teachers considered as a whole, there is no majority tendency towards either dependence on or independence from the textbook. Veteran teachers tend to be dependent on the textbook; the remaining teachers lean towards neither dependence nor independence. The relationship between the textbook and the official syllabus is valued only by middle-cycle teachers. Within the curricular dimension centred on methodology, the aspect given most weight is the set of activities proposed by the textbook, followed by the methodological approach reflected in the teacher's guide. The most highly valued dimension overall is the suitability of the textbook to the pupils' level of knowledge, followed in importance by the language used and, finally, by the formal features of the textbook (colour, size, illustrations, etc.). The textbook emerges as a resource intended essentially for use by pupils; for the teacher, its use is limited to supporting explanations, whereas motivating and assessing learning appear to be functions incompatible with textbook use. Log-linear analysis is a powerful tool for analysing nominal variables, with a degree of statistical sophistication until now available only for continuous variables; the abundance of nominal variables in educational research makes it especially well suited to this field. The advantages of log-linear analysis depend on the nature of the variables, the minimum number of categories when continuous data are included, the cut-off points, the sampling strategy, and so on. Clear criteria are still lacking regarding sample size and the interpretation of the strength of the parameters, and no system of graphical representation has yet been developed for this technique.
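
The kind of log-linear analysis described above can be sketched briefly. The following Python fragment, with hypothetical counts and variable names rather than the study's questionnaire data, fits a log-linear model to a two-way table of nominal variables by treating the cell counts as Poisson and comparing the independence model with the saturated one in statsmodels.

    # Minimal sketch of a log-linear model for a two-way table of nominal
    # variables; the counts and variable names below are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical cross-tabulation: teacher experience vs. textbook dependence
    counts = pd.DataFrame({
        "experience": ["veteran", "veteran", "novice", "novice"],
        "dependence": ["dependent", "independent", "dependent", "independent"],
        "n": [60, 25, 40, 45],
    })

    # Independence model: log(mu) = experience + dependence
    indep = smf.glm("n ~ experience + dependence", data=counts,
                    family=sm.families.Poisson()).fit()

    # Saturated model adds the interaction between the two nominal variables
    sat = smf.glm("n ~ experience * dependence", data=counts,
                  family=sm.families.Poisson()).fit()

    # The drop in deviance from the independence model to the saturated model
    # is the usual G-squared test of association between the two variables.
    print(indep.deviance - sat.deviance, indep.df_resid - sat.df_resid)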

Relevance: 20.00%

Abstract:

Abstract based on that of the publication.

Relevance: 20.00%

Abstract:

There is increasing concern about soil enrichment with K⁺, and its subsequent potential losses, following long-term application of poor-quality water to agricultural land. Different models are increasingly being used for predicting or analysing water flow and chemical transport in soils and groundwater. The convective-dispersive equation (CDE) and the convective log-normal transfer function (CLT) models were fitted to the potassium (K⁺) leaching data, and produced equivalent goodness of fit. Simulated breakthrough curves for a range of CaCl₂ concentrations, based on the parameters estimated at 15 mmol l⁻¹ CaCl₂, showed an earlier peak position and a higher K⁺ concentration as the CaCl₂ concentration used in the leaching experiments decreased. In a second approach, the parameters estimated from the 15 mmol l⁻¹ CaCl₂ solution were used for all other CaCl₂ concentrations and the best value of the retardation factor (R) was optimised for each data set; this gave better predictions. With decreasing CaCl₂ concentration, the value of R required is larger than that measured (except for 10 mmol l⁻¹ CaCl₂) when the parameters estimated at 15 mmol l⁻¹ CaCl₂ are used. Both models suffer from the fact that they must be calibrated against a data set, and some of their parameters are not measurable and cannot be determined independently.
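
As a rough illustration of the CLT idea mentioned in the abstract, the sketch below fits a log-normal travel-time density to a synthetic breakthrough curve; the data and parameter values are made up, and this is not the authors' model code.

    # The CLT model treats solute travel time (in pore volumes) as log-normally
    # distributed; fitting its two parameters to a breakthrough curve is then a
    # simple curve fit. Synthetic data only.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import lognorm

    def clt_breakthrough(pore_volumes, mu, sigma):
        # Relative concentration of a pulse input, proportional to the
        # log-normal travel-time density with parameters mu and sigma.
        return lognorm.pdf(pore_volumes, s=sigma, scale=np.exp(mu))

    pv = np.linspace(0.1, 5.0, 30)                      # pore volumes drained
    c_obs = clt_breakthrough(pv, 0.3, 0.5)              # synthetic observations
    c_obs += np.random.default_rng(1).normal(0.0, 0.01, pv.size)

    (mu_hat, sigma_hat), _ = curve_fit(clt_breakthrough, pv, c_obs,
                                       p0=[0.0, 1.0],
                                       bounds=([-5.0, 1e-3], [5.0, 5.0]))
    print(mu_hat, sigma_hat)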

Relevance: 20.00%

Abstract:

Analyses of high-density single-nucleotide polymorphism (SNP) data, such as genetic mapping and linkage disequilibrium (LD) studies, require phase-known haplotypes to allow for the correlation between tightly linked loci. However, current SNP genotyping technology cannot determine phase, which must be inferred statistically. In this paper, we present a new Bayesian Markov chain Monte Carlo (MCMC) algorithm for population haplotype frequency estimation, particularly in the context of LD assessment. The novel feature of the method is the incorporation of a log-linear prior model for population haplotype frequencies. We present simulations to suggest that (1) the log-linear prior model is more appropriate than the standard coalescent process in the presence of recombination (>0.02 cM between adjacent loci), and (2) there is substantial inflation in measures of LD obtained by a "two-stage" approach to the analysis that treats the "best" haplotype configuration as correct, without regard to uncertainty in the recombination process. Genet Epidemiol 25:106-114, 2003. (C) 2003 Wiley-Liss, Inc.
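
The reason phase-known haplotype frequencies matter for LD assessment can be shown in a few lines. The sketch below, with made-up frequencies for two biallelic SNPs and nothing to do with the paper's MCMC algorithm, computes the usual pairwise LD measures once haplotype frequencies are available.

    # Pairwise LD from estimated haplotype frequencies (hypothetical values).
    # Frequencies of the four haplotypes AB, Ab, aB, ab at two biallelic SNPs.
    p_AB, p_Ab, p_aB, p_ab = 0.45, 0.15, 0.10, 0.30
    p_A, p_B = p_AB + p_Ab, p_AB + p_aB                  # allele frequencies

    D = p_AB - p_A * p_B                                 # basic LD coefficient
    if D > 0:
        D_max = min(p_A * (1 - p_B), (1 - p_A) * p_B)
    else:
        D_max = min(p_A * p_B, (1 - p_A) * (1 - p_B))
    D_prime = abs(D) / D_max                             # Lewontin's D'
    r2 = D**2 / (p_A * (1 - p_A) * p_B * (1 - p_B))      # squared correlation
    print(D, D_prime, r2)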

Relevance: 20.00%

Abstract:

For forecasting and economic analysis many variables are used in logarithms (logs). In time series analysis, this transformation is often considered to stabilize the variance of a series. We investigate under which conditions taking logs is beneficial for forecasting. Forecasts based on the original series are compared to forecasts based on logs. For a range of economic variables, substantial forecasting improvements from taking logs are found if the log transformation actually stabilizes the variance of the underlying series. Using logs can be damaging for the forecast precision if a stable variance is not achieved.
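
The levels-versus-logs comparison described above can be mimicked in a few lines. The sketch below uses a simulated positive series and a simple autoregression from statsmodels; it only illustrates the procedure and is not the paper's experiment.

    # Forecast the same series in levels and in logs, then compare accuracy.
    # The series is simulated; variable names are illustrative only.
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(0)
    y = np.exp(np.cumsum(rng.normal(0.01, 0.05, 200)))   # positive, trending series
    train, test = y[:180], y[180:]

    # Forecast in levels
    fit_lvl = AutoReg(train, lags=4).fit()
    fc_lvl = fit_lvl.predict(start=len(train), end=len(y) - 1)

    # Forecast in logs, back-transformed with exp (no bias correction here)
    fit_log = AutoReg(np.log(train), lags=4).fit()
    fc_log = np.exp(fit_log.predict(start=len(train), end=len(y) - 1))

    print("RMSE levels:", np.sqrt(np.mean((fc_lvl - test) ** 2)))
    print("RMSE logs:  ", np.sqrt(np.mean((fc_log - test) ** 2)))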

Relevance: 20.00%

Abstract:

The objective of this paper is to apply the mis-specification (M-S) encompassing perspective to the problem of choosing between linear and log-linear unit-root models. A simple M-S encompassing test, based on an auxiliary regression stemming from the conditional second moment, is proposed and its empirical size and power are investigated using Monte Carlo simulations. It is shown that by focusing on the conditional process the sampling distributions of the relevant statistics are well behaved under both the null and alternative hypotheses. The proposed M-S encompassing test is illustrated using US total disposable income quarterly data.
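
The auxiliary regression in the abstract is specific to the paper, but the intuition behind a conditional-second-moment check can be sketched loosely: under a random walk in logs the conditional variance of the first difference grows with the squared lagged level, whereas under a linear random walk it does not. The fragment below is only that loose analogue on simulated data, not the proposed M-S encompassing statistic.

    # Loose illustration only: regress squared first differences on squared
    # lagged levels and inspect the slope. Simulated log-linear unit root.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    y = np.exp(np.cumsum(rng.normal(0.0, 0.02, 300)))    # random walk in logs

    dy2 = np.diff(y) ** 2                                # squared first differences
    X = sm.add_constant(y[:-1] ** 2)                     # squared lagged levels
    aux = sm.OLS(dy2, X).fit()
    print(aux.tvalues[1])   # a large t-value points towards the log-linear model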

Relevance: 20.00%

Abstract:

The purpose of this paper is to develop a Bayesian approach for log-Birnbaum-Saunders Student-t regression models under right-censored survival data. Markov chain Monte Carlo (MCMC) methods are used to develop a Bayesian procedure for the considered model. In order to attenuate the influence of outlying observations on the parameter estimates, we present Birnbaum-Saunders models in which a Student-t distribution is assumed to explain the cumulative damage. Some discussion of model selection for comparing the fitted models is also given, and case-deletion influence diagnostics are developed for the joint posterior distribution based on the Kullback-Leibler divergence. The developed procedures are illustrated with a real data set. (C) 2010 Elsevier B.V. All rights reserved.
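
The Bayesian MCMC machinery described above is beyond a short example, but the censored Birnbaum-Saunders likelihood that such a model is built on can be sketched. The fragment below uses SciPy's fatiguelife distribution (the Birnbaum-Saunders law with its classical normal kernel rather than the Student-t version studied in the paper) and hypothetical times and censoring indicators, so it is only an indicative building block.

    # Right-censored Birnbaum-Saunders log-likelihood (hypothetical data).
    import numpy as np
    from scipy.stats import fatiguelife

    t = np.array([2.1, 3.5, 4.0, 7.2, 9.8])      # observed times
    event = np.array([1, 1, 0, 1, 0])             # 1 = failure, 0 = right-censored

    def loglik(alpha, beta):
        # alpha: shape, beta: scale of the Birnbaum-Saunders distribution
        dist = fatiguelife(alpha, scale=beta)
        # Failures contribute log f(t); censored observations contribute log S(t)
        return np.sum(np.where(event == 1, dist.logpdf(t), dist.logsf(t)))

    print(loglik(0.5, 4.0))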

Relevance: 20.00%

Abstract:

In interval-censored survival data, the event of interest is not observed exactly but is only known to occur within some time interval. Such data appear very frequently. In this paper, we are concerned only with parametric forms, and so a location-scale regression model based on the exponentiated Weibull distribution is proposed for modeling interval-censored data. We show that the proposed log-exponentiated Weibull regression model for interval-censored data represents a parametric family of models that includes other regression models broadly used in lifetime data analysis. Assuming interval-censored data, we employ a frequentist analysis, a jackknife estimator, a parametric bootstrap and a Bayesian analysis for the parameters of the proposed model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes and present some ways to assess global influence. Furthermore, for different parameter settings, sample sizes and censoring percentages, various simulations are performed; in addition, the empirical distributions of some modified residuals are displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a modified deviance residual in log-exponentiated Weibull regression models for interval-censored data. (C) 2009 Elsevier B.V. All rights reserved.
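
The interval-censored likelihood that such a model maximises is simple to write down. Below is a small sketch using SciPy's exponweib (exponentiated Weibull) distribution with hypothetical interval endpoints; it is not the authors' implementation and omits the regression structure.

    # Interval-censored log-likelihood under an exponentiated Weibull law.
    import numpy as np
    from scipy.stats import exponweib

    L = np.array([1.0, 2.5, 0.5, 4.0])            # lower endpoints of the intervals
    R = np.array([2.0, 4.0, 1.5, 6.0])            # upper endpoints

    def loglik(a, c, scale):
        dist = exponweib(a, c, scale=scale)
        # Each interval-censored observation contributes log{F(R) - F(L)}
        return np.sum(np.log(dist.cdf(R) - dist.cdf(L)))

    print(loglik(1.5, 1.2, 3.0))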

Relevance: 20.00%

Abstract:

In this paper, the generalized log-gamma regression model is modified to allow for the possibility that long-term survivors are present in the data. This modification leads to a generalized log-gamma regression model with a cure rate that encompasses, as special cases, the log-exponential, log-Weibull and log-normal regression models with a cure rate typically used to model such data. The models attempt to estimate simultaneously the effects of explanatory variables on the acceleration or deceleration of the timing of a given event and on the surviving fraction, that is, the proportion of the population for which the event never occurs. The normal curvatures of local influence are derived under some usual perturbation schemes, and two martingale-type residuals are proposed to assess departures from the generalized log-gamma error assumption as well as to detect outlying observations. Finally, a data set from the medical area is analyzed.
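
The cure-rate structure referred to above amounts to a two-component mixture in which a fraction pi of the population never experiences the event. The sketch below illustrates that structure with a log-normal survival function standing in for the generalized log-gamma, purely to keep the example short; all values are made up.

    # Population survival under a mixture cure model: S_pop(t) = pi + (1 - pi) * S(t)
    import numpy as np
    from scipy.stats import norm

    def population_survival(t, pi, mu, sigma):
        # S(t) for the uncured fraction, here from a log-normal AFT-style model
        s_uncured = norm.sf((np.log(t) - mu) / sigma)
        return pi + (1 - pi) * s_uncured

    t = np.array([1.0, 5.0, 20.0, 100.0])
    print(population_survival(t, pi=0.3, mu=1.5, sigma=0.8))
    # As t grows, S_pop(t) approaches the cured fraction pi rather than zero.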

Relevance: 20.00%

Abstract:

In survival analysis applications, the failure rate function may frequently present a unimodal shape, in which case the log-normal or log-logistic distributions are often used. In this paper, we shall be concerned only with parametric forms, so a location-scale regression model based on the Burr XII distribution is proposed for modeling data with a unimodal failure rate function as an alternative to the log-logistic regression model. Assuming censored data, we consider a classic analysis, a Bayesian analysis and a jackknife estimator for the parameters of the proposed model. For different parameter settings, sample sizes and censoring percentages, various simulation studies are performed to compare the performance of the log-logistic and log-Burr XII regression models. In addition, sensitivity analysis is used to detect influential or outlying observations, and residual analysis is used to check the assumptions of the model. Finally, we analyze a real data set under log-Burr XII regression models. (C) 2008 Published by Elsevier B.V.
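
The motivation for the Burr XII choice, a unimodal failure rate, is easy to check numerically. The sketch below computes the hazard of a Burr XII distribution with illustrative shape parameters using SciPy and locates its peak; it is unrelated to the paper's data or regression model.

    # Unimodal hazard of the Burr XII distribution: h(t) = f(t) / S(t).
    import numpy as np
    from scipy.stats import burr12

    c, d, scale = 2.0, 1.0, 1.0          # illustrative shape and scale parameters
    t = np.linspace(0.05, 5.0, 100)
    hazard = burr12.pdf(t, c, d, scale=scale) / burr12.sf(t, c, d, scale=scale)

    peak = t[np.argmax(hazard)]
    print(f"hazard peaks near t = {peak:.2f} and declines afterwards")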