933 results for "Bayesian hierarchical linear model"


Relevance:

100.00%

Publisher:

Abstract:

Background. Similar to parent support in the home environment, teacher support at school may positively influence children's fruit and vegetable (FV) consumption. This study assessed the relationship between teacher support for FV consumption and the FV intake of 4th and 5th grade students in low-income elementary schools in central Texas. Methods. A secondary analysis was performed on baseline data collected from 496 parent-child dyads during the Marathon Kids study carried out by the Michael & Susan Dell Center for Healthy Living at the University of Texas School of Public Health. A hierarchical linear regression analysis adjusting for key demographic variables, parent support, and home FV availability was conducted. In addition, separate linear regression models stratified by quartiles of home FV availability were conducted to assess the relationship between teacher support and FV intake by level of home FV availability. Results. Teacher support was not significantly related to students' FV intake (p = .44). However, the interaction of teacher support and home FV availability was positively associated with students' FV consumption (p < .05). For students in the lowest quartile of home FV availability, teacher support accounted for approximately 6% of the FV intake variance (p = .02). For higher levels of FV availability, teacher support and FV intake were not related. Conclusions. For lower income elementary school-aged children with low FV availability at home, greater teacher support may lead to modest increases in FV consumption.
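The interaction logic of the analysis above can be sketched as a moderated regression in which the incremental R2 of a teacher-by-home interaction term measures the variance it uniquely explains. Everything below (variable names, coefficients, sample size) is simulated and illustrative, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
teacher = rng.normal(0, 1, n)   # teacher support (standardized, hypothetical)
home = rng.normal(0, 1, n)      # home FV availability (standardized, hypothetical)
# Simulated intake: teacher support matters more when home availability is low
intake = 2.0 + 0.1 * home - 0.3 * teacher * home + rng.normal(0, 1, n)

def r2(X, y):
    """R^2 of an OLS fit (a column of ones is added for the intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Step 1: main effects only; step 2: add the teacher x home interaction
r2_main = r2(np.column_stack([teacher, home]), intake)
r2_full = r2(np.column_stack([teacher, home, teacher * home]), intake)
delta_r2 = r2_full - r2_main   # variance uniquely explained by the interaction
```

A positive `delta_r2` mirrors the finding that the interaction, not the main effect of teacher support, carries the association.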

Abstract:

The infant mortality rate (IMR) is considered to be one of the most important indices of a country's well-being. Countries around the world and health organizations such as the World Health Organization are dedicating their resources, knowledge and energy to reducing infant mortality rates. The well-known Millennium Development Goal 4 (MDG 4), whose aim is to achieve a two-thirds reduction of the under-five mortality rate between 1990 and 2015, is an example of this commitment. In this study our goal is to model the trends in IMR from the 1950s to the 2010s for selected countries. We would like to know how the IMR changes over time and how it differs across countries. IMR data collected over time form a time series, and the repeated observations are not statistically independent, so in modeling the trend of IMR it is necessary to account for these correlations. We propose to use the generalized least squares method in the general linear model setting to deal with the variance-covariance structure of the data. To estimate the variance-covariance matrix, we turn to time-series models, especially autoregressive and moving-average models. Furthermore, we compare results from the general linear model with a correlation structure to those from the ordinary least squares method, which ignores the correlation structure, to check how much the estimates change.
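The approach described above can be sketched numerically: fit OLS first, estimate an AR(1) coefficient from the residuals, then re-estimate with generalized least squares under the AR(1) working covariance. All numbers (trend, noise level, series length) are illustrative, not actual IMR data:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(60, dtype=float)                  # 60 years of observations
# Simulated declining IMR-like trend with AR(1) noise (illustrative only)
eps = np.zeros(60)
for i in range(1, 60):
    eps[i] = 0.7 * eps[i - 1] + rng.normal(0, 1.0)
y = 120.0 - 1.5 * t + eps

X = np.column_stack([np.ones_like(t), t])
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_ols
rho = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)  # lag-1 autocorr

# AR(1) working covariance: Sigma_ij = rho^|i-j|
Sigma = rho ** np.abs(np.subtract.outer(t, t))
Si = np.linalg.inv(Sigma)
beta_gls = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)  # GLS estimate
```

With positive autocorrelation the point estimates stay close to OLS, but the GLS machinery yields honest standard errors; the same idea extends to richer ARMA covariances.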

Abstract:

Cardiovascular disease (CVD) is a threat to public health and has been reported to be the leading cause of death in the United States. The invention of next generation sequencing (NGS) technology has revolutionized biomedical research, and investigating NGS data on CVD-related quantitative traits can help address the unknown etiology and disease mechanisms of CVD. NHLBI's Exome Sequencing Project (ESP) contains CVD-related phenotypes and their associated NGS exome sequence data. Initially, a subset of next generation sequencing data consisting of 13 CVD-related quantitative traits was investigated. Six traits, systolic blood pressure (SBP), diastolic blood pressure (DBP), height, platelet count, waist circumference, and weight, were analyzed with a functional linear model (FLM) and 7 existing methods. The FLM outperformed all existing methods, identifying the highest number of significant genes: 96, 139, 756, 1162, 1106, and 298 genes associated with SBP, DBP, height, platelet count, waist circumference, and weight, respectively.

Abstract:

Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The methods currently used for PCR data analysis, including the threshold cycle (CT) method and linear and non-linear model fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each pair of consecutive PCR cycles, we subtracted the fluorescence of the earlier cycle from that of the later cycle, transforming the n-cycle raw data into n-1 cycle data. Linear regression was then applied to the natural logarithm of the transformed data, and amplification efficiencies and initial DNA molecule numbers were calculated for each PCR run. To evaluate this new method, we compared it in terms of accuracy and precision with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum fluorescence. Three criteria, threshold identification, max R2, and max slope, were employed to search for target data points. Considering that PCR data are time series data, we also applied linear mixed models. Collectively, when the threshold identification criterion was applied and when the linear mixed model was adopted, the taking-difference linear regression method was superior, giving an accurate estimate of the initial DNA amount and a reasonable estimate of PCR amplification efficiencies. When the max R2 and max slope criteria were used, the original linear regression method gave an accurate estimate of the initial DNA amount. Overall, the taking-difference linear regression method avoids the error of subtracting an unknown background and is thus theoretically more accurate and reliable. This method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
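The taking-difference idea can be sketched in a few lines. On noise-free simulated fluorescence with an unknown constant background, differencing consecutive cycles cancels the background exactly (D_k = F0 * E^k * (E - 1)), and a log-linear regression recovers both the efficiency and the initial amount. The parameter values are illustrative:

```python
import numpy as np

E, F0, B = 1.9, 1e-3, 0.05           # true efficiency, initial signal, background
n = np.arange(1, 16)                  # cycle numbers in the exponential phase
F = B + F0 * E ** n                   # raw fluorescence with unknown background B

# Taking-difference: the background cancels without ever estimating it
D = np.diff(F)                        # D_k = F0 * (E - 1) * E^k
k = n[:-1]
slope, intercept = np.polyfit(k, np.log(D), 1)  # ln D = ln(F0*(E-1)) + k*ln E
E_hat = np.exp(slope)                             # estimated efficiency
F0_hat = np.exp(intercept) / (E_hat - 1)          # estimated initial amount
```

With real data the regression would be restricted to the exponential-phase cycles selected by one of the criteria above (threshold identification, max R2, or max slope).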

Abstract:

Up to now, snow cover on Antarctic sea ice and its impact on radar backscatter, particularly after the onset of freeze/thaw processes, are not well understood. Here we present a combined analysis of in situ observations of snow properties from the landfast sea ice in Atka Bay, Antarctica, and high-resolution TerraSAR-X backscatter data, for the transition from austral spring (November 2012) to summer (January 2013). The physical changes in the seasonal snow cover during that time are reflected in the evolution of TerraSAR-X backscatter. We are able to explain 76-93% of the spatio-temporal variability of the TerraSAR-X backscatter signal with up to four snowpack parameters with a root-mean-squared error of 0.87-1.62 dB, using a simple multiple linear model. Over the complete study, and especially after the onset of early-melt processes and freeze/thaw cycles, the majority of variability in the backscatter is influenced by changes in snow/ice interface temperature, snow depth and top-layer grain size. This suggests it may be possible to retrieve snow physical properties over Antarctic sea ice from X-band SAR backscatter.
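A minimal sketch of such a multiple linear model, showing how the reported RMSE and explained variance are computed from a least-squares fit. The standardized snowpack predictors and coefficients below are hypothetical, not the study's fitted values:

```python
import numpy as np

rng = np.random.default_rng(3)
m = 80
# Hypothetical standardized snowpack predictors (illustrative only)
t_si = rng.normal(0, 1, m)     # snow/ice interface temperature
depth = rng.normal(0, 1, m)    # snow depth
grain = rng.normal(0, 1, m)    # top-layer grain size
# Simulated TerraSAR-X backscatter in dB with assumed coefficients
sigma0 = -14 + 2.0 * t_si + 1.0 * depth + 0.8 * grain + rng.normal(0, 1.0, m)

X = np.column_stack([np.ones(m), t_si, depth, grain])
beta, *_ = np.linalg.lstsq(X, sigma0, rcond=None)
pred = X @ beta
rmse = np.sqrt(np.mean((sigma0 - pred) ** 2))                 # dB
r2 = 1 - np.sum((sigma0 - pred) ** 2) / np.sum((sigma0 - sigma0.mean()) ** 2)
```

Inverting this relationship, i.e. predicting snow properties from backscatter, is the retrieval idea the abstract closes with.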

Abstract:

Phycobiliproteins are a family of water-soluble pigment proteins that play an important role as accessory or antenna pigments and absorb in the green part of the light spectrum poorly used by chlorophyll a. The phycoerythrins (PEs) are one of four types of phycobiliproteins that are generally distinguished based on their absorption properties. As PEs are water soluble, they are generally not captured with conventional pigment analysis. Here we present a statistical model based on in situ measurements of three transatlantic cruises which allows us to derive relative PE concentration from standardized hyperspectral underwater radiance measurements (Lu). The model relies on Empirical Orthogonal Function (EOF) analysis of Lu spectra and, subsequently, a Generalized Linear Model with measured PE concentrations as the response variable and EOF loadings as predictor variables. The method is used to predict relative PE concentrations throughout the water column and to calculate integrated PE estimates based on those profiles.
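The EOF-plus-regression pipeline can be sketched as follows: an SVD of the centered spectra yields per-cast EOF loadings, which then serve as predictors in a Gaussian GLM (plain least squares here). The spectra, latent modes, and coefficients below are simulated, not the cruise data:

```python
import numpy as np

rng = np.random.default_rng(4)
n_casts, n_bands = 120, 50
# Hypothetical standardized Lu spectra: two latent modes plus noise
wl = np.linspace(0, 1, n_bands)
mode1, mode2 = np.sin(np.pi * wl), np.cos(np.pi * wl)
a = rng.normal(0, 1, n_casts)
b = rng.normal(0, 1, n_casts)
spectra = (np.outer(a, mode1) + np.outer(b, mode2)
           + rng.normal(0, 0.05, (n_casts, n_bands)))
pe = 1.0 + 0.8 * a - 0.5 * b + rng.normal(0, 0.1, n_casts)  # relative PE conc.

# EOF analysis = SVD of the centered spectra; U*S are the per-cast loadings
Xc = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * S[:2]              # loadings on the two leading EOFs

# Gaussian GLM (ordinary least squares) of PE on the EOF loadings
X = np.column_stack([np.ones(n_casts), scores])
beta, *_ = np.linalg.lstsq(X, pe, rcond=None)
pe_hat = X @ beta
r2 = 1 - np.sum((pe - pe_hat) ** 2) / np.sum((pe - pe.mean()) ** 2)
```

Applying the fitted coefficients to Lu profiles at depth gives the PE profiles from which the integrated estimates are computed.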

Abstract:

Ocean acidification can have negative repercussions from the organism to the ecosystem level. Octocorals deposit high-magnesium calcite in their skeletons and, according to different models, could be more susceptible to the depletion of carbonate ions than either calcite- or aragonite-depositing organisms. This study investigated the response of the gorgonian coral Eunicea fusca to a range of CO2 concentrations from 285 to 4,568 ppm (pH range 8.1-7.1) over a 4-week period. Gorgonian growth and calcification were measured at each CO2 level as linear extension rate and as percent change in buoyant weight and calcein incorporation in individual sclerites, respectively. There was a significant negative relationship between calcification and CO2 concentration that was well captured by a linear regression for both buoyant weight and calcein staining. In general, growth and calcification did not stop at any pCO2 concentration; however, some octocoral fragments experienced negative calcification at levels undersaturated in calcium carbonate (>4,500 ppm), suggesting possible dissolution effects. These results highlight the susceptibility of the gorgonian coral E. fusca to elevated levels of carbon dioxide but suggest that E. fusca could still survive in the mid-term ocean acidification conditions expected by the end of this century, which provides important information on the effects of ocean acidification on the dynamics of coral reef communities. Gorgonian corals can be expected to diversify and thrive in the Atlantic and Eastern Pacific as scleractinian corals decline, and a shift in these reef communities from scleractinian-dominated to octocoral/soft-coral-dominated is likely under a "business as usual" scenario of CO2 emissions.

Abstract:

This study reviews new trends in seismic design, focusing on the base isolation technique as the most effective, widespread and widely used, and analyzes the structural and economic advantages of buildings that apply it. The most common type of reinforced concrete building likely to be isolated, in this case a hospital, is chosen; its fixed-base model is subjected to several seismic codes, mainly comparing base shear forces and considering soil-structure interaction. To support this calculation, a program of beam elements with 6 degrees of freedom per node is developed in Matlab code. The isolated model includes the analysis of three combinations of isolator types, HDR, LPR and FPS, alternating simplified linear models with 1 and 3 degrees of freedom per floor, evaluating differences in the structural response and selecting the combination that gives the most convenient results; the explicit central difference method is used for the nonlinear modeling of each isolation system. Finally, a comparative analysis of the damage expected under the design earthquake is carried out, using the fast method and taking the spectral displacement of the top floor as reference, leading to conclusions and recommendations for the use of isolation systems.
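The explicit central difference scheme mentioned above can be sketched on its simplest case, an undamped single-degree-of-freedom oscillator u'' = -omega^2 u, where the update u_{n+1} = 2 u_n - u_{n-1} + dt^2 a_n needs only the current acceleration. Parameters are illustrative; the scheme is conditionally stable (dt < 2/omega):

```python
import numpy as np

omega = 2 * np.pi          # 1 Hz natural frequency (illustrative)
dt = 1e-3                  # well below the stability limit dt < 2/omega
steps = 1000               # integrate one full period (1 s)

u = np.zeros(steps + 1)
u[0] = 1.0                                          # initial displacement, zero velocity
u_prev = u[0] - 0.5 * dt ** 2 * omega ** 2 * u[0]   # fictitious step n = -1
for n in range(steps):
    a = -omega ** 2 * u[n]                          # restoring acceleration
    u_next = 2 * u[n] - u_prev + dt ** 2 * a        # central difference update
    u_prev, u[n + 1] = u[n], u_next
# After one period the oscillator returns very close to u = 1
```

For the isolated building the same update applies per degree of freedom, with the nonlinear isolator force evaluated at each step in place of the linear restoring term.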

Abstract:

Natural regeneration in managed stone pine (Pinus pinea L.) forests in the Spanish Northern Plateau is not achieved successfully under current silviculture practices, a main concern for forest managers. We modelled spatio-temporal features of primary dispersal to test whether (a) present low stand densities constrain natural regeneration success and (b) seed release is a climate-controlled process. The study is based on data collected from a 6-year seed-trap experiment considering different regeneration felling intensities. From a spatial perspective, we fitted several established kernels under different data distribution assumptions to obtain a spatial model able to predict P. pinea seed rain. Because of the umbrella-like crown of P. pinea, the models were adapted to account for crown effect by correcting the distances between potential seed arrival locations and seed sources. In addition, individual tree fecundity was assessed independently of existing models, improving parameter estimation stability. Seed rain simulation enabled the calculation of seed dispersal indexes for diverse silvicultural regeneration treatments. The spatial model of best fit (Weibull, Poisson assumption) predicted a highly clumped dispersal pattern that resulted in a proportion of gaps where no seed arrival is expected (dispersal limitation) between 0.25 and 0.30 for intermediate-intensity regeneration fellings and over 0.50 for intense fellings. To describe the temporal pattern, the proportion of seeds released during monthly intervals was modelled as a function of climate variables (rainfall events) through a linear model that considered temporal autocorrelation, whereas cone opening took place above a temperature threshold. Our findings suggest applying less intensive regeneration fellings, carried out after years of successful seedling establishment and, seasonally, after the main rainfall period (late fall).
This schedule would avoid dispersal limitation and would allow for a complete seed release. These modifications in present silviculture practices would produce a more efficient seed shadow in managed stands.
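The kind of seed-rain computation described above can be sketched as follows: a 2-D Weibull dispersal kernel scaled by tree fecundity gives an expected seed density on a trap grid, from which Poisson trap counts and a dispersal-limitation proxy follow. Kernel parameters, tree positions and fecundities are assumed for illustration, not the fitted values from the study:

```python
import numpy as np

rng = np.random.default_rng(5)

def weibull_kernel(r, scale=8.0, shape=1.5):
    """2-D Weibull dispersal kernel: seed probability density per unit area at
    distance r. Parameters are illustrative, not the study's estimates."""
    pdf_r = (shape / scale) * (r / scale) ** (shape - 1) * np.exp(-(r / scale) ** shape)
    return pdf_r / (2 * np.pi * np.maximum(r, 1e-9))  # spread over circle of radius r

trees = np.array([[10.0, 10.0], [30.0, 25.0]])   # stem positions (m), assumed
fecundity = np.array([5000.0, 8000.0])           # seeds per tree, assumed

# Expected seed rain on a grid of 1 m^2 traps; a Poisson draw gives one realization
xs, ys = np.meshgrid(np.arange(0, 40.0), np.arange(0, 40.0))
rain = np.zeros_like(xs)
for (tx, ty), q in zip(trees, fecundity):
    r = np.hypot(xs - tx, ys - ty)
    rain += q * weibull_kernel(r)
counts = rng.poisson(rain)                 # simulated trap counts
gap_fraction = np.mean(rain < 1.0)         # proxy for dispersal limitation
```

Summaries like `gap_fraction` under different simulated felling layouts correspond to the dispersal-limitation proportions reported in the text.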

Abstract:

Solar radiation is the most important source of renewable energy on the planet. It matters to solar engineers, designers and architects, and it is also fundamental for efficiently determining irrigation water needs and potential crop yields, among other applications. Complete and accurate solar radiation data for a specific region are indispensable, and for locations where measured values are not available, several models have been developed to estimate solar radiation. The objective of this paper was to calibrate, validate and compare five representative models for predicting global solar radiation, adjusting the empirical coefficients to increase local applicability, and to develop a linear model. All models were based on easily available meteorological variables, without sunshine hours as input, and were used to estimate daily solar radiation at Cañada de Luque (Córdoba, Argentina). For validation, measured and estimated solar radiation data were analyzed using several statistical coefficients. The results showed that all the analyzed models were robust and accurate (R2 between 0.87 and 0.89; RMSE between 2.05 and 2.14), so global radiation can be estimated properly from easily available meteorological variables when only temperature data are at hand. The Hargreaves-Samani, Allen and Bristow-Campbell models could be used with typical coefficient values, while the Samani and Almorox models should be applied with calibrated coefficients. Although the new linear model presented the smallest R2 value (R2 = 0.87), it is still useful because of its easy application. The daily global solar radiation values produced by these models can be used to fill in missing daily values when only temperature data are available, and in hydrologic or agricultural applications.
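Of the temperature-based models mentioned, Hargreaves-Samani is the simplest to sketch: it scales extraterrestrial radiation by the square root of the daily thermal amplitude. The 0.16/0.19 coefficients are the FAO-56 guideline values for interior/coastal sites, not the locally calibrated coefficients of the paper, and the example day is invented:

```python
import math

def hargreaves_samani(ra_mj, tmax_c, tmin_c, k_rs=0.16):
    """Hargreaves-Samani estimate of daily global solar radiation (same units
    as ra_mj, e.g. MJ m-2 day-1). k_rs ~ 0.16 for interior sites, ~ 0.19 for
    coastal sites (FAO-56 guideline values)."""
    return k_rs * math.sqrt(tmax_c - tmin_c) * ra_mj

# Example: a clear day with Ra = 40 MJ m-2 day-1 and a 15 degC thermal range
rs = hargreaves_samani(40.0, 32.0, 17.0)
```

Calibration in the paper amounts to replacing `k_rs` (and the analogous coefficients of the other models) with values fitted against the local measurements.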

Abstract:

We present a biomolecular probabilistic model, driven by the action of a DNA toolbox made of a set of DNA templates and enzymes, that is able to perform Bayesian inference. The model takes single-stranded DNA as input data, representing the presence or absence of a specific molecular signal (the evidence). The program logic uses different DNA templates and their relative concentration ratios to encode the prior probability of a disease and the conditional probability of a signal given the disease. When the input and program molecules interact, an enzyme-driven cascade of reactions (DNA polymerase extension, nicking and degradation) is triggered, producing a pair of distinct single-stranded DNA species. Once the system reaches equilibrium, the ratio between the output species represents the application of Bayes' law: the conditional probability of the disease given the signal; in other words, a qualitative diagnosis plus a quantitative degree of belief in that diagnosis. Thanks to the inherent amplification capability of this DNA toolbox, the resulting system is able to scale up (with longer cascades and thus more input signals) a Bayesian biosensor that we designed previously.
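The equilibrium output ratio described above implements Bayes' law, which in conventional probability notation reads P(disease | signal) = P(signal | disease) P(disease) / P(signal). A plain numeric sketch (all probabilities illustrative, not taken from the paper):

```python
# Bayes' law as computed by the DNA system: the equilibrium ratio of the two
# output species mirrors the posterior probability of disease given the signal.
p_disease = 0.01              # prior (encoded by a template concentration ratio)
p_signal_given_d = 0.95       # conditional probability of the signal, diseased
p_signal_given_not_d = 0.05   # conditional probability of the signal, healthy

p_signal = (p_signal_given_d * p_disease
            + p_signal_given_not_d * (1 - p_disease))
posterior = p_signal_given_d * p_disease / p_signal   # P(disease | signal)
```

In the molecular implementation, the two terms of the denominator correspond to the two output species, so their concentration ratio at equilibrium directly encodes the posterior odds.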

Abstract:

In this paper, multiple regression analysis is used to model the top of descent (TOD) location of user-preferred descent trajectories computed by the flight management system (FMS) on over 1000 commercial flights into Melbourne, Australia. In addition to TOD, the cruise altitude, final altitude, cruise Mach, descent speed, wind, and engine type were recorded for use as independent variables in the regression analysis. Both first-order and second-order models are considered, and cross-validation, hypothesis testing, and additional analysis are used to compare them, identifying the models that should give the smallest errors if used to predict TOD location for new data in the future. A model that is linear in TOD altitude, final altitude, descent speed, and wind gives an estimated standard deviation of 3.9 nmi for TOD location given the trajectory parameters, which means about 80% of predictions would have an error of less than 5 nmi in absolute value. This accuracy is better than that demonstrated by other ground-automation predictions using kinetic models, and the approach would enable online learning of the model. Additional data or further knowledge of FMS algorithms would be necessary to conclude definitively that no second-order terms are appropriate. Possible applications of the linear model are described, including enabling arriving aircraft to fly optimized descents computed by the FMS even in congested airspace.
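The link between the 3.9 nmi residual sigma and the "about 80% within 5 nmi" claim follows from the normal distribution (5/3.9 is about 1.28 standard deviations, and P(|z| < 1.28) is roughly 0.80). A simulation with hypothetical trajectory parameters and illustrative coefficients makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5000
# Hypothetical standardized trajectory parameters; coefficients are illustrative
cruise_alt = rng.normal(0, 1, n)
desc_speed = rng.normal(0, 1, n)
wind = rng.normal(0, 1, n)
# Simulated TOD distance (nmi) with the 3.9 nmi residual sigma from the text
tod = 105 + 6 * cruise_alt - 3 * desc_speed + 2 * wind + rng.normal(0, 3.9, n)

X = np.column_stack([np.ones(n), cruise_alt, desc_speed, wind])
beta, *_ = np.linalg.lstsq(X, tod, rcond=None)
err = tod - X @ beta
within_5nmi = np.mean(np.abs(err) < 5.0)   # ~0.80 for a 3.9 nmi residual sigma
```

Refitting `beta` as new flights arrive is exactly the online-learning possibility mentioned in the text.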

Abstract:

The WCTR is a congress of recognized international prestige in transport research; although the published proceedings are in digital format without an ISSN or ISBN, we consider it important enough to be counted in the indicators. This paper describes how multilateral cooperation policies are influencing national transport policies in developing countries. It considers the evolution of national transport policies and institutional frameworks in Algeria, Morocco and Tunisia over the last 10 years, and analyses the influence that EU cooperation programmes (particularly those within the Euromed programme initiative) and international coordination activities have had on the evolution towards efficient, sustainable transport systems in those countries. Notwithstanding the significant socioeconomic, political and institutional differences among the three countries, three major traits are common to the transport policy framework in all cases: a focus on megaprojects; the substitution of traditional ministerial services by ad hoc public agencies to develop those megaprojects; and the progressive involvement of international private players in the operation (and eventually the design and construction) of new projects, focusing on know-how transfer rather than investment needs. The hypothesis is that these similarities are largely due to the influence of the international cooperation promoted by the European Union since the mid-1990s. The new decision-making situation is characterized by the involvement of two new relevant stakeholders, the EU and a limited number of global transport operators. The hierarchical governance model is evolving towards more complex structures, which explains the three common traits mentioned above. International coordination has been crucial for developing national transport visions that are coherent with a regional, transnational system.

Abstract:

Empirical Software Engineering (ESE) uses empirical studies to gather evidence that helps determine under which circumstances one software technology is preferable to another. This Master's Thesis is part of a research effort exploring whether the intuitions and/or preferences of software testers can predict the effectiveness of three code evaluation techniques: reading by stepwise abstractions, decision coverage, and equivalence partitioning. To this end, the thesis analyzes data collected in an empirical study run by the thesis supervisors. In the study, subjects applied the three code evaluation techniques to three different programs into which faults had been artificially injected. Subjects reported the defects they found and answered a series of questions about their intuitions and preferences. The data analyses examine: 1) the subjects' intuitions and preferences (Pearson's chi-squared test); 2) whether subjects change their minds after applying the techniques (Cohen's kappa, the McNemar-Bowker test, and the Stuart-Maxwell test); 3) the consistency of the different questions, comparing intuitions with intuitions, preferences with preferences, and intuitions with preferences (Cohen's kappa); and 4) whether intuitions and/or preferences predict the actual effectiveness obtained (General Linear Model with repeated measures). The results show no clear intuition or particular preference with respect to the programs. Moreover, although subjects do change their minds after applying the techniques, there is no clear evidence that intuition and preferences influence effectiveness. Finally, intuitions correlate with intuitions, preferences with preferences, and intuitions with preferences, and these relationships become more noticeable after the techniques are applied.
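The kappa coefficient used in analyses 2) and 3) measures agreement between two paired categorical answers beyond chance. A small self-contained sketch; the helper and the before/after answers are illustrative, not the study's data:

```python
import numpy as np

def cohen_kappa(a, b):
    """Cohen's kappa for two paired categorical ratings:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.unique(np.concatenate([a, b]))
    po = np.mean(a == b)                                        # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)

# Example: subjects' stated preference before vs after applying the techniques
before = ["abstractions", "coverage", "coverage", "partition", "coverage", "partition"]
after  = ["abstractions", "coverage", "partition", "partition", "coverage", "partition"]
kappa = cohen_kappa(before, after)
```

Kappa near 1 means the answers barely change after applying the techniques; kappa near 0 means any agreement is what chance alone would produce.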

Abstract:

The critical moment (CM) in basketball is a game-related phenomenon with particular features determined by the idiosyncrasies of a team; it can affect the players and therefore the course of the game. This thesis studies the incidence of the CM in the Spanish A.C.B. basketball league through two complementary investigations, one quantitative and one qualitative, whose methodologies are as follows. The quantitative research is based on the performance analysis technique applied to four A.C.B. seasons (2007/08 to 2010/11). Following the consulted literature, critical moments were defined as the last five minutes of games in which the point difference was six points, plus all overtimes played, yielding 197 critical moments under study. The study was contextualized by the situational variables game location (home or away), team quality (higher or lower ranked) and competition (regular season and playoff phases). The results were interpreted with three descriptive analyses: 1) discriminant analysis; 2) multiple linear regression; and 3) a multivariate general linear model. The qualitative research used semi-structured interviews with 12 coaches working in the A.C.B. League during the 2011/12 season, aimed at capturing each coach's view of the CM concept and thereby providing a more practical, experience-based perspective on how to act when a CM arises in basketball. Both investigations agree on the importance of the CM for the final outcome of the game. The concept itself is highly complex, so both the scientific observation of the game and the coach's subjective perception of the phenomenon are considered essential; for the latter, the psychological aspects of the protagonists (players and coaches) are decisive.