952 results for Generalised Additive Model


Relevance:

30.00%

Publisher:

Abstract:

Background: Considerable evidence from twin and adoption studies indicates that genetic and shared environmental factors play a significant role in the initiation of smoking behavior. Although twin and adoption designs are powerful for detecting genetic and environmental influences, they do not provide information on the processes of assortative mating and parent–offspring transmission and their contribution to the variability explained by genetic and/or environmental factors.

Methods: We examined the role of genetic and environmental factors in smoking initiation using an extended kinship design. This design allows the simultaneous testing of additive and non-additive genetic, shared and individual-specific environmental factors, as well as sex differences in the expression of genes and environment, in the presence of assortative mating and combined genetic and cultural transmission. A dichotomous lifetime smoking measure was obtained from twins and relatives in the Virginia 30,000 sample.

Results: Both genetic and environmental factors play a significant role in the liability to smoking initiation. The major influences on individual differences appeared to be additive genetic and unique environmental effects, with smaller contributions from assortative mating, shared sibling environment, twin environment, cultural transmission and the resulting genotype–environment covariance. The finding of negative cultural transmission without dominance led us to investigate more closely two possible mechanisms for the lower parent–offspring correlations compared to the sibling and DZ twin correlations in subsets of the data: (i) age × gene interaction, and (ii) social homogamy. Neither mechanism provided a significantly better explanation of the data, although age regression was significant.

Conclusions: This study showed significant heritability, partly due to assortment, and significant effects of primarily non-parental shared environment on smoking initiation.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a technique for forecasting forward electricity/gas prices one day ahead. The technique combines a Kalman filter (KF) with a generalised autoregressive conditional heteroskedasticity (GARCH) model, a model widely used in financial forecasting. The GARCH model is used to compute the next value of the time series, and the KF updates the parameters of the GARCH model when a new observation becomes available. The technique is applied to real data from the UK energy markets to evaluate its performance. The results show that forecasting accuracy is improved significantly by the hybrid model. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
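
As an illustration of the building block involved, the following is a minimal Python sketch of a one-day-ahead AR(1)+GARCH(1,1) forecast. It does not reproduce the paper's hybrid scheme (the Kalman-filter update of the GARCH parameters is omitted), and the parameter values and synthetic price series are illustrative assumptions, not the authors' estimates.

# Minimal sketch of a one-step-ahead GARCH(1,1) forecast on a price-return series.
# The hybrid KF update of the GARCH parameters is not reproduced; omega, alpha,
# beta and the AR coefficient phi are fixed, illustrative values.
import numpy as np

def garch_one_step(returns, omega=1e-5, alpha=0.08, beta=0.9, phi=0.2):
    """Forecast the next return and its conditional variance with AR(1)+GARCH(1,1)."""
    sigma2 = np.var(returns)          # initialise the conditional variance
    mu = 0.0
    for r in returns:
        eps = r - mu                                       # innovation under the AR(1) mean
        sigma2 = omega + alpha * eps**2 + beta * sigma2    # GARCH(1,1) recursion
        mu = phi * r                                       # AR(1) mean for the next step
    return mu, sigma2                  # one-day-ahead mean and variance forecast

rng = np.random.default_rng(0)
prices = 50 + np.cumsum(rng.normal(0, 0.5, 250))   # synthetic forward-price path
rets = np.diff(np.log(prices))
print(garch_one_step(rets))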

Relevance:

30.00%

Publisher:

Abstract:

This Letter addresses image segmentation via a generative model approach. A Bayesian network (BN) in the space of dyadic wavelet transform coefficients is introduced to model texture images. The model is similar to a hidden Markov model (HMM), but with non-stationary transition conditional probability distributions. It is composed of discrete hidden variables and observable Gaussian outputs for the wavelet coefficients. In particular, the Gabor wavelet transform is considered. The introduced model is compared with the simplest joint Gaussian probabilistic model for Gabor wavelet coefficients on several textures from the Brodatz album [1]. The comparison is based on cross-validation and includes probabilistic model ensembles rather than single models. In addition, the robustness of the models to additive Gaussian noise is investigated. We further study the feasibility of the introduced generative model for image segmentation in the novelty detection framework [2]. Two examples are considered: (i) sea surface pollution detection from intensity images and (ii) segmentation of still images with varying illumination across the scene.

Relevance:

30.00%

Publisher:

Abstract:

The generalised transportation problem (GTP) is an extension of the linear Hitchcock transportation problem. However, it lacks the unimodularity property, which means that a linear programming solution (e.g. obtained by the simplex method) is not guaranteed to be integer. This is a major difference between the GTP and the Hitchcock transportation problem. Although some special algorithms, such as the generalised stepping-stone method, have been developed, they are based on the linear programming model and relax the integer-solution requirement of the GTP. This paper proposes a genetic algorithm (GA) to solve the GTP, and a numerical example is presented to demonstrate the algorithm and its efficiency.
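
To make the approach concrete, here is a minimal sketch of a genetic algorithm applied to a small integer generalised transportation problem. The encoding, operators, cost matrix, multipliers and penalty scheme below are illustrative assumptions, not the paper's actual algorithm or numerical example.

# Illustrative GA for a tiny integer generalised transportation problem.
# Infeasibility is handled with a simple penalty term; all data are made up.
import numpy as np

rng = np.random.default_rng(1)
cost = np.array([[4, 6, 8], [5, 3, 7]])              # unit shipping costs c_ij
mult = np.array([[1.2, 1.0, 0.9], [1.1, 1.3, 1.0]])  # GTP multipliers a_ij
supply = np.array([40, 50])
demand = np.array([25, 30, 20])

def fitness(x):
    # total cost plus a heavy penalty for violated supply/demand constraints
    used = (mult * x).sum(axis=1)
    penalty = np.abs(used - supply).sum() + np.abs(x.sum(axis=0) - demand).sum()
    return (cost * x).sum() + 1000 * penalty

def mutate(x):
    y = x.copy()
    i, j = rng.integers(2), rng.integers(3)
    y[i, j] = max(0, y[i, j] + rng.integers(-3, 4))   # integer-preserving mutation
    return y

pop = [rng.integers(0, 20, size=(2, 3)) for _ in range(60)]
for _ in range(300):
    pop.sort(key=fitness)
    pop = pop[:30] + [mutate(pop[rng.integers(30)]) for _ in range(30)]  # elitism + mutation
best = min(pop, key=fitness)
print(best, fitness(best))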

Relevance:

30.00%

Publisher:

Abstract:

We studied the visual mechanisms that serve to encode spatial contrast at threshold and supra-threshold levels. In a 2AFC contrast-discrimination task, observers had to detect the presence of a vertical 1 cycle deg⁻¹ test grating (of contrast dc) that was superimposed on a similar vertical 1 cycle deg⁻¹ pedestal grating, whereas in pattern masking the test grating was accompanied by a very different masking grating (horizontal 1 cycle deg⁻¹, or oblique 3 cycles deg⁻¹). When expressed as threshold contrast (dc at 75% correct) versus mask contrast (c), our results confirm previous ones in showing a characteristic 'dipper function' for contrast discrimination but a smoothly increasing threshold for pattern masking. However, fresh insight is gained by analysing and modelling performance (p; percent correct) as a joint function of (c, dc) - the performance surface. In contrast discrimination, psychometric functions (p versus log dc) are markedly less steep when c is above threshold, but in pattern masking this reduction of slope did not occur. We explored a standard gain-control model with six free parameters. Three parameters control the contrast response of the detection mechanism and one parameter weights the mask contrast in the cross-channel suppression effect. We assume that signal-detection performance (d') is limited by additive noise of constant variance. Noise level and lapse rate are also fitted parameters of the model. We show that this model accounts very accurately for the whole performance surface in both types of masking, and thus explains the threshold functions and the pattern of variation in psychometric slopes. The cross-channel weight is about 0.20. The model shows that the mechanism response to a contrast increment (dc) is linearised by the presence of pedestal contrasts but remains nonlinear in pattern masking.
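
The following is a minimal sketch of a gain-control transducer of the general kind described, with a divisive cross-channel suppression term and additive noise limiting d'. The exponents, saturation constant, noise level and cross-channel weight are illustrative placeholders rather than the fitted values reported in the study.

# Sketch of a gain-control contrast transducer with cross-channel suppression.
# All parameter values are illustrative, not the study's fitted estimates.
import numpy as np

def response(c, mask=0.0, p=2.4, q=2.0, z=0.1, w=0.20):
    """Contrast response with divisive suppression from a cross-channel mask."""
    return c**p / (z**q + c**q + w * mask**q)

def d_prime(c, dc, mask=0.0, sigma=0.01):
    """Sensitivity to an increment dc on pedestal c, limited by additive noise."""
    return (response(c + dc, mask) - response(c, mask)) / sigma

# contrast discrimination (pedestal) vs. pattern masking (cross-oriented mask)
print(d_prime(c=0.1, dc=0.02))              # pedestal condition
print(d_prime(c=0.0, dc=0.02, mask=0.1))    # cross-channel masking condition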

Relevance:

30.00%

Publisher:

Abstract:

Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. Such imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature.
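
For reference, below is a minimal sketch of the crisp additive DEA model that the fuzzy formulation generalises; at a given α-level the crisp inputs and outputs would be replaced by interval bounds. The data for the four DMUs are illustrative, and scipy's LP solver is used for the slack-maximisation problem.

# Crisp additive DEA model for illustrative data (X: inputs, Y: outputs, 4 DMUs).
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0], [5.0, 3.0]])  # inputs (DMU x m)
Y = np.array([[3.0], [2.0], [4.0], [3.5]])                      # outputs (DMU x s)

def additive_dea(o):
    n, m = X.shape
    s = Y.shape[1]
    # variables: lambda (n), input slacks (m), output slacks (s)
    c = np.concatenate([np.zeros(n), -np.ones(m + s)])   # maximise total slack
    # X' lambda + s_minus = x_o ;  Y' lambda - s_plus = y_o ;  sum(lambda) = 1
    A_eq = np.block([
        [X.T, np.eye(m), np.zeros((m, s))],
        [Y.T, np.zeros((s, m)), -np.eye(s)],
        [np.ones((1, n)), np.zeros((1, m + s))],
    ])
    b_eq = np.concatenate([X[o], Y[o], [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (n + m + s))
    return -res.fun   # total slack: 0 means DMU o is additive-efficient

for o in range(4):
    print(o, round(additive_dea(o), 3))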

Relevance:

30.00%

Publisher:

Abstract:

Heterogeneous and incomplete datasets are common in many real-world visualisation applications. The Generative Topographic Mapping (GTM) was originally developed for complete, continuous data, but its probabilistic nature allows it to be extended to model heterogeneous data (i.e. data containing both continuous and discrete values) and missing values. This paper describes and assesses the resulting model on both synthetic and real-world heterogeneous data with missing values.

Relevance:

30.00%

Publisher:

Abstract:

AMS Subj. Classification: 83C15, 83C35

Relevance:

30.00%

Publisher:

Abstract:

This paper considers the global synchronisation of a stochastic version of coupled map lattice networks through an innovative stochastic adaptive linear quadratic pinning control methodology. In a stochastic network, each state receives only noisy measurements of its neighbours' states. For such networks we derive a generalised Riccati solution that quantifies and incorporates the uncertainty of the forward dynamics and the inverse controller in the derivation of the stochastic optimal control law. The generalised Riccati solution is derived using the Lyapunov approach. A probabilistic approximation-type algorithm is employed to estimate the conditional distributions of the state and inverse controller from historical data and to quantify model uncertainties. The theoretical derivation is complemented by its validation on a set of representative examples.
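
As background, here is a minimal sketch of the standard discrete-time Riccati recursion that underlies an LQR pinning gain; the paper's generalised Riccati solution additionally incorporates the uncertainty of the forward dynamics and inverse controller. The system matrices below are illustrative assumptions.

# Discrete-time Riccati recursion for an LQR feedback gain (illustrative matrices).
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 0.95]])   # local linearised node dynamics
B = np.array([[0.0], [1.0]])               # pinned control input
Q = np.eye(2)
R = np.array([[0.5]])

P = Q.copy()
for _ in range(200):                        # iterate to the steady-state solution
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
print("feedback gain K:", K)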

Relevance:

30.00%

Publisher:

Abstract:

Most machine-learning algorithms are designed for datasets with features of a single type, whereas very little attention has been given to datasets with mixed-type features. We recently proposed a model that handles mixed types within a probabilistic latent variable formalism. This model, called the generalised generative topographic mapping (GGTM), describes the data by type-specific distributions that are conditionally independent given the latent space. It has often been observed that visualisations of high-dimensional datasets can be poor in the presence of noisy features. In this paper we therefore propose to extend the GGTM to estimate feature saliency values (GGTMFS) as an integrated part of the parameter learning process, using an expectation-maximisation (EM) algorithm. The efficacy of the proposed GGTMFS model is demonstrated on both synthetic and real datasets.

Relevance:

30.00%

Publisher:

Abstract:

The integrability of the nonlinear Schrödinger equation (NLSE) by the inverse scattering transform, shown in a seminal work [1], gave an interesting opportunity to treat the corresponding nonlinear channel similarly to a linear one by using the nonlinear Fourier transform. The integrability of the NLSE underlies the old idea of eigenvalue communications [2], which has been resurrected in recent works [3-7]. In [6, 7] a new method for coherent optical transmission employing the continuous nonlinear spectral data, nonlinear inverse synthesis, was introduced. It assumes the modulation and detection of data directly on the continuous part of the nonlinear spectrum associated with an integrable transmission channel (the NLSE in the case considered). Although such a transmission method is inherently free from nonlinear impairments, noisy signal corruptions arising from amplifier spontaneous emission inevitably degrade the optical system performance. We study the properties of the noise-corrupted channel model in the nonlinear spectral domain associated with the NLSE. We derive the general stochastic equations governing the signal evolution inside the nonlinear spectral domain and elucidate the properties of the emerging nonlinear spectral noise using well-established methods of perturbation theory based on the inverse scattering transform [8]. It is shown that in the presence of small noise the communication channel in the nonlinear domain is an additive Gaussian channel with memory and a signal-dependent correlation matrix. We demonstrate that the effective spectral noise acquires "colouring": its autocorrelation function becomes slowly decaying and non-diagonal as a function of "frequencies", and the noise loses its circular symmetry, becoming elliptically polarised. We then derive a lower bound on the spectral efficiency of such a channel. Our main result is that by using nonlinear spectral techniques one can significantly increase the achievable spectral efficiency compared to the currently available methods [9].

REFERENCES
1. Zakharov, V. E. and A. B. Shabat, Sov. Phys. JETP, Vol. 34, 62-69, 1972.
2. Hasegawa, A. and T. Nyu, J. Lightwave Technol., Vol. 11, 395-399, 1993.
3. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4312-4328, 2014.
4. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4329-4345, 2014.
5. Yousefi, M. I. and F. R. Kschischang, IEEE Trans. Inf. Theory, Vol. 60, 4346-4369, 2014.
6. Prilepsky, J. E., S. A. Derevyanko, K. J. Blow, I. Gabitov, and S. K. Turitsyn, Phys. Rev. Lett., Vol. 113, 013901, 2014.
7. Le, S. T., J. E. Prilepsky, and S. K. Turitsyn, Opt. Express, Vol. 22, 26720-26741, 2014.
8. Kaup, D. J. and A. C. Newell, Proc. R. Soc. Lond. A, Vol. 361, 413-446, 1978.
9. Essiambre, R.-J., G. Kramer, P. J. Winzer, G. J. Foschini, and B. Goebel, J. Lightwave Technol., Vol. 28, 662-701, 2010.

Relevance:

30.00%

Publisher:

Abstract:

Starting from the Operophtera brumata L. database collected between 1973 and 2000 by the Light Trap Network in Hungary, we introduce a simple theta-logistic population dynamical model based on endogenous and exogenous factors only. We create an indicator set from which we choose the elements that improve the fit most effectively. Then we extend the basic model with additive climatic factors. The parameter optimisation minimises the root mean square error, and the best model is chosen according to the Akaike Information Criterion. Finally, we run the calibrated extended model with daily outputs of the regional climate model RegCM3.1, using 1961-1990 as the reference period and 2021-2050 and 2071-2100 as future scenarios. The results of the three time intervals are fitted with Beta distributions and compared statistically. The expected changes are discussed.
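
A minimal sketch of a theta-logistic model with an additive climatic term, of the kind described, is given below. The parameter values and the synthetic abundance and climate series are illustrative assumptions, not the calibrated Operophtera brumata model.

# Theta-logistic growth with an additive climatic covariate; data are synthetic.
import numpy as np

def theta_logistic(n0, r0, K, theta, beta, climate):
    """Simulate abundance with theta-logistic density dependence plus a climate term."""
    n = [n0]
    for x in climate:
        growth = r0 * (1.0 - (n[-1] / K) ** theta) + beta * x
        n.append(n[-1] * np.exp(growth))
    return np.array(n)

rng = np.random.default_rng(3)
climate = rng.normal(0, 1, 27)                        # e.g. yearly temperature anomaly
obs = theta_logistic(50, 0.4, 200, 1.2, 0.15, climate) * rng.lognormal(0, 0.1, 28)

def rmse(params):
    r0, K, theta, beta = params
    sim = theta_logistic(obs[0], r0, K, theta, beta, climate)
    return np.sqrt(np.mean((sim - obs) ** 2))

print(rmse([0.4, 200, 1.2, 0.15]))    # objective to be minimised during calibration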

Relevance:

30.00%

Publisher:

Abstract:

The paper reviews some additive and multiplicative properties of ranking procedures used for generalised tournaments with missing values and multiple comparisons. The methods analysed are the score, generalised row sum and least squares methods, as well as fair bets and its variants. It is argued that generalised row sum should be applied not with a fixed parameter, but with a variable one proportional to the number of known comparisons. It is shown that a natural additive property has strong links to independence of irrelevant matches, an axiom judged unfavourable when players have different opponents.
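
As an illustration of one of the reviewed procedures, below is a minimal sketch of least squares ranking for a small tournament with missing comparisons; the results matrix is made up, and the generalised row sum method can be seen as a damped variant of the same linear system.

# Least squares ranking for a small incomplete tournament (illustrative data).
import numpy as np

# R[i, j]: net score of player i against player j; comparisons are missing where R is 0
R = np.array([[0, 1, 2, 0],
              [-1, 0, 1, 0],
              [-2, -1, 0, 1],
              [0, 0, -1, 0]], dtype=float)
M = (R != 0).astype(float)               # here: one comparison wherever a result exists

L = np.diag(M.sum(axis=1)) - M           # Laplacian of the comparison graph
s = R.sum(axis=1)                        # row-sum (score) vector
r = np.linalg.pinv(L) @ s                # least squares ratings, centred at zero
print(r)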

Relevance:

30.00%

Publisher:

Abstract:

One in five adults 65 years and older has diabetes. Coping with diabetes is a lifelong task, and much of the responsibility for managing the disease falls upon the individual. Reports of non-adherence to recommended treatments are high. Understanding the additive impact of diabetes on quality of life issues is important. The purpose of this study was to investigate the quality of life and diabetes self-management behaviors in ethnically diverse older adults with type 2 diabetes. The SF-12v2 was used to measure physical and mental health quality of life. Scores were compared to general, age sub-group, and diabetes-specific norms. The Transtheoretical Model (TTM) was applied to assess perceived versus actual behavior for three diabetes self-management tasks: dietary management, medication management, and blood glucose self-monitoring. Dietary intake and hemoglobin A1c values were measured as outcome variables. Utilizing a cross-sectional research design, participants were recruited from Elderly Nutrition Program congregate meal sites (n = 148, mean age 75).

Results showed that mean scores of the SF-12v2 were significantly lower in the study sample than the general norms for physical health (p < .001), mental health (p < .01), age sub-group norms (p < .05), and diabetes-specific norms for physical health (p < .001). A multiple regression analysis found that adherence to an exercise plan was significantly associated with better physical health (p < .001). Transtheoretical Model multiple regression analyses explained 68% of the variance for % kcal from fat, 41% for fiber, 70% for % kcal from carbohydrate, and 7% for hemoglobin A1c values. Significant associations were found between TTM stage of change and dietary fiber intake (p < .01). Other significant associations related to diet included gender (p < .01), ethnicity (p < .05), employment (p < .05), type of insurance (p < .05), adherence to an exercise plan (p < .05), number of doctor visits per year (p < .01), and physical health (p < .05). Significant associations were found between hemoglobin A1c values and age (p < .05), being non-Hispanic Black (p < .01), income (p < .01), and eye problems (p < .05).

The study highlights the importance of the beneficial effects of exercise on quality of life issues. Furthermore, application of the Transtheoretical Model in conjunction with an assessment of dietary intake may be valuable in helping individuals make lifestyle changes.

Relevance:

30.00%

Publisher:

Abstract:

This report is a review of additive and subtractive manufacturing techniques. Additive manufacturing has resided largely in the prototyping realm, where it provides methods for producing complex freeform solid objects directly from a computer model without part-specific tooling or knowledge. These technologies are evolving steadily, however, and are beginning to encompass related systems of material addition, subtraction, assembly, and insertion of components made by other processes. Furthermore, these various additive processes are starting to evolve from narrowly defined rapid prototyping into rapid manufacturing techniques for mass-customized products. Taking this idea far enough, and several years hence, a radical restructuring of manufacturing could take place: manufacturing itself would move from a resource base to a knowledge base, and from mass production of single-use products to mass-customized, high-value, life-cycle products. To date, the majority of research and development has focused on advanced development of existing technologies by improving processing performance, materials, modelling and simulation tools, and design tools to enable the transition from prototyping to manufacturing of end-use parts.