953 results for Hierarchical Linear Modelling
Abstract:
Over recent decades, remote sensing has emerged as an effective tool for improving agricultural productivity. In particular, many works have dealt with the problem of identifying characteristics or phenomena of crops and orchards on different scales using remotely sensed images. Since natural processes are scale dependent and most of them are hierarchically structured, determining the optimal study scales is mandatory for understanding these processes and their interactions. The concept of multi-scale/multi-resolution inherent to OBIA methodologies allows the scale problem to be addressed, but it requires multi-scale and hierarchical segmentation algorithms. The question that remains open is how to determine the segmentation scale that allows different objects and phenomena to be characterized in a single image. In this work, an adaptation of the Simple Linear Iterative Clustering (SLIC) algorithm to perform a multi-scale hierarchical segmentation of satellite images is proposed. The selection of the optimal multi-scale segmentation for different regions of the image is carried out by evaluating the intra-variability and inter-heterogeneity of the regions obtained on each scale with respect to the parent regions defined by the coarsest scale. To achieve this goal, an objective function that combines weighted variance and the global Moran index has been used. Two kinds of experiment have been carried out, generating the number of regions on each scale through linear and dyadic approaches. This methodology has allowed, on the one hand, the detection of objects on different scales and, on the other, their representation in a single image. Altogether, the procedure provides the user with a better comprehension of the land cover, the objects on it and the phenomena occurring.
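The scale-selection objective described in this abstract (weighted variance combined with the global Moran index) can be sketched as follows. This is a minimal illustration assuming per-region pixel values and a binary region-adjacency matrix, not the authors' implementation:

```python
import numpy as np

def weighted_variance(region_values):
    """Area-weighted average of within-region variances
    (intra-region variability; lower means more homogeneous regions)."""
    sizes = np.array([len(v) for v in region_values], dtype=float)
    variances = np.array([np.var(v) for v in region_values])
    return float(np.sum(sizes * variances) / np.sum(sizes))

def morans_i(region_means, adjacency):
    """Global Moran's I over region mean values with a binary adjacency
    matrix (spatial autocorrelation between neighbouring regions)."""
    z = region_means - region_means.mean()
    n, W = len(region_means), adjacency.sum()
    return float(n * (adjacency * np.outer(z, z)).sum() / (W * (z ** 2).sum()))

def objective(region_values, region_means, adjacency, w=0.5):
    """Weighted combination of the two criteria; both are minimised,
    so lower scores indicate a better segmentation scale."""
    return w * weighted_variance(region_values) + (1 - w) * morans_i(region_means, adjacency)
```

In practice both terms are usually normalised across the candidate scales before being combined, so that neither dominates the score.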
Abstract:
Event extraction from texts aims to detect structured information such as what has happened, to whom, where and when. Event extraction and visualization are typically considered as two different tasks. In this paper, we propose a novel approach based on probabilistic modelling to jointly extract and visualize events from tweets where both tasks benefit from each other. We model each event as a joint distribution over named entities, a date, a location and event-related keywords. Moreover, both tweets and event instances are associated with coordinates in the visualization space. The manifold assumption that the intrinsic geometry of tweets is a low-rank, non-linear manifold within the high-dimensional space is incorporated into the learning framework using a regularization. Experimental results show that the proposed approach can effectively deal with both event extraction and visualization and performs remarkably better than both the state-of-the-art event extraction method and a pipeline approach for event extraction and visualization.
Abstract:
Understanding how virus strains offer protection against closely related emerging strains is vital for creating effective vaccines. For many viruses, including Foot-and-Mouth Disease Virus (FMDV) and the Influenza virus, where multiple serotypes often co-circulate, in vitro testing of large numbers of vaccines can be infeasible. Therefore, the development of an in silico predictor of cross-protection between strains is important to help optimise vaccine choice. Vaccines will offer cross-protection against closely related strains, but not against those that are antigenically distinct. To be able to predict cross-protection, we must understand the antigenic variability within a virus serotype and across distinct lineages of a virus, and identify the antigenic residues and evolutionary changes that cause the variability. In this thesis we present a family of sparse hierarchical Bayesian models for detecting relevant antigenic sites in virus evolution (SABRE), as well as an extended version of the method, the extended SABRE (eSABRE) method, which better takes into account the data collection process. The SABRE methods use spike and slab priors to identify sites in the viral protein which are important for the neutralisation of the virus. In this thesis we demonstrate how the SABRE methods can be used to identify antigenic residues within different serotypes, and show how the SABRE method outperforms established methods, such as mixed-effects models based on forward variable selection or l1 regularisation, on both synthetic and viral datasets. In addition we test a number of different versions of the SABRE method, comparing conjugate and semi-conjugate prior specifications as well as an alternative to the spike and slab prior: the binary mask model. We also propose novel proposal mechanisms for the Markov chain Monte Carlo (MCMC) simulations, which improve mixing and convergence over those of the established component-wise Gibbs sampler.
The SABRE method is then applied to datasets from FMDV and the Influenza virus in order to identify a number of known antigenic residues and to provide hypotheses about other potentially antigenic residues. We also demonstrate how the SABRE methods can be used to create accurate predictions of the important evolutionary changes of the FMDV serotypes. In this thesis we provide an extended version of the SABRE method, the eSABRE method, based on a latent variable model. The eSABRE method takes further account of the structure of the FMDV and Influenza datasets through the latent variable model and improves the modelling of the error. We show how the eSABRE method outperforms the SABRE methods in simulation studies and propose a new information criterion, the block integrated Widely Applicable Information Criterion (biWAIC), for selecting the random effects factors that should be included in the eSABRE method. We demonstrate how biWAIC performs comparably to two other methods for selecting the random effects factors and combine it with the eSABRE method to apply it to two large Influenza datasets. Inference in these large datasets is computationally infeasible with the SABRE methods, but as a result of the improved structure of the likelihood, the eSABRE method offers a computational improvement that allows it to be used on these datasets. The results show that the eSABRE method can be used in a fully automatic manner to identify a large number of antigenic residues on a variety of the antigenic sites of two Influenza serotypes, as well as to predict a number of nearby sites that may also be antigenic and are worthy of further experimental investigation.
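The spike-and-slab idea at the heart of the SABRE models can be illustrated with a short sketch. This is a toy prior with hypothetical parameter names, not the full hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_spike_and_slab(n_sites, pi, slab_sd):
    """Draw coefficients under a spike-and-slab prior: each site is
    'relevant' with probability pi, in which case its effect comes from
    the Gaussian slab N(0, slab_sd^2); otherwise the effect is exactly
    zero (the point-mass spike), which is what induces sparsity."""
    relevant = rng.random(n_sites) < pi                       # latent inclusion indicators
    effects = np.where(relevant, rng.normal(0.0, slab_sd, n_sites), 0.0)
    return effects, relevant
```

Inference then amounts to computing the posterior probability that each inclusion indicator is 1, which the SABRE methods approximate with MCMC.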
Abstract:
The main goal of this paper is to present and validate a methodology for designing efficient automatic controllers for irrigation canals, based on the Saint-Venant model. This model-based methodology makes it possible to design controllers at the design stage, before the canal is built. The methodology is applied to an experimental canal located in Portugal. First, the full nonlinear PDE model is calibrated using a single steady-state experiment. The model is then linearized around an operating point in order to design linear PI controllers. Two classical control strategies (local upstream control and distant downstream control) are tested and compared on the canal. The experimental results show the effectiveness of the model.
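A discrete PI law of the kind tuned on the linearized model can be sketched as follows. The first-order pool model, gains and rates here are hypothetical stand-ins for the Saint-Venant-based design, chosen only to show the closed loop at work:

```python
def simulate_pi_level_control(h_ref=2.0, steps=500, dt=1.0):
    """Regulate the water level of a single canal pool (surface area A)
    with a PI controller acting on the upstream inflow; q_out is a
    constant downstream offtake (all values hypothetical)."""
    A, q_out = 100.0, 5.0       # pool surface area [m^2] and demand [m^3/s]
    kp, ki = 50.0, 5.0          # PI gains
    h, integral = 0.0, 0.0
    for _ in range(steps):
        error = h_ref - h
        integral += error * dt
        u = kp * error + ki * integral          # commanded inflow [m^3/s]
        h += (u - q_out) * dt / A               # mass balance for the pool
    return h
```

The integral term is what rejects the constant offtake: at steady state the proportional term vanishes and the integral alone supplies the demanded flow, so the level settles exactly at the setpoint.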
Abstract:
Nowadays, one of the most ambitious challenges in soft robotics is the development of actuators capable of achieving performance comparable to skeletal muscles. Scientists have been working for decades, inspired by Nature, to mimic both their complex structure and their perfectly balanced features in terms of linear contraction, force-to-weight ratio, scalability and flexibility. The present Thesis, contextualized within the FET Open Horizon 2020 project MAGNIFY, aims to develop a new family of innovative flexible actuators in the field of soft robotics. For the realization of this actuator, a biomimetic approach has been chosen, drawing inspiration from skeletal muscle. Its hierarchical fibrous structure was mimicked employing the electrospinning technique, while the contraction of sarcomeres was designed around chains of molecular machines, supramolecular systems capable of performing movements useful for executing specific tasks. The first part deals with the design and production of the basic unit of the artificial muscle, the artificial myofibril, consisting of a novel electrospun core-shell nanofiber with an elastomeric shell and an electrically conductive core, coupled with a conductive coating, for the realization of which numerous strategies have been investigated. The second part deals with the integration of molecular machines (provided by the project partners) inside these artificial myofibrils, preceded by the study of several model molecules aimed at simulating the presence of these molecular machines during the initial phases of the project. The last part concerns the realization of an electrospun multiscale hierarchical structure aimed at reproducing the entire muscle morphology and fibrous organization.
These research lines will be joined together in the near future like the pieces of a puzzle, recreating the artificial actuator most similar to biological muscle ever made: one composed of millions of electrically activated artificial myofibrils, in which the nano-scale movement of molecular machines will be incrementally amplified into the macro-scale contraction of the artificial muscle.
Abstract:
This thesis explores methods based on the free energy principle and active inference for modelling cognition. Active inference is an emerging framework for designing intelligent agents in which psychological processes are cast in terms of Bayesian inference. Here, I appeal to it to test the design of a set of cognitive architectures via simulation. These architectures are defined in terms of generative models where an agent executes a task under the assumption that all cognitive processes aspire to the same objective: the minimization of variational free energy. Chapter 1 introduces the free energy principle and its assumptions about self-organizing systems. Chapter 2 describes how a minimal form of cognition able to achieve autopoiesis can emerge from the mechanics of self-organization. In chapter 3 I present the method by which I formalize generative models for action and perception. The proposed architectures provide a more biologically plausible account of complex cognitive processing that entails deep temporal features. I then present three simulation studies that aim to show different aspects of cognition, their associated behavior and the underlying neural dynamics. In chapter 4, the first study proposes an architecture that represents the visuomotor system for the encoding of actions during action observation, understanding and imitation. In chapter 5, the generative model is extended and lesioned to simulate brain damage and the neuropsychological patterns observed in apraxic patients. In chapter 6, the third study proposes an architecture for cognitive control and the modulation of attention for action selection. Finally, I argue how active inference can provide a formal account of information processing in the brain and how the adaptive capabilities of the simulated agents are a mere consequence of the architecture of the generative models. Cognitive processing, then, becomes an emergent property of the minimization of variational free energy.
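For a single discrete hidden state and one fixed observation, the quantity being minimized can be written out directly. This is a toy sketch of the variational free energy, not one of the thesis's generative models:

```python
import numpy as np

def variational_free_energy(q, prior, likelihood):
    """F = E_q[ln q(s) - ln p(o, s)] for a discrete hidden state s and a
    fixed observation o, with joint p(o, s) = likelihood(o|s) * prior(s).
    F upper-bounds surprise: F >= -ln p(o), with equality exactly when
    q is the true posterior p(s|o)."""
    joint = likelihood * prior
    return float(np.sum(q * (np.log(q) - np.log(joint))))
```

For example, with prior (0.5, 0.5) and likelihood (0.9, 0.1), the exact posterior (0.9, 0.1) attains F = -ln p(o) = ln 2, and any other belief q scores strictly higher.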
Abstract:
A three-dimensional Direct Finite Element procedure is presented here which takes into account most of the factors affecting the interaction problem of the dam-water-foundation system, whilst keeping the computational cost at a reasonable level by introducing some simplifying hypotheses. A truncated domain is defined, and the dynamic behaviour of the system is treated as a wave-scattering problem where the presence of the dam perturbs an original free-field system. The truncated boundaries of the rock foundation are enclosed by a set of free-field one-dimensional and two-dimensional systems which transmit the effective forces to the main model and apply absorbing viscous boundaries to ensure radiation damping. The water domain is treated as an added mass moving with the dam. A strategy is proposed to keep the viscous dampers at the boundaries unloaded during the initial phases of analysis, when the static loads are initialised, and thus avoid spurious displacements. A focus is given to the nonlinear behaviour of the rock foundation, with concentrated plasticity along the natural discontinuities of the rock mass, immersed in an otherwise linear elastic medium with Rayleigh damping. The entire procedure is implemented in the commercial software Abaqus®, whose base code is enriched with specific user subroutines where needed. All the extra coding is attached to the Thesis and tested against analytical results and simple examples. Possible rock wedge instabilities induced by intense ground motion, which are not easily investigated within a comprehensive model of the dam-water-foundation system, are treated separately with a simplified decoupled dynamic approach derived from the classical Newmark method, integrated with FE calculation of the dam thrust on the wedges during the earthquake.
Both the described approaches are applied to the case study of the Ridracoli arch-gravity dam (Italy) in order to investigate its seismic response to the Maximum Credible Earthquake (MCE) in a full reservoir condition.
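The decoupled Newmark-type calculation for wedge sliding can be sketched as a rigid-block integration. This is a minimal one-way-sliding version with a constant yield acceleration, ignoring the time-varying dam thrust that the thesis adds via FE results:

```python
import numpy as np

def newmark_sliding_displacement(accel, dt, a_yield):
    """Cumulative sliding displacement of a rigid block: sliding starts
    when the driving acceleration exceeds the yield acceleration and
    stops when the relative velocity returns to zero."""
    v, d = 0.0, 0.0
    for a in accel:
        if v > 0.0 or a > a_yield:
            v = max(v + (a - a_yield) * dt, 0.0)   # relative velocity
            d += v * dt                             # accumulate slip
    return d

# square pulse: 2 m/s^2 for 1 s against a 1 m/s^2 yield acceleration gives
# a triangular relative-velocity history and 1.0 m of total slip analytically
pulse = np.concatenate([np.full(100, 2.0), np.zeros(150)])
```

The same loop, fed with the FE-computed thrust converted into an equivalent driving acceleration, would reproduce the coupled variant used in the thesis.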
Abstract:
In this thesis, new classes of models for multivariate linear regression, defined by finite mixtures of seemingly unrelated contaminated normal regression models and seemingly unrelated contaminated normal cluster-weighted models, are illustrated. The main difference between these families is that the covariates are treated as fixed in the former class of models and as random in the latter. Thus, in cluster-weighted models the assignment of the data points to the unknown groups of observations also depends on the covariates. These classes extend mixture-based regression analysis to the modelling of multivariate and correlated responses in the presence of mild outliers, and allow a different vector of regressors to be specified for the prediction of each response. Expectation-conditional maximisation algorithms for the calculation of the maximum likelihood estimates of the model parameters have been derived. As the number of free parameters increases quadratically with the number of responses and covariates, analyses based on the proposed models can become infeasible in practical applications. These problems have been overcome by introducing constraints on the elements of the covariance matrices, following an approach based on their eigen-decomposition. The performance of the new models has been studied by simulation and on real datasets in comparison with other models. In order to gain additional flexibility, mixtures of seemingly unrelated contaminated normal regression models have also been specified so as to allow the mixing proportions to be expressed as functions of concomitant covariates. An illustration of the new models with concomitant variables and a study on housing tension in the municipalities of the Emilia-Romagna region, based on different types of multivariate linear regression models, have been performed.
Abstract:
In this work, we explore and demonstrate the potential for modeling and classification using quantile-based distributions, which are random variables defined by their quantile function. In the first part, we formalize a least squares estimation framework for the class of linear quantile functions, leading to unbiased and asymptotically normal estimators. Among the distributions with a linear quantile function, we focus on the flattened generalized logistic distribution (fgld), which offers a wide range of distributional shapes. A novel naïve-Bayes classifier is proposed that utilizes the fgld estimated via least squares, and through simulations and applications we demonstrate its competitiveness against state-of-the-art alternatives. In the second part, we consider the Bayesian estimation of quantile-based distributions. We introduce a factor model with independent latent variables, which are distributed according to the fgld. Similar to the independent factor analysis model, this approach accommodates flexible factor distributions while using fewer parameters. The model is presented within a Bayesian framework, an MCMC algorithm for its estimation is developed, and its effectiveness is illustrated with data from the European Social Survey. The third part focuses on depth functions, which extend the concept of quantiles to multivariate data by imposing a center-outward ordering in the multivariate space. We investigate the recently introduced integrated rank-weighted (IRW) depth function, which is based on the distribution of random spherical projections of the multivariate data. This depth function proves to be computationally efficient and, to increase its flexibility, we propose different methods to explicitly model the projected univariate distributions.
Its usefulness is shown in classification tasks: the maximum depth classifier based on the IRW depth is proven to be asymptotically optimal under certain conditions, and classifiers based on the IRW depth are shown to perform well in simulated and real data experiments.
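Least-squares estimation for a linear quantile function reduces to regressing the order statistics on the basis evaluated at plotting positions. The sketch below uses a hypothetical two-term logistic basis (a special case of the flatter shapes the fgld allows), not the full fgld:

```python
import numpy as np

def fit_linear_quantile(x, basis_funcs):
    """Estimate theta in Q(p) = theta . g(p) by least squares: the i-th
    order statistic of the sample is matched to Q(i / (n + 1))."""
    xs = np.sort(np.asarray(x, dtype=float))
    p = np.arange(1, len(xs) + 1) / (len(xs) + 1)
    G = np.column_stack([g(p) for g in basis_funcs])
    theta, *_ = np.linalg.lstsq(G, xs, rcond=None)
    return theta

# hypothetical basis: a location term plus a logit slope
basis = [lambda p: np.ones_like(p), lambda p: np.log(p / (1 - p))]
```

Because the design matrix depends only on the plotting positions, the estimator is a fixed linear function of the order statistics, which is what makes its sampling theory tractable.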
Abstract:
This study investigated the effect of simulated microwave disinfection (SMD) on the linear dimensional changes, hardness and impact strength of acrylic resins under different polymerization cycles. Metal dies with reference points were embedded in flasks with dental stone. Samples of Classico and Vipi acrylic resins were made following the manufacturers' recommendations. The assessed polymerization cycles were: A) water bath at 74 °C for 9 h; B) water bath at 74 °C for 8 h with the temperature increased to 100 °C for 1 h; C) water bath at 74 °C for 2 h with the temperature increased to 100 °C for 1 h; and D) water bath at 120 °C and a pressure of 60 pounds. Linear dimensional distances in length and width were measured after SMD and after water storage at 37 °C for 7 and 30 days using an optical microscope. SMD was carried out with the samples immersed in 150 mL of water in a microwave oven (650 W for 3 min). A load of 25 gf for 10 s was used in the hardness test. The Charpy impact test was performed with 40 kpcm. Data were submitted to ANOVA and Tukey's test (5%). The Classico resin was dimensionally stable in length in the A and D cycles for all periods, while the Vipi resin was stable in the A, B and C cycles for all periods. The Classico resin was dimensionally stable in width in the C and D cycles for all periods, and the Vipi resin was stable in all cycles and periods. The hardness values for the Classico resin were stable in all cycles and periods, while the Vipi resin was stable only in the C cycle for all periods. Impact strength values for the Classico resin were stable in the A, C and D cycles for all periods, while the Vipi resin was stable in all cycles and periods. SMD had different effects on the linear dimensional changes, hardness and impact strength of acrylic resins submitted to different polymerization cycles when both SMD and water storage were considered.
Abstract:
In acquired immunodeficiency syndrome (AIDS) studies it is quite common to observe viral load measurements collected irregularly over time. Moreover, these measurements can be subject to upper and/or lower detection limits, depending on the quantification assay. A further complication arises when these continuous repeated measures exhibit heavy-tailed behavior. For such data structures, we propose a robust censored linear model based on the multivariate Student's t-distribution. To account for the autocorrelation among irregularly observed measures, a damped exponential correlation structure is employed. An efficient expectation-maximization (EM)-type algorithm is developed for computing the maximum likelihood estimates, yielding as by-products the standard errors of the fixed effects and the log-likelihood function. The proposed algorithm uses closed-form expressions at the E-step that rely on formulas for the mean and variance of a truncated multivariate Student's t-distribution. The methodology is illustrated through an application to a Human Immunodeficiency Virus/AIDS (HIV/AIDS) study and several simulation studies.
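The damped exponential correlation (DEC) structure mentioned above is simple to write down for irregular measurement times. A sketch, assuming 0 < phi < 1 and damping parameter theta > 0:

```python
import numpy as np

def dec_correlation(times, phi, theta):
    """DEC matrix for irregularly spaced measurement times:
    corr(t_i, t_j) = phi ** |t_i - t_j| ** theta.
    theta = 1 recovers a continuous-time AR(1) structure; theta -> 0
    approaches compound symmetry (constant off-diagonal correlation)."""
    lags = np.abs(np.subtract.outer(times, times))
    R = phi ** (lags ** theta)
    np.fill_diagonal(R, 1.0)   # guard the theta -> 0 limit at zero lag
    return R
```

Because the correlation depends on the actual time gaps rather than measurement order, the same two parameters handle arbitrarily irregular visit schedules.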
Abstract:
The basic reproduction number is a key parameter in the mathematical modelling of transmissible diseases. From the stability analysis of the disease-free equilibrium, applying the Routh-Hurwitz criteria yields a threshold, which is called the basic reproduction number. However, applying spectral radius theory to the next generation matrix provides a different expression for the basic reproduction number, namely the square root of the previously found formula. If the spectral radius of the next generation matrix is defined as the geometric mean of the partial reproduction numbers, while the basic reproduction number is the product of these partial numbers, then both methods provide the same expression. To illustrate this statement, dengue transmission models, with and without transovarial transmission, are considered as case studies. Tuberculosis transmission and sexually transmitted infection models are taken as further examples.
Abstract:
This study examined the influence of three polymerization cycles (1: heat cure, long cycle; 2: heat cure, short cycle; and 3: microwave activation) on the linear dimensions of three denture base resins, immediately after deflasking and after 30 days of storage in distilled water at 37 ± 2 °C. The acrylic resins used were Clássico, Lucitone 550 and Acron MC. The first two resins were submitted to all three polymerization cycles, while the Acron MC resin was cured by microwave activation only. The samples had three marks and dimensions of 65 mm in length, 10 mm in width and 3 mm in thickness. Twenty-one test specimens were fabricated for each combination of resin and cure cycle, and they were submitted to three linear dimensional evaluations for two positions (A and B). The changes were evaluated using a microscope. The results indicated that all acrylic resins, regardless of the cure cycle, showed increased linear dimensions after 30 days of storage in water. The composition of the acrylic resin affected the results more than the cure cycle did, and the conventional acrylic resins (Lucitone 550 and Clássico) cured by microwave activation presented results similar to those of the resin specific for microwave activation.
Abstract:
BACKGROUND: Changes in heart rate during the rest-exercise transition can be characterized by the application of mathematical calculations, such as deltas over 0-10 and 0-30 seconds to infer parasympathetic activity, and linear regression and deltas applied to data in the 60-240 second range to infer sympathetic activity. The objective of this study was to test the hypothesis that young and middle-aged subjects have different heart rate responses to moderate and intense exercise, as captured by these different mathematical calculations. METHODS: Seven middle-aged men and ten young men, all apparently healthy, were subjected to constant load tests (intense and moderate) on a cycle ergometer. The heart rate data were submitted to delta analysis (0-10, 0-30 and 60-240 seconds) and simple linear regression (60-240 seconds). The parameters obtained from the simple linear regression analysis were the intercept and the slope angle. We used the Shapiro-Wilk test to check the distribution of the data and the unpaired t test for comparisons between groups. The level of statistical significance was 5%. RESULTS: The intercept and the 0-10 second delta were lower in the middle-aged group at both loads tested, and the slope angle was lower in moderate exercise in the middle-aged group. CONCLUSION: The young subjects presented a greater magnitude of vagal withdrawal in the initial stage of the HR response during constant load exercise and a higher speed of adjustment of the sympathetic response in moderate exercise.
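The calculations named in this abstract can be sketched on a sampled heart-rate series. The sampling grid and helper names below are hypothetical; this only illustrates the arithmetic, not the study's software:

```python
import numpy as np

def transition_indices(t, hr):
    """0-10 s and 0-30 s deltas (vagal withdrawal) and the 60-240 s
    linear regression (intercept and slope angle) from a heart-rate
    time series during the rest-exercise transition."""
    hr_at = lambda s: float(hr[np.argmin(np.abs(t - s))])   # nearest sample
    delta10 = hr_at(10) - hr_at(0)
    delta30 = hr_at(30) - hr_at(0)
    window = (t >= 60) & (t <= 240)
    slope, intercept = np.polyfit(t[window], hr[window], 1)  # 60-240 s fit
    return delta10, delta30, intercept, slope
```

A lower delta in the first 10 s then reads directly as a smaller magnitude of vagal withdrawal, and a smaller slope as a slower sympathetic adjustment.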