904 results for Generalized linear mixed models


Relevance: 100.00%

Abstract:

Two hypotheses for how conditions experienced by larval mosquitoes affect vectorial capacity make opposite predictions about the relationship between adult size and frequency of infection with vector-borne pathogens. Competition among larvae produces small adult females. The competition-susceptibility hypothesis postulates that small females are more susceptible to infection and predicts that frequency of infection should decrease with size. The competition-longevity hypothesis postulates that small females have lower longevity and a lower probability of becoming competent to transmit the pathogen, and thus predicts that frequency of infection should increase with size. We tested these hypotheses for Aedes aegypti in Rio de Janeiro, Brazil, during a dengue outbreak. In the laboratory, longevity increases with size, then decreases at the largest sizes. For field-collected females, generalised linear mixed model comparisons showed that a model with a linear increase in frequency of dengue with size yielded the lowest value of Akaike's information criterion corrected for small sample sizes (AICc). Consensus prediction of three competing models indicated that frequency of infection increases monotonically with female size, consistent with the competition-longevity hypothesis. Site frequency of infection was not significantly related to site mean size of females. Thus, our data indicate that uncrowded, low-competition conditions for larvae produce the females that are most likely to be important vectors of dengue. More generally, ecological conditions, particularly crowding and intraspecific competition among larvae, are likely to affect vector-borne pathogen transmission in nature, in this case via effects on the longevity of the resulting adults. Heterogeneity among individual vectors in likelihood of infection is a generally important outcome of ecological conditions impacting vectors as larvae.

Relevance: 100.00%

Abstract:

Purpose: To evaluate whether the correlation between in vitro bond strength data and estimated clinical retention rates of cervical restorations after two years depends on pooled data obtained from multicenter studies or single-test data. Materials and Methods: Pooled mean data for six dentin adhesive systems (Adper Prompt L-Pop, Clearfil SE, OptiBond FL, Prime & Bond NT, Single Bond, and Scotchbond Multipurpose) and four laboratory methods (macroshear, microshear, macrotensile and microtensile bond strength test) (Scherrer et al, 2010) were correlated to estimated pooled two-year retention rates of Class V restorations using the same adhesive systems. For bond strength data from a single test institute, the literature search in SCOPUS revealed one study that tested all six adhesive systems (microtensile) and two that tested five of the six systems (microtensile, macroshear). The correlation was determined with a database designed to perform a meta-analysis on the clinical performance of cervical restorations (Heintze et al, 2010). The clinical data were pooled and adjusted in a linear mixed model, taking the study effect, dentin preparation, type of isolation and bevelling of enamel into account. A regression analysis was carried out to evaluate the correlation between clinical and laboratory findings. Results: The results of the regression analysis for the pooled data revealed that only the macrotensile (adjusted R2 = 0.86) and microtensile tests (adjusted R2 = 0.64), but not the shear and the microshear tests, correlated well with the clinical findings. As regards the data from a single-test institute, the correlation was not statistically significant. Conclusion: Macrotensile and microtensile bond strength tests showed an adequate correlation with the retention rate of cervical restorations after two years. 
Bond strength tests should be carried out by different operators and/or research institutes to determine the reliability and technique sensitivity of the material under investigation.
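The correlation between pooled laboratory means and clinical retention is summarized above by adjusted R². A minimal sketch of that computation, using invented bond-strength and retention numbers rather than the study's data:

```python
import numpy as np

def adjusted_r2(y, y_hat, n_params):
    """Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
    y, y_hat = np.asarray(y), np.asarray(y_hat)
    n = len(y)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - n_params - 1)

# Hypothetical pooled means: microtensile bond strength (MPa) per adhesive
# vs estimated two-year retention rate (%) -- illustrative numbers only
strength = np.array([18.0, 35.0, 42.0, 28.0, 31.0, 45.0])
retention = np.array([62.0, 85.0, 92.0, 78.0, 80.0, 95.0])

slope, intercept = np.polyfit(strength, retention, 1)
pred = slope * strength + intercept
print(round(adjusted_r2(retention, pred, 1), 2))
```

With only six adhesives, the adjustment for the number of predictors is substantial, which is one reason pooled multicenter means correlate more stably than single-institute values.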

Relevance: 100.00%

Abstract:

We present models predicting the potential distribution of a threatened ant species, Formica exsecta Nyl., in the Swiss National Park (SNP). Data to fit the models were collected according to a random-stratified design with an equal number of replicates per stratum. The basic aim of such a sampling strategy is to allow the formal testing of biological hypotheses about those factors most likely to account for the distribution of the modeled species. The stratifying factors used in this study were vegetation, slope angle and slope aspect, the latter two being used as surrogates of solar radiation, considered one of the basic requirements of F. exsecta. Results show that, although the basic stratifying predictors account for more than 50% of the deviance, the incorporation of additional non-spatially explicit predictors into the model, as measured in the field, allows for increased model performance (up to nearly 75%). However, this was not corroborated by permutation tests. Implementation on a national scale was made for one model only, due to the difficulty of obtaining similar predictors on this scale. The resulting map on the national scale suggests that the species might once have had a broader distribution in Switzerland. Its particular abundance within the SNP might be related to habitat fragmentation and vegetation transformation outside the SNP boundaries.
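A random-stratified design with an equal number of replicates per stratum, as described above, can be sketched in a few lines of pandas; the strata labels and replicate count below are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
# Candidate sampling sites, each labelled by the three stratifying factors
sites = pd.DataFrame({
    "vegetation": rng.choice(["meadow", "forest_edge", "pasture"], 300),
    "slope_angle": rng.choice(["gentle", "steep"], 300),
    "aspect": rng.choice(["north", "south"], 300),
})
sites["stratum"] = (sites["vegetation"] + "/" + sites["slope_angle"]
                    + "/" + sites["aspect"])

# Random-stratified design: the same number of replicates in every stratum
replicates = 5
sample = sites.groupby("stratum").sample(n=replicates, random_state=0)
print(sample["stratum"].value_counts())
```

Equal replication across strata is what makes the subsequent hypothesis tests on the stratifying factors balanced.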

Relevance: 100.00%

Abstract:

Abiotic factors such as climate and soil determine the species fundamental niche, which is further constrained by biotic interactions such as interspecific competition. To parameterize this realized niche, species distribution models (SDMs) most often relate species occurrence data to abiotic variables, but few SDM studies include biotic predictors to help explain species distributions. Therefore, most predictions of species distributions under future climates assume implicitly that biotic interactions remain constant or exert only minor influence on large-scale spatial distributions, which is also largely expected for species with high competitive ability. We examined the extent to which variance explained by SDMs can be attributed to abiotic or biotic predictors and how this depends on species traits. We fit generalized linear models for 11 common tree species in Switzerland using three different sets of predictor variables: biotic, abiotic, and the combination of both sets. We used variance partitioning to estimate the proportion of the variance explained by biotic and abiotic predictors, jointly and independently. Inclusion of biotic predictors improved the SDMs substantially. The joint contribution of biotic and abiotic predictors to explained deviance was relatively small (~9%) compared to the contribution of each predictor set individually (~20% each), indicating that the additional information on the realized niche brought by adding other species as predictors was largely independent of the abiotic (topo-climatic) predictors. The influence of biotic predictors was relatively high for species preferably growing under low disturbance and low abiotic stress, species with long seed dispersal distances, species with high shade tolerance as juveniles and adults, and species that occur frequently and are dominant across the landscape. 
The influence of biotic variables on SDM performance indicates that community composition and other local biotic factors or abiotic processes not included in the abiotic predictors strongly influence prediction of species distributions. Improved prediction of species' potential distributions in future climates and communities may assist strategies for sustainable forest management.

Relevance: 100.00%

Abstract:

Abstract OBJECTIVE To identify the factors associated with involuntary hospital admissions of technology-dependent children in the municipality of Ribeirão Preto, São Paulo State, Brazil. METHOD A cross-sectional study with a quantitative approach. After an active search, 124 children who met the inclusion criteria (children from birth to age 12) were identified. Data were collected in home visits to the mothers or guardians of the children, through the application of a questionnaire. Data analysis followed the assumptions of the Generalized Linear Models technique. RESULTS 102 technology-dependent children aged between 6 months and 12 years participated in the study, of whom 57% were male. The average number of involuntary hospital admissions in the previous year among the children studied was 0.71 (±1.29). In the final model the following variables were significantly associated with the outcome: age (OR=0.991; CI95%=0.985-0.997) and the number of devices (OR=0.387; CI95%=0.219-0.684), which were characterized as protective factors, and the quantity of medications (OR=1.532; CI95%=1.297-1.810), which represented a risk factor for involuntary hospital admissions in technology-dependent children. CONCLUSION The results provide input for reflection on the process of care for technology-dependent children by supplying an explanatory model for involuntary hospital admissions in this population.
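The reported effects are odds ratios with 95% confidence intervals, i.e. exponentiated model coefficients. A small sketch of that back-transformation; the coefficients below are hypothetical values chosen only to mirror the direction and rough magnitude of the reported ORs, not the study's fitted parameters.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a model coefficient and its standard error to OR (95% CI)."""
    return tuple(round(math.exp(v), 3) for v in (beta, beta - z * se, beta + z * se))

# Hypothetical coefficients mirroring the direction of the reported effects
for name, beta, se in [("age", -0.009, 0.003),
                       ("devices", -0.95, 0.29),
                       ("medications", 0.43, 0.085)]:
    or_, lo, hi = odds_ratio_ci(beta, se)
    effect = "protective" if hi < 1 else ("risk" if lo > 1 else "inconclusive")
    print(f"{name}: OR={or_} (95% CI {lo}-{hi}) -> {effect}")
```

An OR whose whole interval sits below 1 is read as protective, and above 1 as a risk factor, which is exactly how the abstract interprets its three significant variables.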

Relevance: 100.00%

Abstract:

The effect of environment on development and survival of pupae of the necrophagous fly Ophyra albuquerquei Lopes (Diptera, Muscidae). Species of Ophyra Robineau-Desvoidy, 1830 are found on decomposing bodies, usually in the fresh, bloated and decay stages. Ophyra albuquerquei Lopes, for example, can be found in animal carcasses. The influence of environmental factors on the puparia of O. albuquerquei had not previously been evaluated; this work was therefore motivated by the need for models predicting the development of a necrophagous insect as a function of abiotic factors. Colonies of O. albuquerquei were maintained in the laboratory to obtain pupae. On the tenth day of each month, 200 pupae, divided equally into 10 glass jars, were exposed to the environment, and each sample was checked daily for adult emergence. The high survival rate observed suggests that the diets used for rearing the larvae and maintaining the adults were appropriate. The data fitted robust generalized linear models, and, given the high survival, there were no interruptions of O. albuquerquei pupal development within the range of temperatures studied in southern Rio Grande do Sul.

Relevance: 100.00%

Abstract:

Context There are no evidence syntheses available to guide clinicians on when to titrate antihypertensive medication after initiation. Objective To model the blood pressure (BP) response after initiating antihypertensive medication. Data sources Electronic databases including Medline, Embase, the Cochrane Register and reference lists up to December 2009. Study selection Trials that initiated antihypertensive medication as single therapy in hypertensive patients who were either drug naive or had a placebo washout from previous drugs. Data extraction Office BP measurements at a minimum of two-weekly intervals for a minimum of 4 weeks. An asymptotic model of BP response was assumed and non-linear mixed-effects modelling was used to calculate model parameters. Results Eighteen trials that recruited 4168 patients met the inclusion criteria. The time to reach 50% of the maximum estimated BP-lowering effect was 1 week (systolic 0.91 weeks, 95% CI 0.74 to 1.10; diastolic 0.95, 0.75 to 1.15). Models incorporating drug class as a source of variability did not improve fit of the data. Incorporating the presence of a titration schedule improved model fit for both systolic and diastolic pressure. Titration increased both the predicted maximum effect and the time taken to reach 50% of the maximum (systolic 1.2 vs 0.7 weeks; diastolic 1.4 vs 0.7 weeks). Conclusions Estimates of the maximum efficacy of antihypertensive agents can be made early after starting therapy. This knowledge will guide clinicians in deciding whether a newly started antihypertensive agent is likely to be effective at controlling BP.
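An asymptotic (approach-to-maximum) response model of this kind can be illustrated by fitting a curve of the form effect(t) = Emax·t/(ET50 + t), where ET50 is the time to half the maximum effect. The sketch below uses synthetic BP data, not the review's patient data, and a plain curve fit in place of the non-linear mixed-effects machinery.

```python
import numpy as np
from scipy.optimize import curve_fit

def asymptotic(t, emax, et50):
    """BP reduction rising toward emax; et50 = weeks to half the maximum effect."""
    return emax * t / (et50 + t)

# Synthetic office systolic BP drops (mmHg) at 2-weekly visits, generated
# from emax = 12 mmHg and et50 = 0.9 weeks plus measurement noise
rng = np.random.default_rng(1)
weeks = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
drop = asymptotic(weeks, 12.0, 0.9) + rng.normal(0, 0.3, weeks.size)

(emax_hat, et50_hat), _ = curve_fit(asymptotic, weeks, drop, p0=(10.0, 1.0))
print(round(emax_hat, 1), "mmHg;", round(et50_hat, 2), "weeks")
```

Because most of the response is reached within a few multiples of ET50, a fitted ET50 near one week supports the paper's point that maximum efficacy can be judged early after starting therapy.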

Relevance: 100.00%

Abstract:

The northern Humboldt Current system (NHCS) off Peru is one of the most productive marine regions in the world. It represents less than 0.1% of the world ocean surface but presently sustains about 10% of the world fish catch, with the Peruvian anchovy, or anchoveta Engraulis ringens, as its emblematic fish resource. Compared with other eastern boundary upwelling systems, the higher fish productivity of the NHCS cannot be explained by a correspondingly higher primary productivity. On the other hand, the NHCS is the region where El Niño, and climate variability in general, is most notable. Also, surface oxygenated waters overlie an intense and extremely shallow Oxygen Minimum Zone (OMZ). In this context, the main objective of this study is to better understand the trophic flows in the NHCS using both stomach content and stable isotope analyses. The study focuses on a variety of organisms, from low trophic levels such as zooplankton to top predators (seabirds and fur seals). The approach combines long-term and specific studies on emblematic species such as anchoveta and sardine Sardinops sagax with a more inclusive analysis considering the 'global' food web in recent years (2008-2012) using stable isotope analysis. Revisiting anchovy and sardine, we show that, whereas phytoplankton largely dominated anchoveta and sardine diets in terms of numerical abundance, the carbon content of prey items indicated that zooplankton was by far the most important dietary component. Indeed, for anchoveta, euphausiids contributed 67.5% of dietary carbon, followed by copepods (26.3%). Selecting the largest prey, the euphausiids, provides an energetic advantage for anchoveta in an ecosystem where oxygen depletion imposes strong metabolic constraints on pelagic fish. Sardine feed on smaller zooplankton than do anchoveta, with sardine diet consisting of smaller copepods and fewer euphausiids than anchoveta diet. 
Hence, trophic competition between sardine and anchovy in the northern Humboldt Current system is minimized by their partitioning of the zooplankton food resource based on prey size, as has been reported in other systems. These results suggest an ecological role for pelagic fish that challenges previous understanding of their position in the food web (zooplanktophagous instead of phytophagous), as well as the functioning and trophic models of the NHCS. Finally, to obtain a more comprehensive picture of the relative trophic position of the main NHCS components, we used stable isotope analyses. For that purpose we analyzed the δ13C and δ15N stable isotope values of thirteen taxonomic categories collected off Peru from 2008 to 2011, i.e., zooplankton, fish, squids and air-breathing top predators. The δ15N signature was strongly affected by species, body length and latitude. Along the Peruvian coast, the OMZ becomes more intense and shallower south of ~7.5°S, affecting the baseline nitrogen stable isotopes. Employing a linear mixed-effects modelling approach that takes the latitudinal and body-length effects into account, we provide a new vision of the relative trophic position of key ecosystem components. We also confirm stomach content-based results on anchoveta Engraulis ringens and highlight the potentially remarkable importance of an often neglected ecosystem component, the squat lobster Pleuroncodes monodon. Indeed, our results support the hypothesis that this species forages to some extent on fish eggs and larvae and can thus prey on the first life stages of exploited species. However, the δ13C values of these two species suggest that anchoveta and squat lobster do not exactly share the same habitat, which would potentially reduce direct competition and/or predation.
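A linear mixed-effects model of δ15N with body-length and latitude fixed effects and a taxon-level random intercept, in the spirit of the approach described above, might be sketched as follows; the taxa, coefficients and sample sizes are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
taxa = ["anchoveta", "sardine", "squat_lobster", "jumbo_squid",
        "seabird", "fur_seal"]
rows = []
for taxon in taxa:
    group_effect = rng.normal(0, 0.5)        # taxon-level random intercept
    for _ in range(40):
        length = rng.uniform(5, 40)          # body length, cm
        latitude = rng.uniform(-18, -4)      # degrees (negative = south)
        d15n = (10 + 0.05 * length - 0.1 * latitude
                + group_effect + rng.normal(0, 0.4))
        rows.append((taxon, length, latitude, d15n))
df = pd.DataFrame(rows, columns=["taxon", "length", "latitude", "d15N"])

# Fixed effects for body length and latitude, random intercept per taxon
fit = smf.mixedlm("d15N ~ length + latitude", df, groups=df["taxon"]).fit()
print(fit.params[["length", "latitude"]].round(3))
```

Controlling for length and latitude in this way is what lets the isotope signatures be compared across taxa despite the baseline shift imposed by the OMZ.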

Relevance: 100.00%

Abstract:

This paper gives a practical illustration of three tools that allow the actuary to define tariff groups and estimate risk premiums in the ratemaking process for non-life insurance. The first is segmentation analysis (CHAID and XAID), first used in 1997 by UNESPA on its joint motor portfolio. The second is a stepwise selection procedure using the distance-based regression model. The third is a procedure using the well-known generalized linear model, which represents the most modern technique in the actuarial literature. With the latter, by combining different link functions and error distributions, we can obtain the classical additive and multiplicative models.

Relevance: 100.00%

Abstract:

Statistical models allow the representation of data sets and the estimation and/or prediction of the behavior of a given variable through its interaction with the other variables involved in a phenomenon. Among such models are the autoregressive state-space models (ARSS) and the linear regression models (LR), which allow the quantification of the relationships among soil-plant-atmosphere system variables. To compare the quality of ARSS and LR models for modeling the relationships between soybean yield and soil physical properties, Akaike's Information Criterion (AIC), which provides a coefficient for the selection of the best model, was used in this study. The data sets were sampled in a Rhodic Acrudox soil, along a spatial transect with 84 points spaced 3 m apart. At each sampling point, soybean samples were collected for yield quantification. At the same site, soil penetration resistance was also measured, and soil samples were collected to measure soil bulk density in the 0-0.10 m and 0.10-0.20 m layers. Results showed autocorrelation and a cross-correlation structure between soybean yield and soil penetration resistance data. Soil bulk density data, however, were only autocorrelated in the 0-0.10 m layer and were not cross-correlated with soybean yield. Based on Akaike's Information Criterion, the autoregressive state-space models were more efficient than the equivalent simple and multiple linear regression models: their AIC values were lower than those of the regression models for all combinations of explanatory variables.

Relevance: 100.00%

Abstract:

Is it possible to build predictive models (PMs) of soil particle-size distribution (psd) in a region with complex geology and a young and unstable land surface? The main objective of this study was to answer this question. A set of 339 soil samples from a small slope catchment in Southern Brazil was used to build PMs of psd in the surface soil layer. Multiple linear regression models were constructed using terrain attributes (elevation, slope, catchment area, convergence index, and topographic wetness index). The PMs explained more than half of the data variance, a performance similar to (or even better than) that of the conventional soil mapping approach. For some size fractions, PM performance reached 70%. The largest uncertainties were observed in geologically more complex areas, so significant improvements in the predictions can only be achieved if accurate geological data are made available. Meanwhile, PMs built on terrain attributes are efficient in predicting the psd of soils in regions of complex geology.

Relevance: 100.00%

Abstract:

Aims: To assess the potential distribution of an obligate seeder and active pyrophyte, Cistus salviifolius, a vulnerable species in the Swiss Red List; to derive scenarios by changing the fire return interval; and to discuss the results from a conservation perspective. A more general aim is to assess the impact of fire as a natural factor influencing the vegetation of the southern slopes of the Alps. Location: Alps, southern Switzerland. Methods: Presence-absence data to fit the model were obtained from the most recent field mapping of C. salviifolius. The quantitative environmental predictors used in this study include topographic, climatic and disturbance (fire) predictors. Models were fitted by logistic regression and evaluated by jackknife and bootstrap approaches. Changes in fire regime were simulated by increasing the return interval of fire (simulating longer periods without fire). Two scenarios were considered: no fire in the past 15 years, or no fire in the past 35 years. Results: Rock cover, slope, topographic position, potential evapotranspiration and time elapsed since the last fire were selected in the final model. The Nagelkerke R2 of the model for C. salviifolius was 0.57 and the jackknife area under the curve (AUC) was 0.89. The bootstrap evaluation revealed model robustness. Increasing the fire return interval to 15 or 35 years reduced the modelled C. salviifolius population by 30% and 40%, respectively. Main conclusions: Although fire plays a significant role, topography and rock cover appear to be the most important predictors, suggesting that the distribution of C. salviifolius in the southern Swiss Alps is closely related to the availability of presumably competition-free sites, such as emerging bedrock, ridge locations or steep slopes. Fire more likely plays a secondary role, allowing C. salviifolius to extend its occurrence temporarily by increasing germination rates and reducing competition from surrounding vegetation. To maintain a viable dormant seed bank for C. salviifolius, conservation managers should consider carrying out vegetation clearing and managing wildfire propagation to reduce competition and ensure sufficient recruitment for this species.

Relevance: 100.00%

Abstract:

Ground-state instability to bond alternation in long linear chains is considered from the point of view of valence-bond (VB) theory. This instability is viewed as the consequence of a long-range order (LRO) which is expected if the ground state is reasonably described in terms of the Kekulé states (with nearest-neighbor singlet pairing). It is argued that the bond alternation and associated LRO predicted by this simple VB picture is retained for certain linear Heisenberg models; many-body VB calculations on spin s = 1/2 and s = 1 chains are carried out in a test of this argument.

Relevance: 100.00%

Abstract:

Background: The MLPA method is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results: Through simulation studies we have shown that our proposed method outperforms two existing methods based on simple threshold rules or iterative regression. We have illustrated the method using a controlled MLPA assay, in which targeted regions are variable in copy number in individuals suffering from disorders such as Prader-Willi syndrome, DiGeorge syndrome or autism, and in which our method showed the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds to decide whether a region is altered. These thresholds are specific for each individual and incorporate the experimental variability, resulting in improved sensitivity and specificity, as the examples with real data revealed.
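A per-sample threshold based on a normal tolerance interval, conceptually similar to the approach described above (the paper's mixed-model normalization is not reproduced here), might look like this; the coverage, confidence and probe ratios are illustrative only.

```python
import numpy as np
from scipy import stats

def tolerance_threshold(ratios, coverage=0.95, confidence=0.95):
    """Two-sided normal tolerance interval (Howe's approximation) for
    per-sample dosage ratios; probes outside it are flagged as altered."""
    x = np.asarray(ratios, dtype=float)
    n, mean, sd = x.size, x.mean(), x.std(ddof=1)
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - confidence, n - 1)
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)
    return mean - k * sd, mean + k * sd

rng = np.random.default_rng(9)
normal_probes = rng.normal(1.0, 0.05, 40)   # dosage ratio ~1.0 = two copies
lo, hi = tolerance_threshold(normal_probes)
print(round(lo, 2), round(hi, 2))
print("flag heterozygous deletion (ratio 0.5):", 0.5 < lo)
```

Because the interval width scales with each sample's own error variability, noisy samples get wider thresholds, which is the key idea behind sample-specific sensitivity and specificity.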