864 results for MEAN-VARIANCE CONTROL
Abstract:
Foot-and-mouth disease (FMD) is a highly contagious disease that caused several large outbreaks in Europe in the last century. The last important outbreak in Switzerland took place in 1965/66 and affected more than 900 premises; more than 50,000 animals were slaughtered. Large-scale emergency vaccination of the cattle and pig population was applied to control the epidemic. In recent years, many studies have used infectious disease models to assess the impact of different disease control measures, including models developed for diseases exotic to the specific region of interest. Often, the absence of real outbreak data makes validation of such models impossible. This study aimed to evaluate whether a spatial, stochastic simulation model (the Davis Animal Disease Simulation model) can predict the course of a Swiss FMD epidemic based on the available historical input data on population structure, contact rates, epidemiology of the virus, and quality of the vaccine. In addition, the potential outcome of the 1965/66 FMD epidemic without application of vaccination was investigated. When the model outcomes were compared with reality, only the largest 10% of the simulated outbreaks approximated the number of animals culled, and the simulation model greatly overestimated the number of culled premises. While the model could not reproduce the duration of the 1965/66 epidemic well, it accurately estimated the size of the infected area. Without vaccination, the model predicted a much higher mean number of culled animals than with vaccination, demonstrating that vaccination was likely crucial in controlling the Swiss FMD outbreak of 1965/66. The study demonstrated the feasibility of analyzing historical outbreak data with modern analytical tools. However, it also confirmed that even a most carefully parameterized model cannot capture all eventualities of a real epidemic. Decision makers therefore need to be aware that infectious disease models are useful tools to support the decision-making process, but their results are not as valuable as real observations and should always be interpreted with caution.
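To make the kind of premises-level stochastic simulation described above concrete, the following is a minimal chain-binomial SIR sketch in Python that compares outbreak sizes with and without vaccination. It is not the Davis Animal Disease Simulation model; all parameter values (transmission rate, removal rate, vaccine coverage and efficacy) are illustrative assumptions.

```python
import numpy as np

def simulate_outbreak(n_premises=10_000, beta=0.35, gamma=0.2,
                      vacc_coverage=0.0, vacc_efficacy=0.8,
                      n_days=365, seed=None):
    """Chain-binomial stochastic SIR at the premises level.

    Each day a susceptible premises escapes infection with probability
    exp(-beta * I / N); vaccination scales down the effective
    susceptibility. All parameters are illustrative, not calibrated.
    """
    rng = np.random.default_rng(seed)
    S, I, R = n_premises - 1, 1, 0
    susceptibility = 1.0 - vacc_coverage * vacc_efficacy
    for _ in range(n_days):
        if I == 0:
            break
        p_inf = 1.0 - np.exp(-beta * susceptibility * I / n_premises)
        new_inf = rng.binomial(S, p_inf)
        new_rem = rng.binomial(I, gamma)   # culled/recovered premises
        S, I, R = S - new_inf, I + new_inf - new_rem, R + new_rem
    return R  # total premises affected

# Compare outcome distributions with and without vaccination,
# mirroring the study's counterfactual design.
runs = 200
no_vacc = [simulate_outbreak(seed=i) for i in range(runs)]
vacc = [simulate_outbreak(vacc_coverage=0.9, seed=i) for i in range(runs)]
print(f"mean affected premises: no vaccination {np.mean(no_vacc):.0f}, "
      f"with vaccination {np.mean(vacc):.0f}")
```

Running many stochastic replicates and comparing the resulting distributions, rather than single runs, is what allows statements such as "only the largest 10% of simulated outbreaks approximated reality".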
Abstract:
Land and water management in semi-arid regions requires detailed information on precipitation distribution, including extremes and changes therein. Such information is often lacking. This paper describes statistics of mean and extreme precipitation in a unique data set from the Mount Kenya region, encompassing around 50 stations with at least 30 years of data. We describe the data set, including quality control procedures and statistical break detection. Trends in mean precipitation and in extreme indices calculated from these data for individual rainy seasons are compared with corresponding trends in reanalysis products. From 1979 to 2011, mean precipitation decreased at 75% of the stations during the ‘long rains’ (March to May) and increased at 70% of the stations during the ‘short rains’ (October to December). Corresponding trends are found in the number of heavy precipitation days and in the maximum consecutive 5-day precipitation. Conversely, an increase in consecutive dry days within both main rainy seasons is found. However, trends are statistically significant in only a very few cases. Reanalysis data sets agree with observations with respect to interannual variability, while correlations are considerably lower for monthly deviations (ratios) from the mean annual cycle. While some products reproduce the rainfall climatology well and some the spatial trend pattern, no product reproduces both.
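The extreme indices mentioned above (heavy precipitation days, maximum consecutive 5-day precipitation, consecutive dry days) follow standard ETCCDI-style definitions. Below is a minimal Python sketch of how they can be computed from a daily station series; the wet-day (1 mm) and heavy-day (10 mm) thresholds are the usual defaults and are assumptions here, not values stated in the abstract.

```python
import numpy as np

def extreme_indices(daily_precip_mm):
    """Compute three ETCCDI-style indices from a daily precipitation series."""
    p = np.asarray(daily_precip_mm, dtype=float)
    # R10mm: number of heavy precipitation days (>= 10 mm)
    r10 = int((p >= 10.0).sum())
    # Rx5day: maximum consecutive 5-day precipitation total
    rx5 = max(p[i:i + 5].sum() for i in range(len(p) - 4))
    # CDD: maximum number of consecutive dry days (< 1 mm)
    cdd = run = 0
    for x in p:
        run = run + 1 if x < 1.0 else 0
        cdd = max(cdd, run)
    return {"R10mm": r10, "Rx5day": rx5, "CDD": cdd}

rng = np.random.default_rng(0)
season = rng.gamma(shape=0.4, scale=8.0, size=92)  # synthetic 'long rains' season
print(extreme_indices(season))
```

Computing such indices per rainy season, then fitting a trend across years, is the standard route to the seasonal trend statements quoted in the abstract.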
Abstract:
Coronary artery disease (CAD) is a multifactorial disease process involving behavioral, inflammatory, clinical, thrombotic, and genetic components. Previous epidemiologic studies focused on identifying behavioral and demographic risk factors of CAD, but none focused on platelets. The current platelet literature lacks data on the effects of platelet function and platelet receptor polymorphisms on CAD. This case-control analysis addressed these issues by analyzing data collected for a previous study. Cases were individuals who had undergone CABG and thus had been diagnosed with CAD, while the controls were volunteers presumed to be CAD free. The platelet function variables analyzed included fibrinogen, von Willebrand factor activity (VWF), shear-induced platelet aggregation (SIPA), sCD40L, and mean platelet volume; the platelet polymorphisms studied included PIA, α2 807, Ko, Kozak, and VNTR. Univariate analysis found fibrinogen, VWF, SIPA, and PIA to be independent risk factors for CAD. Logistic regression was used to build a predictive model for CAD using the platelet function and platelet polymorphism data, adjusted for age, sex, race, and current smoking status. A model containing only platelet polymorphisms and their respective receptor densities found polymorphisms within GPIbα to be associated with CAD, yielding an 86% increase in risk (OR 1.86, 95% CI 0.97–3.55) with the presence of at least one polymorphism in Ko, Kozak, or VNTR. Another model included both platelet function and platelet polymorphism data. Fibrinogen, the receptor density of GPIbα, and the polymorphism in GPIa-IIa (α2 807) were all associated with CAD, with odds ratios of 1.10 per 10 mg/dl increase in fibrinogen, 1.04 per 1 MFI increase in GPIbα receptor density, and 2.30 for GPIa-IIa, respectively. In addition, risk estimates and 99% confidence intervals adjusted for race were calculated to determine whether the presence of a platelet receptor polymorphism was associated with CAD. The results were as follows: PIA (1.64, 0.74–3.65); α2 807 (1.35, 0.77–2.37); Ko (1.71, 0.70–4.16); Kozak (1.17, 0.54–2.52); and VNTR (1.24, 0.52–2.91). Although not statistically significant, all platelet polymorphisms were associated with an increased risk of CAD. These exploratory findings indicate that platelets do appear to have a role in atherosclerosis and that anti-platelet drugs targeting GPIa-IIa and GPIbα may be better treatment candidates for individuals with CAD.
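As a sketch of the kind of adjusted logistic regression model described above, with odds ratios expressed per 10 mg/dl of fibrinogen, the following Python example uses statsmodels on synthetic data. All variable names and values are hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data frame; columns are illustrative stand-ins for the
# study's case status, platelet function, and adjustment covariates.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "cad": rng.integers(0, 2, n),             # 1 = CABG case, 0 = control
    "fibrinogen": rng.normal(300, 60, n),     # mg/dl
    "gp1ba_density": rng.normal(50, 10, n),   # receptor density, MFI
    "age": rng.normal(60, 10, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})

X = sm.add_constant(df[["fibrinogen", "gp1ba_density", "age", "male", "smoker"]])
fit = sm.Logit(df["cad"], X).fit(disp=0)

# An OR "per 10 mg/dl" is obtained by scaling the coefficient before
# exponentiating, which is how the abstract's 1.10 should be read.
or_per_10 = np.exp(10 * fit.params["fibrinogen"])
print(fit.summary2().tables[1])
print(f"OR per 10 mg/dl fibrinogen: {or_per_10:.2f}")
```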
Abstract:
With substance abuse treatment expanding in prisons and jails, understanding how behavior change interacts with a restricted setting becomes more essential. The Transtheoretical Model (TTM) has been used to understand intentional behavior change in unrestricted settings; however, evidence indicates that restrictive settings can affect the measurement and structure of the TTM constructs. The present study examined data from problem drinkers at baseline and end of treatment from three studies: (1) Project CARE (n = 187) recruited inmates from a large county jail; (2) Project Check-In (n = 116) recruited inmates from a state prison; (3) Project MATCH, a large multi-site alcohol study, had two recruitment arms, aftercare (n = 724 pre-treatment and 650 post-treatment) and outpatient (n = 912 pre-treatment and 844 post-treatment). Cross-sectional data were analyzed using structural equation modeling (SEM) to test for non-invariance of measures of the TTM constructs (readiness, confidence, temptation, and processes of change) across restricted and unrestricted settings. Two restricted groups (jail and aftercare) and one unrestricted group (outpatient) entering treatment, and one restricted group (prison) and two unrestricted groups (aftercare and outpatient) at end of treatment, were contrasted. In addition, TTM end-of-treatment profiles were tested as predictors of 12-month drinking outcomes (profile analysis). Although SEM did not indicate structural differences in the overall TTM construct model across setting types, there were factor structure differences in the confidence and temptation constructs at pre-treatment and in the factor structure of the behavioral processes at end of treatment. For pre-treatment temptation and confidence, differences were found in the social-situations factor loadings and in the variance of the confidence and temptation latent factors. For the end-of-treatment behavioral processes, differences across the restricted and unrestricted settings were identified in the counter-conditioning and stimulus control factor loadings. The TTM end-of-treatment profiles were not predictive of drinking outcomes in the prison sample. Both pre- and post-treatment differences in structure across setting types involved constructs operationalized with behaviors that are limited for those in restricted settings. These studies suggest that the TTM is a viable model for explicating addictive behavior change in restricted settings, but they call for modification of subscale items that refer to specific behaviors and for caution in interpreting mean differences across setting types for problem drinkers.
Abstract:
Random Forests™ is reported to be one of the most accurate classification algorithms in complex data analysis. It shows excellent performance even when most predictors are noisy and the number of variables is much larger than the number of observations. In this thesis, Random Forests was applied to a large-scale lung cancer case-control study. A novel way of automatically selecting prognostic factors was proposed, and synthetic positive controls were used to validate the Random Forests method. Throughout this study we showed that Random Forests can deal with a large number of weak input variables without overfitting and can account for non-additive interactions among input variables. Random Forests can also be used for variable selection without being adversely affected by collinearities. Random Forests can handle large-scale data sets without rigorous data preprocessing and has a robust variable importance ranking measure. We propose a novel variable selection method in the context of Random Forests that uses the data noise level as the cut-off value to determine the subset of important predictors. This new approach enhanced the ability of the Random Forests algorithm to automatically identify important predictors in complex data. The cut-off value can also be adjusted based on the results of the synthetic positive control experiments. When the data set had a high variables-to-observations ratio, Random Forests complemented the established logistic regression. This study suggests that Random Forests is recommended for such high-dimensional data: one can use Random Forests to select the important variables and then use logistic regression, or Random Forests itself, to estimate the effect sizes of the predictors and to classify new observations. We also found that the mean decrease in accuracy is a more reliable variable ranking measure than the mean decrease in Gini impurity.
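The noise-level cut-off idea can be sketched in a few lines of Python with scikit-learn: append pure-noise columns to the data, and keep only real predictors whose importance exceeds the best noise column. The thesis's exact procedure may differ; this uses permutation importance, the analogue of the mean decrease in accuracy favored above.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Data with a few informative predictors and many weak/noisy ones
X, y = make_classification(n_samples=400, n_features=50, n_informative=5,
                           n_redundant=5, random_state=0)

# Append pure-noise "synthetic control" columns; their importance
# estimates the data noise level and serves as the selection cut-off.
rng = np.random.default_rng(0)
X_aug = np.hstack([X, rng.normal(size=(X.shape[0], 10))])

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_aug, y)
imp = permutation_importance(rf, X_aug, y, n_repeats=10, random_state=0)
mean_imp = imp.importances_mean

noise_cutoff = mean_imp[X.shape[1]:].max()  # best importance among noise columns
selected = np.where(mean_imp[:X.shape[1]] > noise_cutoff)[0]
print(f"cut-off = {noise_cutoff:.4f}, selected predictors: {selected}")
```

The selected subset can then be passed to a logistic regression for effect-size estimation, as the abstract recommends for high-dimensional data.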
Abstract:
Rabies remains a significant problem in much of the developing world, where canine rabies is not well controlled and the bite of an infected dog is the most common means of transmission. The Philippines continues to report several hundred cases of human rabies every year, and many more cases go undetected. In recent years, the province of Bohol has been targeted by the Philippine government and the World Health Organization for a rabies eradication program. The primary objective of this dissertation research was to describe factors associated with dog vaccination coverage and with knowledge, attitudes, and practices regarding rabies among households in Bohol, Philippines. Utilizing a cross-sectional cluster survey design, we sampled 460 households and 541 dogs residing within dog-owning households. Multivariate linear regression was used to examine potential associations between knowledge, attitudes, and practices (KAP) scores and variables of interest. Forty-six percent of households knew that rabies is spread through the bite of an infected dog. The mean knowledge score was 8.36 (SD ± 3.4; range 1–24). We found that having known someone with rabies was significantly associated with an almost one-point increase in the knowledge score (β = 0.88; p = 0.02). The mean attitudes score was 5.65 (SD ± 0.63; range 2–6), and the mean practices score was 7.07 (SD ± 1.7; range 2–9). Both the attitudes score and the practices score were positively and significantly associated with the knowledge score only and with no other covariates. Multivariate logistic regression was used to examine associations between dog vaccination coverage and variables of interest. Approximately 71% of owned dogs in Bohol were reported as vaccinated at some time during their lives. We found that a dog's age was significantly associated with vaccination, and the odds of vaccination increased linearly with age. We also found that dogs had approximately twice the odds of being vaccinated if they were confined both day and night to the household premises or if the owner was employed; however, these results were only marginally significant (p = 0.07) in the multivariate model. Finally, a systematic review was conducted on canine rabies vaccination and dog population demographics in the developing world. We found few studies on this topic, especially in countries where the burden of rabies is greatest. Overall, dog ownership is high; dogs are quite young and do not live very long due to disease and accidents. The biggest deterrent to vaccination is the rapid dog population turnover. It is our hope that this work will be used to improve dog rabies vaccination programs around the world and save lives, both human and canine.
Abstract:
Background: Hypertension and diabetes are public health and economic concerns in the United States. The utilization of medical home concepts increases the receipt of preventive services, but does it also increase adherence to treatments? This study examined the effect of patient-centered medical home technologies, such as the electronic health record, clinical decision support systems, and web-based care management, in improving health outcomes related to hypertension and diabetes. Methods: A systematic review of the literature used a best-evidence synthesis approach to address the general question "Do patient-centered medical home technologies have an effect on diabetes and hypertension treatment?" This was followed by an evaluation of specific examples of the technologies utilized, such as computer-assisted recommendations and web-based care management provided through the patient's electronic health record. EBSCOhost, Ovid, and Google Scholar were the databases used to conduct the literature search. Results: The initial search identified over 25 studies, based on content and quality, that implemented technology interventions to improve communication between provider and patient. After further assessing the articles for risk of bias and study design, 13 randomized controlled studies were chosen. All of the chosen studies were conducted in various primary care settings, in both private practices and hospitals, between 2000 and 2007. Sample sizes ranged from 42 to 2,924 participants, and mean ages ranged from 56 to 71 years. The percentage of women ranged from 1% to 78%. Over one-third of the studies did not report the racial composition of the participants; among the seven studies that did report ethnic composition, 64% of the intervention participants were White. All of the studies utilized some type of web-based or computer-based communication to manage hypertension or diabetes care. Findings on outcomes were mixed, with nine of the 13 studies showing no significant effect on the outcomes examined and four showing a significant, positive impact on health outcomes related to hypertension or diabetes. Conclusion: Although the technologies improved patient and provider satisfaction, results for outcome measures such as blood pressure control and glucose control were inconclusive. Further research is needed with ethnically and socioeconomically diverse populations to investigate the role of patient-centered technologies in hypertension and diabetes control, and to investigate the effects of innovative medical home technologies that can be used by both patients and providers to increase the quality of communication concerning adherence to treatments.
Abstract:
OBJECTIVE. To determine the effectiveness of active surveillance cultures and associated infection control practices on the incidence of methicillin-resistant Staphylococcus aureus (MRSA) in the acute care setting. DESIGN. A historical analysis of existing clinical data utilizing an interrupted time series design. SETTING AND PARTICIPANTS. Patients admitted to a 260-bed tertiary care facility in Houston, TX from January 2005 through December 2010. INTERVENTION. Infection control practices, including enhanced barrier precautions, compulsive hand hygiene, disinfection and environmental cleaning, and executive ownership and education, were simultaneously introduced during a 5-month intervention implementation period culminating in the implementation of active surveillance screening. Beginning June 2007, all high-risk patients were cultured for MRSA nasal carriage within 48 hours of admission. Segmented Poisson regression was used to test the significance of the difference in incidence of healthcare-associated MRSA between the 29-month pre-intervention period and the 43-month post-intervention period. RESULTS. A total of 9,957 of 11,095 high-risk patients (89.7%) were screened for MRSA carriage during the intervention period. Active surveillance cultures identified 1,330 MRSA-positive patients (13.4%), contributing to an admission prevalence of 17.5% in high-risk patients. The mean rate of healthcare-associated MRSA infection and colonization decreased from 1.1 per 1,000 patient-days in the pre-intervention period to 0.36 per 1,000 patient-days in the post-intervention period (P < 0.001). The intervention and the percentage of S. aureus isolates susceptible to oxacillin were both statistically significantly associated with the incidence of MRSA infection and colonization (IRR = 0.50, 95% CI 0.31–0.80 and IRR = 0.004, 95% CI 0.00003–0.40, respectively). CONCLUSIONS. Aggressively targeting patients at high risk of MRSA colonization with active surveillance cultures and associated infection control practices, as part of a multifaceted, hospital-wide intervention, is effective in reducing the incidence of healthcare-associated MRSA.
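A segmented Poisson regression for an interrupted time series of this kind can be sketched in Python with statsmodels: a pre/post level-change term, a slope-change term, and log patient-days as the exposure offset. The data below are synthetic and all coefficients illustrative; only the 29-month/43-month split mirrors the study design.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic monthly counts over 72 months (29 pre, 43 post)
rng = np.random.default_rng(0)
months = np.arange(72)
post = (months >= 29).astype(int)
patient_days = rng.integers(6000, 8000, size=len(months))
rate = np.exp(np.log(1.1 / 1000) - 0.7 * post)  # illustrative level drop
cases = rng.poisson(rate * patient_days)

df = pd.DataFrame({"cases": cases, "month": months, "post": post,
                   "patient_days": patient_days})
df["months_since_intervention"] = np.maximum(0, months - 29)

# Level-change (post) and slope-change terms, with log(patient-days)
# as the offset so coefficients are on the rate-per-patient-day scale.
model = smf.glm("cases ~ month + post + months_since_intervention", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["patient_days"])).fit()
print(np.exp(model.params))  # exponentiated coefficients are IRRs
```

Exponentiating the `post` coefficient gives the incidence rate ratio comparing post- and pre-intervention periods, the same quantity as the IRR = 0.50 reported above.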
Abstract:
Geostrophic surface velocities can be derived from the gradients of the mean dynamic topography, the difference between the mean sea surface and the geoid. Therefore, independently observed mean dynamic topography data are valuable input parameters and constraints for ocean circulation models. For a successful fit to observational dynamic topography data, not only the mean dynamic topography on the particular ocean model grid is required, but also information about its inverse covariance matrix. The calculation of the mean dynamic topography from satellite-based gravity field models and altimetric sea surface height measurements, however, is not straightforward. For this purpose, we previously developed an integrated approach to combining these two different observation groups in a consistent way without using the common filter approaches (Becker et al. in J Geodyn 59(60):99-110, 2012, doi:10.1016/j.jog.2011.07.0069; Becker in Konsistente Kombination von Schwerefeld, Altimetrie und hydrographischen Daten zur Modellierung der dynamischen Ozeantopographie, 2012, http://nbn-resolving.de/nbn:de:hbz:5n-29199). Within this combination method, the full spectral range of the observations is considered. Further, it allows the direct determination of the normal equations (i.e., the inverse of the error covariance matrix) of the mean dynamic topography on arbitrary grids, which is one of the requirements for ocean data assimilation. In this paper, we report progress through selection and improved processing of altimetric data sets. We focus on the preprocessing steps for along-track altimetry data from Jason-1 and Envisat to obtain a mean sea surface profile. During this procedure, a rigorous variance propagation is accomplished, so that, for the first time, the full covariance matrix of the mean sea surface is available. The combination of the mean profile and a combined GRACE/GOCE gravity field model yields a mean dynamic topography model for the North Atlantic Ocean that is characterized by a defined set of assumptions. We show that including the geodetically derived mean dynamic topography with its full error structure in a 3D stationary inverse ocean model improves modeled oceanographic features over previous estimates.
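To make the basic quantities concrete, here is a toy Python sketch (not the authors' combination method) of the relations the abstract rests on: the mean dynamic topography as mean sea surface minus geoid, covariance propagation under an assumed independence of the two error sources, the normal-equation form N = C⁻¹, and geostrophic velocity from the topography gradient. All numbers are illustrative.

```python
import numpy as np

# Toy 1-D profile: MDT = mean sea surface - geoid height
n = 5
mss = np.array([1.02, 1.05, 1.10, 1.08, 1.03])    # mean sea surface [m]
geoid = np.array([0.50, 0.52, 0.55, 0.57, 0.54])  # geoid height [m]
cov_mss = 0.0004 * np.eye(n)                      # illustrative covariances
cov_geoid = 0.0009 * np.eye(n)

mdt = mss - geoid
# Assuming independent altimetric and gravimetric errors, the MDT
# covariance is the sum of the two input covariances.
cov_mdt = cov_mss + cov_geoid

# Normal-equation form required for ocean data assimilation: N = C^{-1}
N = np.linalg.inv(cov_mdt)

# Geostrophic surface velocity from the MDT gradient (f-plane, 1-D section)
g, f, dx = 9.81, 1.0e-4, 50_000.0  # gravity, Coriolis parameter, grid step [m]
v = (g / f) * np.gradient(mdt, dx)
print(mdt, v)
```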
Abstract:
Extreme weather events can have negative impacts on species survival and community structure when they surpass lethal thresholds. Extreme winter warming events in the Arctic rapidly melt snow and expose ecosystems to unseasonably warm air (2-10 °C for 2-14 days), but the return of cold winter climate then exposes the ecosystem to lower temperatures because the insulating snow has been lost. Soil animals, which play an integral part in soil processes, may be very susceptible to such events, depending on the intensity of soil warming and the low temperatures following these events. We simulated week-long extreme winter warming events, using infrared heating lamps alone or in combination with soil warming cables, for two consecutive years in a sub-Arctic dwarf shrub heathland. Minimum temperatures were lower and freeze-thaw cycles were 2-11 times more frequent in treatment plots than in control plots. Following the second event, Acari populations decreased by 39%, driven primarily by declines in Prostigmata (69%) and mesostigmatid nymphs (74%). In the canopy-with-soil-warming plots, a community-weighted shift in vertical stratification occurred, from dominance of smaller soil-dwelling (eu-edaphic) Collembola species to dominance of larger litter-dwelling (hemi-edaphic) species, compared with controls. The groups most susceptible to these winter warming events were the smallest individuals (Prostigmata and eu-edaphic Collembola). This was not apparent from abundance data at the Collembola taxon level, indicating that life forms and species traits play a major role in community assembly following extreme events. The observed shift in the soil community can cascade down to the microflora, affecting plant productivity and mineralization rates. Short-term extreme weather events thus have the potential to shift community composition through trait composition, with potentially large consequences for ecosystem development.
Resumo:
Las "orugas defoliadoras" afectan la producción del cultivo de soja, sobre todo en años secos y con altas temperaturas que favorecen su desarrollo. El objetivo del presente trabajo fue evaluar la eficiencia de control de insecticidas neurotóxicos e IGRs sobre "orugas defoliadoras" en soja. Se realizaron ensayos en lotes comerciales en tres localidades de la provincia de Córdoba en las campañas agrícolas 2008/09 y 2009/10, bajo un diseño de bloques al azar, con seis tratamientos y tres repeticiones. Los tratamientos fueron: T1: Clorpirifos (384 g p.a.ha-1), T2: Cipermetrina (37,5 g p.a.ha-1), T3: Lufenuron+Profenofos (15 + 150 g p.a.ha-1), T4: Metoxifenocide (28,8 g p.a.ha-1), T5: Novaluron (10 g p.a.ha-1) y T6: Testigo. El tamaño de las parcelas fue de 12 surcos de 10 m de largo distanciados a 0,52 m. La aplicación se realizó con una mochila provista de boquillas de cono hueco (40 gotas.cm-2), cuando la plaga alcanzó el umbral de daño económico. En cada parcela se tomaron cinco muestras a los 0, 2, 7 y 14 días después de la aplicación (DDA) utilizando el paño vertical, identificando y cuantificando las orugas vivas mayores a 1,5 cm. A los 14 DDA se extrajeron 30 folíolos por parcela (estrato medio y superior de la planta) y se determinó el porcentaje de defoliación utilizando el software WinFolia Reg. 2004. Se estimó el rendimiento sobre 5 muestras de 1 m2 en cada parcela y se realizó ANOVA y test de comparación de medias LSD de Fisher. El Clorpirifos mostró el mayor poder de volteo y el Metoxifenocide la mayor eficiencia a los 7 DDA. En general los IGRs mostraron mayor poder residual.
Abstract:
The Laurichard active rock glacier is the permafrost-related landform with the longest record of monitoring in France, including an annual geodetic survey, repeated geoelectrical campaigns from 1979 onwards, and continuous recording of ground temperature since 2003. These data were used to examine changes in creep rates and internal structure from 1986 to 2006. The control that climatic variables exert on rock glacier kinematics was investigated over three time scales. Between the 1980s and the early 2000s, the main observed changes were a general increase in surface velocity and a decrease in internal resistivity. At the multi-year scale, the high correlation between surface movement and snow thickness in the preceding December appears to confirm the importance of snow cover conditions in early winter through their influence on the ground thermal regime. A comparison of surface velocities, regional climatic data sets, and ground sub-surface temperatures over six years suggests a strong relation between rock glacier deformation and ground temperature, as well as a role for liquid water from the melt of a thick snow cover. Finally, the unusual surface lowering that accompanied peak velocities in 2004 may be due to a general thaw of the top of the permafrost, probably caused both by two successive snowy winters and by high energy inputs during the warm summer of 2003.
Abstract:
Nutrient supply in the area off Northwest Africa is mainly regulated by two processes: coastal upwelling and deposition of Saharan dust. In the present study, both processes were analyzed and evaluated by different methods, including cross-correlation, multiple correlation, and event statistics, using remotely sensed proxies for the period from 2000 to 2008 to investigate their influence on the marine environment. The remotely sensed chlorophyll-a concentration was used as a proxy for the phytoplankton biomass stimulated by nutrient supply into the euphotic zone from deeper water layers and from the atmosphere. Satellite-derived alongshore wind stress and sea-surface temperature were applied as proxies for the strength and reflection of coastal upwelling processes. The westward wind and the dust component of the aerosol optical depth describe the transport direction of atmospheric dust and the atmospheric dust column load. Alongshore wind stress and the upwelling processes it induces were most significantly responsible for the surface chlorophyll-a variability, accounting for about 24% of the total variance, mainly in winter and spring due to the strong north-easterly trade winds. The remotely sensed proxies allowed determination of time lags between the biological response and its forcing processes. A lag of up to 16 days between alongshore wind stress and the surface chlorophyll-a response was determined in northern winter and spring. Although input of atmospheric iron by dust storms can stimulate new phytoplankton production in the study area, only 5% of the surface chlorophyll-a variability could be ascribed to the dust component of the aerosol optical depth. All strong desert dust storms in the period from 2000 to 2008 were identified by event statistics, and the 57 strong storms were studied in relation to their biological response. Six events were clearly detected in which an increase in chlorophyll-a was caused by Saharan dust input and not by coastal upwelling processes; time lags of less than 8 days, 8 days, and 16 days were determined. After dust storms in which the dust component of the aerosol optical depth reached up to 0.9, an increase in surface chlorophyll-a concentration of up to 2.4 mg m**-3 was observed.
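The lag determination described above amounts to a lagged cross-correlation between a forcing series (e.g., alongshore wind stress) and the chlorophyll-a response. Below is a minimal Python sketch on synthetic series; the two-step (16-day, with 8-day composites) lag built into the fake data is an assumption for illustration only.

```python
import numpy as np

def lagged_correlation(forcing, response, max_lag=4):
    """Pearson correlation of `response` against `forcing` shifted by
    0..max_lag steps (e.g., 8-day composites), to locate the lag of the
    strongest biological response."""
    f = (forcing - forcing.mean()) / forcing.std()
    r = (response - response.mean()) / response.std()
    out = {}
    for lag in range(max_lag + 1):
        if lag == 0:
            out[lag] = float(np.mean(f * r))
        else:
            out[lag] = float(np.mean(f[:-lag] * r[lag:]))
    return out

rng = np.random.default_rng(2)
wind = rng.normal(size=200)                  # alongshore wind stress proxy
chl = np.roll(wind, 2) * 0.5 + rng.normal(scale=0.9, size=200)  # lagged response
corrs = lagged_correlation(wind, chl)
best = max(corrs, key=corrs.get)
print(corrs, f"-> strongest response at lag {best} steps")
```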