971 results for Statistical modelling


Relevance:

60.00%

Publisher:

Abstract:

This review considers microbial inocula used in in vitro systems from the perspective of their ability to degrade or ferment a particular substrate, rather than the microbial species they contain. By necessity, this required an examination of the bacterial, protozoal and fungal populations of the rumen and hindgut with respect to factors influencing their activity. The potential to manipulate these populations through diet or sampling time is examined, as are inoculum preparation and level. The main alternatives to fresh rumen fluid (i.e., caecal digesta or faeces) are discussed with respect to end-point degradabilities and fermentation dynamics. Although rumen contents obtained from donor animals at slaughter offer possibilities, the requirement to store the material and its subsequent loss of activity are limitations. Statistical modelling of data, although still requiring a good deal of developmental work, may offer an alternative approach. Finally, with respect to the range of in vitro methodologies and equipment employed, it is suggested that a degree of uniformity could be obtained through generation of a set of guidelines relating to the host animal, sampling technique and inoculum preparation. It was considered unlikely that any particular system would be accepted as the 'standard' procedure. However, before any protocol can be adopted, additional data are required (e.g., a method to assess inoculum 'quality' with respect to its fermentative and/or degradative activity), preparation/inoculation techniques need to be refined, and a methodology to store inocula without loss of efficacy must be developed. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

60.00%

Publisher:

Abstract:

The conventional method for assessing acute oral toxicity (OECD Test Guideline 401) was designed to identify the median lethal dose (LD50), using the death of animals as an endpoint. Introduced as an alternative method (OECD Test Guideline 420), the Fixed Dose Procedure (FDP) relies on the observation of clear signs of toxicity, uses fewer animals and causes less suffering. More recently, the Acute Toxic Class method and the Up-and-Down Procedure have also been adopted as OECD test guidelines. Both of these methods also use fewer animals than the conventional method, although they still use death as an endpoint. Each of the three new methods incorporates a sequential dosing procedure, which results in increased efficiency. In 1999, with a view to replacing OECD Test Guideline 401, the OECD requested that the three new test guidelines be updated. This was to bring them in line with the regulatory needs of all OECD Member Countries, provide further reductions in the number of animals used, and introduce refinements to reduce the pain and distress experienced by the animals. This paper describes a statistical modelling approach for the evaluation of acute oral toxicity tests, by using the revised FDP for illustration. Opportunities for further design improvements are discussed.

Relevance:

60.00%

Publisher:

Abstract:

There have been few rigorous assessments of the effectiveness of participatory processes for natural resource management. In Bangladesh an approach known as Participatory Action Plan Development (PAPD) has been developed and applied. By combining problem identification and solution analysis by separate stakeholder groups with plenary sessions, it is claimed to result in consensus and more effective community-based management. Methodological issues in assessing the effectiveness of such development are discussed and good practice is illustrated. Under the same project there were sites where PAPD had been used and others where it had not, so a comparative assessment could be made. However, for an appropriate assessment it is important to identify clear testable hypotheses regarding the expected benefits, appropriate measures, and other factors which may affect or confound the outcome. The paper illustrates how participatory assessment involving both individual opinions and focus groups can be systematically recorded, quantified and used with other data in statistical analysis. By using statistical modelling methods at an appropriate level of aggregation and controlling for other factors, benefits from PAPD were found to be significant. The systematic approaches and practices recommended from this example can be applied in similar situations to test the effectiveness of participatory processes using participatory assessments.

Relevance:

60.00%

Publisher:

Abstract:

This study used the novel approach of statistical modelling to investigate the control of the hypothalamic-pituitary-adrenal (HPA) axis and to quantify temporal relationships between hormones. Two experimental paradigms were chosen, insulin-induced hypoglycaemia and 2 h transport, to assess differences in control between noncognitive and cognitive stimuli. Vasopressin and corticotropin-releasing hormone (CRH) were measured in hypophysial portal plasma, and adrenocorticotropin hormone (ACTH) and cortisol in jugular plasma, of conscious sheep, and deconvolution analysis was used to calculate secretory rates before modelling. During hypoglycaemia, the relationship between plasma glucose and vasopressin or CRH was best described by log(10) transforming the variables (i.e. a positive power-curve relationship). A negative-feedback relationship with log(10) cortisol concentration 2 h previously was detected. Analysis of the 'transport' stimulus suggested that the strength of the perceived stimulus decreased over time after accounting for cortisol facilitation and negative-feedback. The time courses of the vasopressin and CRH responses to each stimulus were different. However, at the pituitary level, the data suggested that log(10) ACTH secretion rate was related to log(10) vasopressin and CRH concentrations with very similar regression coefficients and an identical ratio of actions (2.3 : 1) for both stimuli. Negative-feedback effects of similar magnitude for log(10) cortisol at -110 min (hypoglycaemia) or -40 min (transport) were detected, and both models contained a stimulatory relationship with cortisol at 0 min (facilitation). At the adrenal gland level, cortisol secretory rates were related to simultaneously measured untransformed ACTH concentrations, but the regression coefficient for the hypoglycaemia model was 2.5-fold greater than for transport. No individual sustained maximum cortisol secretion for longer than 20 min during hypoglycaemia or 40 min during transport.
These unique models demonstrate that corticosteroid negative-feedback is a significant control mechanism at both the pituitary and the hypothalamus. The amplitude of the HPA response may be related to stimulus intensity and corticosteroid negative-feedback, while its duration depends on feedback alone.
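The pituitary-level model described in this abstract can be illustrated with a lagged log-log regression. The sketch below is hypothetical throughout: the sampling interval, coefficients and noise level are invented so that the AVP:CRH ratio of actions comes out near the 2.3:1 reported above, and the delayed cortisol term plays the role of negative feedback.

```python
import numpy as np

# Hypothetical 10-min samples; the model mirrors the abstract's form:
# log10(ACTH secretion) ~ log10(AVP) + log10(CRH) + lagged log10(cortisol).
rng = np.random.default_rng(3)
n = 200
log_avp = rng.normal(1.0, 0.3, n)
log_crh = rng.normal(0.5, 0.3, n)
log_cort = rng.normal(1.5, 0.3, n)

lag = 12  # a 2 h feedback delay = 12 samples at 10 min (invented)
X = np.column_stack([
    np.ones(n - lag),
    log_avp[lag:],
    log_crh[lag:],
    log_cort[:-lag],          # delayed negative feedback term
])
true = np.array([0.5, 0.9, 0.4, -0.6])   # AVP:CRH action ratio ~2.3:1
y = X @ true + rng.normal(0, 0.1, n - lag)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
ratio = coef[1] / coef[2]     # estimated ratio of AVP to CRH actions
```

With synthetic data of this size, the fitted feedback coefficient is negative and the action ratio is recovered close to its true value.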

Relevance:

60.00%

Publisher:

Abstract:

The quantification of uncertainty is an increasingly popular topic, with clear importance for climate change policy. However, uncertainty assessments are open to a range of interpretations, each of which may lead to a different policy recommendation. In the EQUIP project researchers from the UK climate modelling, statistical modelling, and impacts communities worked together on ‘end-to-end’ uncertainty assessments of climate change and its impacts. Here, we use an experiment in peer review amongst project members to assess variation in the assessment of uncertainties between EQUIP researchers. We find overall agreement on key sources of uncertainty but a large variation in the assessment of the methods used for uncertainty assessment. Results show that communication aimed at specialists makes the methods used harder to assess. There is also evidence of individual bias, which is partially attributable to disciplinary backgrounds. However, varying views on the methods used to quantify uncertainty did not preclude consensus on the consequential results produced using those methods. Based on our analysis, we make recommendations for developing and presenting statements on climate and its impacts. These include the use of a common uncertainty reporting format in order to make assumptions clear; presentation of results in terms of processes and trade-offs rather than only numerical ranges; and reporting multiple assessments of uncertainty in order to elucidate a more complete picture of impacts and their uncertainties. This in turn implies research should be done by teams of people with a range of backgrounds and time for interaction and discussion, with fewer but more comprehensive outputs in which the range of opinions is recorded.

Relevance:

60.00%

Publisher:

Abstract:

Social networks have gained remarkable attention in the last decade. Accessing social network sites such as Twitter, Facebook, LinkedIn and Google+ through the internet and Web 2.0 technologies has become more affordable. People are becoming more interested in, and more reliant on, social networks for information, news and the opinions of other users on diverse subject matters. This heavy reliance on social network sites causes them to generate massive data characterised by three computational issues, namely size, noise and dynamism. These issues often make social network data very complex to analyse manually, resulting in the pertinent use of computational means of analysing them. Data mining provides a wide range of techniques for detecting useful knowledge from massive datasets, such as trends, patterns and rules [44]. Data mining techniques are used for information retrieval, statistical modelling and machine learning. These techniques employ data pre-processing, data analysis, and data interpretation processes in the course of data analysis. This survey discusses the different data mining techniques used in mining diverse aspects of the social network over decades, going from the historical techniques to the up-to-date models, including our novel technique named TRCM. All the techniques covered in this survey are listed in Table 1, together with the tools employed and the names of their authors.

Relevance:

60.00%

Publisher:

Abstract:

In this paper we assess opinion polls, prediction markets, expert opinion and statistical modelling over a large number of US elections in order to determine which performs better in terms of forecasting outcomes. In line with the existing literature, we bias-correct opinion polls. We consider accuracy, bias and precision over different time horizons before an election, and we conclude that prediction markets appear to provide the most precise forecasts and are similar in terms of bias to opinion polls. We find that our statistical model struggles to provide competitive forecasts, while expert opinion appears to be of value. Finally, we note that the forecast horizon matters: whereas prediction market forecasts tend to improve the nearer an election is, opinion polls appear to perform worse, while expert opinion performs consistently throughout. We thus contribute to the growing literature comparing election forecasts of polls and prediction markets.

Relevance:

60.00%

Publisher:

Abstract:

For the first time, we introduce a class of transformed symmetric models to extend the Box and Cox models to more general symmetric models. The new class of models includes all symmetric continuous distributions with a possible non-linear structure for the mean and enables the fitting of a wide range of models to several data types. The proposed methods offer more flexible alternatives to Box-Cox and other existing procedures. We derive a very simple iterative process for fitting these models by maximum likelihood, whereas a direct unconditional maximization would be more difficult. We give simple formulae to estimate the parameter that indexes the transformation of the response variable and the moments of the original dependent variable, which generalize previously published results. We discuss inference on the model parameters. The usefulness of the new class of models is illustrated in an application to a real dataset.
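As a point of reference for the transformation being generalized here, the original Box-Cox parameter can be estimated by maximum likelihood with SciPy; the skewed response below is synthetic, chosen so that the estimated lambda should be near 0 (the log transformation).

```python
import numpy as np
from scipy import stats

# Synthetic positive, right-skewed response (illustrative only): for
# exactly log-normal data the ML estimate of the Box-Cox parameter
# lambda should be close to 0, i.e. the log transformation.
rng = np.random.default_rng(0)
y = np.exp(rng.normal(loc=1.0, scale=0.5, size=1000))

y_transformed, lam = stats.boxcox(y)   # lambda estimated by maximum likelihood

# The transformed response should be far less skewed than the original.
skew_before = stats.skew(y)
skew_after = stats.skew(y_transformed)
```

The abstract's contribution is to allow the transformed response to follow any symmetric continuous distribution rather than only the normal assumed by `boxcox`.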

Relevance:

60.00%

Publisher:

Abstract:

Climate change is a naturally occurring phenomenon in which the earth's climate goes through cycles of warming and cooling; these changes usually take place incrementally over millennia. Over the past century, there has been an anomalous increase in global temperature, giving rise to accelerated climate change. It is widely accepted that greenhouse gas emissions from human activities, such as industry, have contributed significantly to the increase in global temperatures. The existence and survival of all living organisms is predicated on the ability of the environment in which they live not only to provide conditions for their basic needs but also conditions suitable for growth and reproduction. Unabated climate change threatens the existence of biophysical and ecological systems on a planetary scale. The present study aims to examine the economic impact of climate change on health in Jamaica over the period 2011-2050. To this end, three disease conditions with known climate sensitivity and importance to Jamaican public health were modelled: dengue fever, leptospirosis and gastroenteritis in children under age 5. Historical prevalence data on these diseases were obtained from the Ministry of Health Jamaica, the Caribbean Epidemiology Centre, the Climate Studies Group Mona (University of the West Indies, Mona campus), and the Meteorological Service of Jamaica. The data obtained spanned the twelve-year period 1995-2007. Monthly data were obtained for dengue and gastroenteritis, while for leptospirosis the annual number of cases for 1995-2005 was utilized. The two SRES emission scenarios chosen were A2 and B2, with the European Centre Hamburg Model (ECHAM) global climate model used to predict climate variables under these scenarios. A business-as-usual (BAU) scenario was developed using historical disease data for the period 2000-2009 (dengue fever and gastroenteritis) and 1995-2005 (leptospirosis) as the reference decades for the respective diseases.
The BAU scenario examined the occurrence of the diseases in the absence of climate change. It assumed that the disease trend would remain unchanged over the projected period and that the number of cases of disease for each decade would be the same as in the reference decade. The model used in the present study utilized predictive empirical statistical modelling to extrapolate the climate/disease relationship in time, in order to estimate the number of climate change-related cases under future climate change scenarios. The study used a Poisson regression model that considered seasonality and lag effects to determine the best-fit model for each disease under consideration. Zhang and others (2008), in their review of climate change and the transmission of vector-borne diseases, found that "besides climatic variables, few of them have included other factors that can affect the transmission of vector-borne disease". Water, sanitation and health expenditure are key determinants of health. In the draft of its second communication to the IPCC, Jamaica noted the vulnerability of public health to climate change, including sanitation and access to water (MSJ/UNDP, 2009). Sanitation, which in its broadest context includes the removal of waste (excreta, solid, or other hazardous waste), is a predictor of vector-borne diseases (e.g. dengue fever), diarrhoeal diseases (such as gastroenteritis) and zoonoses (such as leptospirosis). In conceptualizing the model, an attempt was made to include non-climate predictors of these climate-sensitive diseases: the importance of sanitation and water access to the control of dengue, gastroenteritis and leptospirosis was reflected in the Poisson regression model. The Poisson regression model obtained was then used to predict the number of disease cases into the future (2011-2050) for each emission scenario. After projecting the number of cases, the cost associated with each scenario was calculated using four cost components.
1. Treatment cost (morbidity estimate). The treatment cost for the number of cases was calculated using reference values found in the literature for each condition. The figures were derived from studies of the cost of treatment and represent ambulatory and non-fatal hospitalized care for dengue fever and gastroenteritis. Due to the paucity of published literature on the health care cost associated with leptospirosis, only the cost of diagnosis and antibiotic therapy were included in the calculation.
2. Mortality estimates. Mortality estimates are recorded as case fatality rates. Where local data were available, these were utilized; where they were unavailable, appropriate reference values from the literature were used.
3. Productivity loss. Productivity loss was calculated using a human capital approach, by multiplying the expected number of productive days lost by the caregiver and/or the infected person by GDP per capita per day (US$ 14), at 2008 GDP using 2008 US$ exchange rates.
4. No-option cost. The no-option cost refers to adaptation strategies for the control of dengue fever which are ongoing and already a part of the core functions of the Vector Control Division of the Ministry of Health, Jamaica. An estimated US$ 2.1 million is utilized each year in conducting activities to prevent the post-hurricane spread of vector-borne diseases and diarrhoea. The cost includes public education, fogging, laboratory support, larvicidal activities and surveillance. This no-option cost was converted to per capita estimates, using population estimates for Jamaica up to 2050 obtained from the Statistical Institute of Jamaica (STATIN, 2006) and the assumption of one expected major hurricane per decade. During the decade 2000-2009, Jamaica had average inflation of 10.4% (CIA World Factbook, last updated May 2011). This average decadal inflation rate was applied to the no-option cost, which was inflated by 10% for each successive decade to adjust for changes in inflation over time.
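The core model described above, a Poisson regression with seasonal terms and a lagged climate covariate, can be sketched as follows. All data and coefficients are invented, and the fitter is a generic iteratively reweighted least squares routine, not the study's code.

```python
import numpy as np

def fit_poisson(X, y, n_iter=50):
    """Poisson GLM with log link, fitted by iteratively reweighted
    least squares (Newton-Raphson on the log-likelihood)."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 0.5)       # sensible starting intercept
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        z = X @ beta + (y - mu) / mu       # working response
        W = mu                             # working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(42)
months = np.arange(120)                    # ten years of monthly counts
rain = rng.gamma(2.0, 1.0, 121)            # hypothetical climate covariate
X = np.column_stack([
    np.ones(120),
    np.sin(2 * np.pi * months / 12),       # annual seasonality
    np.cos(2 * np.pi * months / 12),
    rain[:-1],                             # one-month-lagged exposure
])
true_beta = np.array([2.0, 0.5, -0.3, 0.2])
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = fit_poisson(X, y)
```

Extrapolating such a fit to projected climate covariates for 2011-2050 is then a matter of applying the fitted coefficients to the scenario-specific predictor values.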

Relevance:

60.00%

Publisher:

Abstract:

Background: Studies in South-East Asia have suggested that early diagnosis and treatment with artesunate (AS) and mefloquine (MQ) combination therapy may reduce the transmission of Plasmodium falciparum malaria and the progression of MQ resistance. Methods: The effectiveness of a fixed-dose combination of AS and MQ (ASMQ) in reducing malaria transmission was tested in isolated communities of the Jurua valley in the Amazon region. Priority municipalities within the Brazilian Legal Amazon area were selected according to pre-specified criteria. Routine national malaria control programmatic procedures were followed. Existing health structures were reinforced and health care workers were trained to treat with ASMQ all confirmed falciparum malaria cases that matched the inclusion criteria. A local pharmacovigilance structure was implemented. Incidence of malaria and hospitalizations were recorded two years before, during, and after the fixed-dose ASMQ intervention. In total, between July 2006 and December 2008, 23,845 patients received ASMQ. Two statistical modelling approaches were applied to monthly time series of P. falciparum malaria incidence rates, the P. falciparum/Plasmodium vivax infection ratio, and malaria hospital admission rates. All the time series ranged from January 2004 to December 2008, whilst the intervention period spanned July 2006 to December 2008. Results: The ASMQ intervention had a highly significant impact on the mean level of each time series, adjusted for trend and season: 0.34 (95% CI 0.20 - 0.58) for the P. falciparum malaria incidence rates, 0.67 (95% CI 0.50 - 0.89) for the P. falciparum/P. vivax infection ratio, and 0.53 (95% CI 0.41 - 0.69) for the hospital admission rates. There was also a significant change in the seasonal (or monthly) pattern of the time series before and after the intervention, with the elimination of the malaria seasonal peak in the rainy months of the years following the introduction of ASMQ.
No serious adverse events relating to the use of fixed-dose ASMQ were reported. Conclusions: In the remote region of the Jurua valley, early detection of malaria by health care workers and treatment with fixed-dose ASMQ was feasible and efficacious, and significantly reduced the incidence and morbidity of P. falciparum malaria.
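The before/after comparison of mean levels, adjusted for trend and season, can be sketched as a segmented log-linear regression on a synthetic monthly series. The dates mirror the abstract, but the counts and the size of the step change are invented, and this is only one simple stand-in for the two modelling approaches the study actually used.

```python
import numpy as np

# Synthetic monthly case counts, Jan 2004 - Dec 2008 (60 months), with a
# step change at the hypothetical intervention start (July 2006, month 30).
rng = np.random.default_rng(0)
n = 60
t = np.arange(n)
post = (t >= 30).astype(float)                 # intervention indicator
season = np.sin(2 * np.pi * (t % 12) / 12)     # crude seasonal term
cases = rng.poisson(np.exp(3.0 + 0.4 * season - 1.0 * post))

# Log-linear model: intercept, linear trend, season, intervention step.
X = np.column_stack([np.ones(n), t, season, post])
coef, *_ = np.linalg.lstsq(X, np.log(cases + 0.5), rcond=None)
level_change = np.exp(coef[3])   # multiplicative change in mean level
```

Exponentiating the step coefficient gives the adjusted mean-level ratio, the kind of quantity reported as 0.34, 0.67 and 0.53 in the abstract.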

Relevance:

60.00%

Publisher:

Abstract:

In this paper we extend semiparametric mixed linear models with normal errors to elliptical errors, in order to permit distributions with heavier and lighter tails than the normal ones. Penalized likelihood equations are applied to derive the maximum penalized likelihood estimates (MPLEs), which appear to be robust against outlying observations in the sense of the Mahalanobis distance. A reweighted iterative process based on the back-fitting method is proposed for the parameter estimation, and the local influence curvatures are derived under some usual perturbation schemes to study the sensitivity of the MPLEs. Two motivating examples, preliminarily analyzed under normal errors, are reanalyzed considering appropriate elliptical errors. The local influence approach is used to compare the sensitivity of the model estimates.
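The back-fitting idea can be sketched for a partial linear model. For brevity this toy version assumes normal errors and uses a kernel smoother in place of penalized splines, so it is not the authors' MPLE procedure; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)                 # parametric covariate
t = rng.uniform(0, 1, n)               # covariate entering nonparametrically
y = 2.0 * x + np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=n)

def smooth(t, r, bandwidth=0.08):
    """Nadaraya-Watson kernel smoother, standing in for a penalized spline."""
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
    return (w @ r) / w.sum(axis=1)

# Back-fitting: alternate between the parametric and nonparametric parts.
a, f = 0.0, np.zeros(n)
for _ in range(20):
    a = np.sum(x * (y - f)) / np.sum(x * x)   # least squares given f
    f = smooth(t, y - a * x)                  # smooth residuals given a
    f -= f.mean()                             # centre f for identifiability
```

The reweighting of the paper would enter at the least-squares step, with weights derived from the assumed elliptical error density.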

Relevance:

60.00%

Publisher:

Abstract:

Here we present monthly, basin-wide maps of the partial pressure of carbon dioxide (pCO2) for the North Atlantic on a latitude by longitude grid for the years 2004 through 2006 inclusive. The maps have been computed using a neural network technique which reconstructs the non-linear relationships between three biogeochemical parameters and marine pCO2. A self-organizing map (SOM) neural network has been trained using 389 000 triplets of the SeaWiFS/MODIS chlorophyll-a concentration, the NCEP/NCAR reanalysis sea surface temperature, and the FOAM mixed layer depth. The trained SOM was labelled with 137 000 underway pCO2 measurements collected in situ during 2004, 2005 and 2006 in the North Atlantic, spanning the range of 208 to 437 µatm. The root mean square error (RMSE) of the neural network fit to the data is 11.6 µatm, which equals just over 3 per cent of an average pCO2 value in the in situ dataset. The seasonal pCO2 cycle as well as estimates of the interannual variability in the major biogeochemical provinces are presented and discussed. High resolution combined with basin-wide coverage makes the maps a useful tool for several applications, such as the monitoring of basin-wide air-sea CO2 fluxes or the improvement of seasonal and interannual marine CO2 cycles in future model predictions. The method itself is a valuable alternative to traditional statistical modelling techniques used in the geosciences.
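A from-scratch sketch of the SOM technique follows (not the SeaWiFS/NCEP/FOAM pipeline): grid nodes hold weight vectors, and the best-matching node and its grid neighbours are pulled toward each presented sample while the learning rate and neighbourhood radius shrink. In the full method each trained node would then be labelled with the mean of the co-located pCO2 measurements.

```python
import numpy as np

def train_som(data, grid=(5, 5), n_iter=3000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal self-organizing map. Each node of a 2-D grid holds a weight
    vector; the best-matching unit (BMU) and its grid neighbours are moved
    toward each presented sample, with shrinking learning rate and radius."""
    rng = np.random.default_rng(seed)
    h, w = grid
    nodes = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    weights = rng.normal(size=(h * w, data.shape[1]))
    for step in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
        frac = step / n_iter
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        d2 = ((nodes - nodes[bmu]) ** 2).sum(axis=1)   # grid distance to BMU
        neighbourhood = np.exp(-d2 / (2.0 * sigma ** 2))
        weights += lr * neighbourhood[:, None] * (x - weights)
    return weights

# Two well-separated clusters; after training, every sample should lie
# close to some node (small quantization error).
rng = np.random.default_rng(42)
data = np.vstack([
    rng.normal(0.0, 0.3, size=(200, 2)),
    rng.normal(5.0, 0.3, size=(200, 2)),
])
weights = train_som(data)
qe = np.mean(np.min(np.linalg.norm(data[:, None] - weights[None], axis=2), axis=1))
```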

Relevance:

60.00%

Publisher:

Abstract:

Background information: During the late 1970s and the early 1980s, West Germany witnessed a reversal of gender differences in educational attainment, as females began to outperform males. Purpose: The main objective was to analyse which processes were behind the reversal of gender differences in educational attainment after 1945. The theoretical reflections and empirical evidence presented for the US context by DiPrete and Buchmann (Gender-specific trends in the value of education and the emerging gender gap in college completion, Demography 43: 1–24, 2006) and Buchmann, DiPrete, and McDaniel (Gender inequalities in education, Annual Review of Sociology 34: 319–37, 2008) are considered and applied to the West German context. It is suggested that the reversal of gender differences is a consequence of changes in female educational decisions, which are mainly related to labour market opportunities and not, as sometimes assumed, a consequence of a 'boys' crisis'. Sample: Several databases, such as the German General Social Survey, the German Socio-economic Panel and the German Life History Study, are employed for the longitudinal analysis of the educational and occupational careers of birth cohorts born in the twentieth century. Design and methods: Changing patterns of eligibility for university studies are analysed by successive birth cohort and gender. Binary logistic regressions are employed for the statistical modelling of individuals' achievement, educational decisions and likelihood of social mobility – reporting average marginal effects (AME). Results: The empirical results suggest that women's better school achievement, being constant across cohorts, does not contribute to the explanation of the reversal of gender differences in higher education attainment; rather, the increase in the benefits of higher education explains the changing educational decisions of women regarding their transition to higher education.
Conclusions: The outperformance of females compared with males in higher education may have been initiated by several social changes, including the expansion of public employment, the growing demand for highly qualified female workers in welfare and service areas, the increasing returns to women's education and training, and the improved opportunities for combining family and work outside the home. The historical data show that, in terms of (married) women's increased labour market opportunities and female life-cycle labour force participation, the rising rates of women's enrolment in higher education were – among other reasons – partly explained by their rising access to service class positions across birth cohorts, and by the rise in their educational returns in terms of wages and long-term employment.
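The design described above, binary logistic regression reporting average marginal effects, can be sketched as follows; the single covariate and all coefficients are invented and the fitter is a generic Newton-Raphson routine, not the study's analysis.

```python
import numpy as np

def fit_logit(X, y, n_iter=25):
    """Binary logistic regression fitted by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return beta

def average_marginal_effect(X, beta, k):
    """AME of covariate k: the derivative of P(y=1) with respect to x_k,
    averaged over the sample."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    return np.mean(beta[k] * p * (1.0 - p))

rng = np.random.default_rng(5)
n = 2000
x = rng.normal(size=n)                       # e.g. a standardized achievement score
X = np.column_stack([np.ones(n), x])
true_beta = np.array([-1.0, 0.8])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))
beta_hat = fit_logit(X, y)
ame = average_marginal_effect(X, beta_hat, 1)
```

Unlike odds ratios, AMEs are on the probability scale, which is why they are often preferred for comparisons across cohorts and models.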

Relevance:

60.00%

Publisher:

Abstract:

Background: Published birthweight references in Australia do not fully take into account constitutional factors that influence birthweight and therefore may not provide an accurate reference for identifying the infant with abnormal growth. Furthermore, studies in other regions that have derived adjusted (customised) birthweight references have applied untested assumptions in the statistical modelling. Aims: To validate the customised birthweight model and to produce a reference set of coefficients for estimating a customised birthweight that may be useful for maternity care in Australia and for future research. Methods: De-identified data were extracted from the clinical database for all births at the Mater Mother's Hospital, Brisbane, Australia, between January 1997 and June 2005. Births with missing data for the variables under study were excluded. In addition, the following were excluded: multiple pregnancies, births at less than 37 completed weeks' gestation, stillbirths, and major congenital abnormalities. Multivariate analysis was undertaken, and a double cross-validation procedure was used to validate the model. Results: The study of 42 206 births demonstrated that, for statistical purposes, birthweight is normally distributed. Coefficients for the derivation of customised birthweight in an Australian population were developed and the statistical model is demonstrably robust. Conclusions: This study provides empirical data as to the robustness of the model to determine customised birthweight. Further research is required to define where normal physiology ends and pathology begins, and which segments of the population should be included in the construction of a customised birthweight standard.
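A toy version of a customised birthweight standard can be sketched as below: regress birthweight on constitutional covariates, treat the fitted value as the expected weight, and flag infants below the 10th centile of the residual distribution. The covariates, coefficients and units are invented and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000
height = rng.normal(165, 6, n)        # maternal height (cm), hypothetical
parity = rng.integers(0, 4, n)
male = rng.integers(0, 2, n)
gest = rng.normal(280, 8, n)          # gestation (days), term births only
X = np.column_stack([np.ones(n), height, parity, male, gest])
true_coef = np.array([-3000.0, 10.0, 80.0, 120.0, 15.0])
birthweight = X @ true_coef + rng.normal(0, 300, n)

# The customised standard is the regression prediction for each infant.
coef, *_ = np.linalg.lstsq(X, birthweight, rcond=None)
expected = X @ coef
residual = birthweight - expected
# Small-for-gestational-age: below the 10th centile of the residuals.
sga = residual < np.quantile(residual, 0.10)
```

The double cross-validation of the paper would fit the coefficients on one half of the data and assess predictions on the other, then swap the halves.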

Relevance:

60.00%

Publisher:

Abstract:

Ecological regions are increasingly used as a spatial unit for planning and environmental management. It is important to define these regions in a scientifically defensible way, to justify any decisions made on the basis that they are representative of broad environmental assets. The paper describes a methodology and tool to identify cohesive bioregions. The methodology applies an elicitation process to obtain geographical descriptions of bioregions, each of which is transformed into a Normal density estimate on the environmental variables within that region. This prior information is balanced with a data classification of environmental datasets using a Bayesian statistical modelling approach, to objectively map ecological regions. The method is called model-based clustering because it fits a Normal mixture model to the clusters associated with regions, and it addresses uncertainty in environmental datasets arising from overlapping clusters.
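The Normal mixture fitting at the heart of model-based clustering can be sketched with a one-dimensional EM algorithm. The Bayesian prior-balancing step of the paper is omitted for brevity, and the "environmental gradient" data are synthetic.

```python
import numpy as np

def em_gmm(x, k=2, n_iter=100):
    """EM for a 1-D Gaussian (Normal) mixture: the basic engine behind
    model-based clustering, here without any elicited priors."""
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2.0 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update mixture weights, means and variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Two overlapping "regions" along one environmental variable.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300)])
pi, mu, var = em_gmm(x)
```

In the paper's setting, the elicited Normal densities would enter as priors on the component parameters, pulling the fitted mixture toward the expert-described regions.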