972 results for risk modelling
Abstract:
We present the first mathematical model of the transmission dynamics of Schistosoma japonicum. The work extends Barbour's classic model of schistosome transmission. It allows for the mammalian host heterogeneity characteristic of the S. japonicum life cycle, and solves the problem of under-specification of Barbour's model through Chinese data we are collecting on human-bovine transmission in the Poyang Lake area of Jiangxi Province in China. The model predicts that in the lake/marshland areas of the Yangtze River basin: (1) twice-yearly mass chemotherapy of humans is little better than once-yearly mass chemotherapy in reducing human prevalence. Depending on the heterogeneity of prevalence within the population, targeted treatment of high-prevalence groups, with lower overall coverage, can be more effective than mass treatment with higher overall coverage. Treatment confers a short-term benefit only, with prevalence rising to endemic levels once chemotherapy programs are stopped; (2) depending on the relative contributions of bovines and humans, bovine treatment can benefit humans almost as much as human treatment. Like human treatment, bovine treatment confers a short-term benefit. A combination of human and bovine treatment will dramatically reduce human prevalence and maintain the reduction for longer than treatment of a single host, although human prevalence rises once treatment ceases; (3) assuming 75% coverage of bovines, a bovine vaccine which acts on worm fecundity must have about 75% efficacy to reduce the reproduction rate below one and ensure mid-term reduction and long-term elimination of the parasite. Such a vaccination program should be accompanied by an initial period of human treatment to instigate a short-term reduction in prevalence, following which the reduction is enhanced by vaccine effects; (4) if the bovine vaccine is only 45% efficacious (the level of current prototype vaccines) it will lower the endemic prevalence, but will not result in elimination. If it is accompanied by an initial period of human treatment and by a 45% improvement in human sanitation or a 30% reduction in contaminated water contact by humans, elimination is then possible. (C) 2002 Elsevier Science B.V. All rights reserved.
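As a hedged illustration only (not the authors' model), the sketch below shows how intervention coverage and efficacy scale each host's contribution to a two-host effective reproduction number, with elimination requiring the total to fall below one. All parameter values, including the split of transmission between humans and bovines, are invented assumptions:

    # Toy two-host reproduction-number calculation; illustrative only, not the
    # paper's model. All parameter values below are assumptions.
    def effective_r(r0_human, r0_bovine,
                    human_cov=0.0, human_eff=0.0,
                    bovine_cov=0.0, bovine_eff=0.0):
        """Scale each host's transmission contribution by its intervention."""
        r_h = r0_human * (1.0 - human_cov * human_eff)
        r_b = r0_bovine * (1.0 - bovine_cov * bovine_eff)
        return r_h + r_b  # elimination requires this total to fall below 1

    # Hypothetical baseline with bovine-dominated transmission (total R0 = 1.8).
    print(effective_r(0.3, 1.5, bovine_cov=0.75, bovine_eff=0.75))  # ~0.96 < 1
    print(effective_r(0.3, 1.5, bovine_cov=0.75, bovine_eff=0.45))  # ~1.29 > 1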
Abstract:
Objective To determine the relative importance of recognised risk factors for non-haemorrhagic stroke, including serum cholesterol and the effect of cholesterol-lowering therapy, on the occurrence of non-haemorrhagic stroke in patients enrolled in the LIPID (Long-term Intervention with Pravastatin in Ischaemic Disease) study. Design The LIPID study was a placebo-controlled, double-blind trial of the effect of pravastatin therapy on coronary heart disease mortality over 6 years in 9014 patients with previous acute coronary syndromes and baseline total cholesterol of 4-7 mmol/l. Following identification of patients who had suffered non-haemorrhagic stroke, a pre-specified secondary end point, multivariate Cox regression was used to determine risk in the total population. Time-to-event analysis was used to determine the effect of pravastatin therapy on the rate of non-haemorrhagic stroke. Results There were 388 non-haemorrhagic strokes in 350 patients. Factors conferring risk of future non-haemorrhagic stroke were age, atrial fibrillation, prior stroke, diabetes, hypertension, systolic blood pressure, cigarette smoking, body mass index, male sex and creatinine clearance. Baseline lipids did not predict non-haemorrhagic stroke. Treatment with pravastatin reduced non-haemorrhagic stroke by 23% (P = 0.016) when considered alone, and by 21% (P = 0.024) after adjustment for other risk factors. Conclusions The study confirmed the variety of risk factors for non-haemorrhagic stroke. From the risk predictors, a simple prognostic index was created for non-haemorrhagic stroke to identify a group of patients at high risk. Treatment with pravastatin resulted in significant additional benefit after allowance for risk factors. (C) 2002 Lippincott Williams & Wilkins.
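For readers who want to see the shape of such an analysis, a multivariate Cox model with these covariates can be fitted with the lifelines package; this is an illustrative sketch only, and the file and column names are hypothetical, not the LIPID data:

    # Illustrative sketch with hypothetical data; not the study's code or dataset.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("stroke_cohort.csv")  # hypothetical file, one row per patient
    covariates = ["age", "atrial_fibrillation", "prior_stroke", "diabetes",
                  "hypertension", "systolic_bp", "smoking", "bmi", "male_sex",
                  "creatinine_clearance", "pravastatin"]

    cph = CoxPHFitter()
    cph.fit(df[covariates + ["years_to_event", "nh_stroke"]],
            duration_col="years_to_event", event_col="nh_stroke")
    # Hazard ratios per covariate; a pravastatin hazard ratio near 0.79 would
    # correspond to the reported 21% adjusted reduction.
    cph.print_summary()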
Abstract:
Activated sludge flocculation was modelled using population balances. The model followed the dynamics of activated sludge flocculation, providing a good approximation of the change in mean floc size with time. Increasing the average velocity gradient decreased the final floc size. The breakage rate coefficient and collision efficiency also varied with the average velocity gradient. A power-law relationship was found for the increase in the breakage rate coefficient with increasing average velocity gradient. Further investigation will be conducted to determine the relationship between collision efficiency and particle size, to provide a better approximation of dynamic changes in the floc size distribution during flocculation. (C) 2002 Published by Elsevier Science B.V.
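The reported power-law dependence of the breakage rate coefficient on the average velocity gradient G can be recovered from calibration results with a simple curve fit. In the sketch below the data points are invented placeholders, not the paper's measurements:

    # Fit a power law k_b = a * G**b to hypothetical calibration data.
    import numpy as np
    from scipy.optimize import curve_fit

    G = np.array([20.0, 40.0, 60.0, 80.0, 100.0])              # velocity gradient, 1/s
    k_b = np.array([0.8e-4, 2.1e-4, 3.9e-4, 6.0e-4, 8.5e-4])   # breakage rate coefficient

    def power_law(g, a, b):
        return a * g ** b

    (a, b), _ = curve_fit(power_law, G, k_b, p0=(1e-5, 1.0))
    print(f"k_b ~ {a:.2e} * G^{b:.2f}")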
Abstract:
A technique based on laser light diffraction is shown to be successful in collecting on-line experimental data. Time series of floc size distributions (FSD) under different shear rates (G) and calcium additions were collected. The steady-state mass mean diameter decreased with increasing shear rate G and increased when calcium additions exceeded 8 mg/l. A so-called population balance model (PBM) was used to describe the experimental data. This kind of model describes both aggregation and breakage through birth and death terms. A discretised PBM was used since analytical solutions of the integro-partial differential equations do not exist. Despite the complexity of the model, only two parameters need to be estimated: the aggregation rate and the breakage rate. The model seems, however, to lack flexibility, and its description of the floc size distribution (FSD) over time is not accurate.
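A heavily simplified sketch of what a discretised aggregation-breakage population balance looks like, with the two rate parameters the abstract mentions. The size grid, the kernels (taken constant) and the parameter values are illustrative assumptions, not the model from the paper:

    # Toy discretised PBM on a doubling size grid with constant kernels.
    import numpy as np
    from scipy.integrate import solve_ivp

    n_bins = 20
    k_agg, k_br = 1e-3, 5e-2       # aggregation and breakage rates (assumed)

    def pbm_rhs(t, N):
        dN = np.zeros_like(N)
        for i in range(n_bins):
            if i > 0:              # birth: two class-(i-1) flocs aggregate into class i
                dN[i] += 0.5 * k_agg * N[i - 1] ** 2
            # death: aggregation with any floc, plus breakage out of class i
            dN[i] -= k_agg * N[i] * N.sum() + k_br * N[i]
            if i < n_bins - 1:     # birth: binary breakage of the class above
                dN[i] += 2.0 * k_br * N[i + 1]
        return dN

    N0 = np.zeros(n_bins)
    N0[0] = 1e3                    # start from primary particles only
    sol = solve_ivp(pbm_rhs, (0.0, 600.0), N0, method="LSODA")
    print(sol.y[:, -1])            # approximate steady-state class populations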
Abstract:
Objective: It has been suggested that parental occupation, particularly farming, increases the risk of Ewing's sarcoma in the offspring. In a national case-control study we examined the relationship between farm and other parental occupational exposures and the risk of cancer in the offspring. Methods: Cases were 106 persons with confirmed Ewing's sarcoma or peripheral primitive neuroectodermal tumor. Population-based controls (344) were selected randomly via telephone. Information was collected by interview (84% face-to-face). Results: We found an excess of case mothers who worked on farms at conception and/or pregnancy (odds ratio (OR) = 2.3, 95% confidence interval (CI) 0.5-12.0) and a slightly smaller excess of farming fathers; more case mothers usually worked as laborers, machine operators, or drivers (OR = 1.8, 95% CI 0.9-3.9). Risk doubled for those whose mothers handled pesticides and insecticides, or whose fathers handled solvents and glues, and oils and greases. Further, more cases lived on farms (OR = 1.6, 95% CI 0.9-2.8). In the 0-20 years age group, risk doubled for those who had ever lived on a farm (OR = 2.0, 95% CI 1.0-3.9), and more than tripled for those with farming fathers at conception and/or pregnancy (OR = 3.5, 95% CI 1.0-11.9). Conclusions: Our data support the general hypothesis of an association between the Ewing's sarcoma family of tumors and farming, particularly among younger subjects, who represent the bulk of cases and are more likely to share etiologic factors.
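For readers unfamiliar with the statistics quoted above, an odds ratio and its Woolf 95% confidence interval follow directly from a 2x2 exposure table; the counts below are hypothetical, not the study's data:

    # Odds ratio with a Woolf (log-normal) 95% CI from a 2x2 table.
    import math

    exposed_cases, unexposed_cases = 12, 94         # e.g. lived on a farm vs not
    exposed_controls, unexposed_controls = 25, 319  # hypothetical counts

    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    se = math.sqrt(1 / exposed_cases + 1 / unexposed_cases +
                   1 / exposed_controls + 1 / unexposed_controls)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")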
Abstract:
The Agricultural Production Systems Simulator (APSIM) is a modular modelling framework that has been developed by the Agricultural Production Systems Research Unit in Australia. APSIM was developed to simulate biophysical processes in farming systems, in particular where there is interest in the economic and ecological outcomes of management practice in the face of climatic risk. The paper outlines APSIM's structure and provides details of the concepts behind the different plant, soil and management modules. These modules include a diverse range of crops, pastures and trees; soil processes including water balance, N and P transformations, soil pH and erosion; and a full range of management controls. Reports of APSIM testing in a diverse range of systems and environments are summarised. An example of model performance in a long-term cropping systems trial is provided. APSIM has been used in a broad range of applications, including support for on-farm decision making, farming systems design for production or resource management objectives, assessment of the value of seasonal climate forecasting, analysis of supply chain issues in agribusiness activities, development of waste management guidelines, risk assessment for government policy making, and as a guide to research and education activity. An extensive citation list for these model testing and application studies is provided. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
Abstract:
We focus on mixtures of factor analyzers from the perspective of a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. Working in this reduced space allows a model for each component-covariance matrix with complexity lying between that of the isotropic and full covariance structure models. We illustrate the use of mixtures of factor analyzers in a practical example that considers the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
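To make the parameter economy concrete: under a mixture of factor analyzers each component covariance takes the low-rank-plus-diagonal form Sigma = Lambda Lambda^T + Psi with a q-dimensional latent factor space, so its free-parameter count grows linearly rather than quadratically in p. The dimensions below are arbitrary illustrations, not the paper's data:

    # Free parameters for one p-dimensional Gaussian component (illustrative p, q).
    import numpy as np

    p, q = 1000, 5                       # observed dimension, latent factors
    print(p * (p + 1) // 2)              # full covariance: 500500
    print(p * q + p)                     # Lambda (p x q) plus diagonal Psi: 6000
    print(1)                             # isotropic sigma^2 * I: 1

    rng = np.random.default_rng(0)
    Lam = rng.normal(size=(p, q))
    Psi = np.abs(rng.normal(size=p)) + 0.1
    Sigma = Lam @ Lam.T + np.diag(Psi)   # valid (positive definite) covariance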
Abstract:
It is becoming increasingly clear that species of smaller body size tend to be less vulnerable to contemporary extinction threats than larger species, but few studies have examined the mechanisms underlying this pattern. In this paper, data for the Australian terrestrial mammal fauna are used to ask whether higher reproductive output or smaller home ranges can explain the reduced extinction risk of smaller species. Extinct and endangered species do indeed have smaller litters and larger home ranges for their body size than expected under a null model. In multiple regressions, however, only litter size is a significant predictor of extinction risk once body size and phylogeny are controlled for. Larger litters contribute to fast population growth, and are probably part of the reason that smaller species are less extinction-prone. The effect of litter size varies between the mesic coastal regions and the arid interior of Australia, indicating that the environment a species inhabits mediates the effect of biology on extinction risk. These results suggest that predicting extinction risk from biological traits is likely to be a complex task which must explicitly consider interactions between biology and environment.
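A minimal sketch of the kind of regression described, with litter size and home range as predictors and body size entered as a control; the dataset and column names are hypothetical, and the phylogenetic correction used in the paper is omitted here:

    # Illustrative only: hypothetical data, no phylogenetic correction.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("mammal_traits.csv")   # hypothetical trait table
    model = smf.ols("extinction_risk ~ np.log(litter_size) + np.log(home_range)"
                    " + np.log(body_mass)", data=df).fit()
    print(model.summary())   # litter size should stay significant if the
                             # paper's pattern holds in these data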
Abstract:
We compare Bayesian methodology utilizing the freeware package BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented with symptoms associated with osteoarthritis in joints of the hand.
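The data-generating process behind such simulations is an ACE decomposition: additive genetic effects correlate 1.0 in MZ and 0.5 in DZ pairs, the common environment is fully shared, and the unique environment is not. The sketch below simulates pairs and applies Falconer's crude estimates h2 = 2(rMZ - rDZ) and c2 = 2rDZ - rMZ; the variance shares are illustrative assumptions, and the Bayesian and SEM fitting compared in the paper are omitted:

    # Simulate twin pairs under an ACE model (illustrative variance shares).
    import numpy as np
    rng = np.random.default_rng(1)

    n, a2, c2 = 2000, 0.5, 0.2             # pairs per zygosity; A and C shares
    e2 = 1.0 - a2 - c2                     # unique environment share

    def simulate_pairs(r_a, n):
        """Phenotypes for n twin pairs; r_a is the additive genetic correlation."""
        g1 = rng.normal(size=n) * np.sqrt(a2)
        g2 = r_a * g1 + np.sqrt(1 - r_a ** 2) * rng.normal(size=n) * np.sqrt(a2)
        c = rng.normal(size=n) * np.sqrt(c2)           # fully shared
        y1 = g1 + c + rng.normal(size=n) * np.sqrt(e2)
        y2 = g2 + c + rng.normal(size=n) * np.sqrt(e2)
        return y1, y2

    r_mz = np.corrcoef(*simulate_pairs(1.0, n))[0, 1]
    r_dz = np.corrcoef(*simulate_pairs(0.5, n))[0, 1]
    print(f"h2 ~ {2 * (r_mz - r_dz):.2f}, c2 ~ {2 * r_dz - r_mz:.2f}")  # Falconer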
Abstract:
This paper presents an analysis of personal respirable coal dust measurements recorded by the Joint Coal Board in the underground longwall mines of New South Wales from 1985 to 1999. A description of the longwall mining process is given. In the study, 11 829 measurements from 33 mines were analysed and the results are given for each occupation, for seven occupational groups, for individual de-identified mines and for each year of study. The mean respirable coal dust concentration for all jobs was 1.51 mg/m³ (SD 1.08 mg/m³). Only 6.9% of the measurements exceeded the Australian exposure standard of 3 mg/m³. Published exposure-response relationships were used to predict the prevalence of progressive massive fibrosis and the mean loss of FEV1 after a working lifetime (40 years) of exposure to the mean observed concentration of 1.5 mg/m³. Prevalences of 1.3 and 2.9% were predicted, based on data from the UK and the USA, respectively. The mean loss of FEV1 was estimated to be 73.7 ml.
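As a back-of-envelope check on how such predictions arise: cumulative exposure is mean concentration times years worked, and a linear exposure-response slope converts it into FEV1 loss. The slope below is a hypothetical placeholder chosen so the arithmetic reproduces the abstract's figure; it is not necessarily the published coefficient:

    # Hypothetical linear exposure-response arithmetic (slope is a placeholder).
    mean_conc = 1.5                         # mg/m^3 respirable coal dust
    working_years = 40                      # working lifetime
    cumulative = mean_conc * working_years  # 60 (mg/m^3)-years

    slope_ml = 1.23                         # ml FEV1 lost per (mg/m^3)-year, assumed
    print(cumulative * slope_ml)            # 73.8 ml, close to the 73.7 ml reported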
Abstract:
In this study, we examine an important factor that affects consumers' acceptance of business-to-consumer (B2C) electronic commerce: perceived risk. The objective of this paper is to examine the definition of perceived risk in the context of B2C electronic commerce. The paper highlights the importance of perceived risk and the interwoven relation between perceived risk and trust. It discusses the problem of defining perceived risk in prior B2C research. This study proposes a new classification of consumers' perceived risk based on its sources, highlighting the importance of identifying the sources of consumers' risk perceptions in addition to the consequence dimensions. Two focus group discussion sessions were conducted to verify the proposed classification. Results indicate that Internet consumers perceive three sources of risk in B2C electronic commerce: technology, vendor, and product. © 2003 Elsevier B.V. All rights reserved.
Abstract:
Conceptual modelling is an activity undertaken during information systems development work to build a representation of selected semantics about some real-world domain. Ontological theories have been developed to account for the structure and behavior of the real world in general. In this paper, I discuss why ontological theories can be used to inform conceptual modelling research, practice, and pedagogy. I provide examples from my research to illustrate how a particular ontological theory has enabled me to improve my understanding of certain conceptual modelling practices and grammars. I also describe how some colleagues and I have used this theory to generate several counter-intuitive, sometimes surprising predictions about widely advocated conceptual modelling practices - predictions that were subsequently supported in empirical research we undertook. Finally, I discuss several possibilities and pitfalls I perceive to be associated with using ontological theories to underpin research on conceptual modelling.
Abstract:
This article surveys examples of recent research into adolescent risk behaviors drawn from a variety of disciplines and methodologies, reflecting the range of researchers interested in this area and showing how their attention to communication and language exemplifies the breadth of the field.