989 results for risk-neutral densities
Abstract:
Land application of piggery effluent (containing urine, faeces, water, and wasted feed) is under close scrutiny as a potential source of water resource contamination with phosphorus (P). This paper investigates two case studies of the impact of long-term piggery effluent-P application to soil. A Natrustalf (Sodosol) at P1 has received a net load of 3700 kg effluent P/ha over 19 years. The Haplustalf (Dermosol) selected (P2) has received a net load of 310 000 kg P/ha over 30 years. Total, bicarbonate-extractable, and soluble P forms were determined throughout the soil profiles for paired (irrigated and unirrigated) sites at P1 and P2, as were P sorption and desorption characteristics. Surface bicarbonate-extractable P (PB, 0-0.05 m depth) and dilute CaCl2-extractable molybdate-reactive P (PC) were significantly elevated by effluent irrigation (P1: PB unirrigated 23±1, irrigated 290±6; PC unirrigated 0.03±0.00, irrigated 23.9±0.2. P2: PB unirrigated 72±48, irrigated 3950±1960; PC unirrigated 0.7±0.0, irrigated 443±287 mg P/kg; mean±s.d.). Phosphorus enrichment to 1.5 m depth, detected as PB, was observed at P2. Elevated concentrations of CaCl2-extractable organic P forms (POC; estimated as non-molybdate-reactive P in centrifuged supernatants) were observed from the soil surface of P1 to a depth of 0.4 m. Despite the extent of effluent application at both sites, only P1 displayed evidence of significant accumulation of POC. The increase in surface soil total P (0-0.05 m depth) due to effluent irrigation was much greater than laboratory P sorption (>25 times for P1; >57 times for P2) for a comparable range of final solution concentrations (desorption extracts ranged from 1-5 mg P/L for P1 and 50-80 mg P/L for P2). Precipitation of sparingly soluble P phases was evidenced in the soils of the P2 effluent application area.
Abstract:
Aim: The purpose of this study was to determine the percentage of patients assessed as malnourished using the Subjective Global Assessment in two hospitals in Ho Chi Minh City and Can Tho across multiple wards, and to investigate the association with factors including gender, age, days since admission, medical diagnosis and number of medications used. Methods: This cross-sectional study involved 205 inpatients from a hospital in Ho Chi Minh City and 78 inpatients and 89 outpatients from a hospital in Can Tho. Malnutrition status was assessed using the Subjective Global Assessment. Ward, gender, age, medical diagnosis, time since admission and medication number were extracted from medical records. Results: 35.6% of inpatients and 9.0% of outpatients were malnourished. Multivariate analysis revealed that the factors predicting malnutrition status in inpatients (OR (95% CI)) were: age (OR = 1.03 (1.01-1.06)); cancer diagnosis (OR = 34.25 (3.16-370.89)); respiratory ward (OR = 11.49 (1.05-125.92)); and general medicine ward (OR = 20.34 (2.10-196.88)). Conclusions: Results indicate that malnutrition is a common problem in hospitals in Vietnam. Further research is needed to confirm this finding across a wider range of hospitals and to investigate the feasibility and efficacy of implementing nutrition interventions in hospital settings.
Abstract:
The rate of severe depression among women in single-parent and biological families and in a variety of stepfamilies was examined in a large community sample of 13,088 pregnant women in the United Kingdom. Compared with women in biological families and published population rates, women in single-parent families and stepfamilies reported significantly elevated rates of depression. Family-type differences in several risk factors were examined, including cohabiting (vs. married) status, relationship history, and socioeconomic and psychosocial risks such as crowding, social support, and stressful life events. Family-type differences in depression were mediated partly by differences in social support, stressful life events, and crowding, but a main effect of family type in predicting depression remained after statistically controlling for these risks.
Abstract:
The emission from neutral hydrogen (HI) clouds in the post-reionization era (z ≤ 6), too faint to be detected individually, is present as a diffuse background in all low-frequency radio observations below 1420 MHz. The angular and frequency fluctuations of this radiation (~1 mK) are an important future probe of the large-scale structures in the Universe. We show that such observations are a very effective probe of the background cosmological model and the perturbed Universe. In our study we focus on the possibility of determining the redshift-space distortion parameter β, the coordinate distance r(ν), and its derivative with redshift r′(ν). Using reasonable estimates for the observational uncertainties and configurations representative of ongoing and upcoming radio interferometers, we predict parameter estimation at a precision comparable with supernova Ia observations and galaxy redshift surveys, across a wide range in redshift that is only partially accessed by other probes. Future HI observations of the post-reionization era present a new technique, complementing several existing ones, to probe the expansion history and to elucidate the nature of the dark energy.
Abstract:
Sectors of the forest plantation industry in Australia are set to expand in the near future using species or hybrids of the spotted gums (Corymbia, Section Politaria). Plantations of these taxa have already been introduced across temperate and subtropical Australia, representing locally exotic introductions from native stands in Queensland and New South Wales. A literature review was undertaken to provide insights into the potential for pollen-mediated gene flow from these plantations into native populations. Three factors suggest that such gene flow is likely: (1) interspecific hybridisation within the genus has frequently been recorded, including between distantly related species from different sections; (2) apparently high levels of vertebrate pollinator activity may result in plantation pollen being moved over hundreds of kilometres; (3) much of the plantation estate is being established among closely related taxa, so few barriers to gene flow are expected. Across Australia, 20 of the 100 native Corymbia taxa were found to have regional-level co-occurrence with plantations. These were located most notably within regions of north-east New South Wales and south-east Queensland; however, co-occurrence was also found in south-west Western Australia and eastern Victoria. The native species found to co-occur were then assessed for the presence of reproductive barriers at each step in the process of gene flow that may reduce the number of species at risk even further. The available data suggest three risk categories exist for Corymbia. The highest risk was for gene flow from plantations of spotted gums to native populations of spotted gums. This was based on the expected limited existence of pre- and post-zygotic barriers, substantial long-distance pollen dispersal and an apparently broad period of flowering in Corymbia citriodora subsp. variegata plantations. The second risk category concerned gene flow from Corymbia torelliana × C. c. variegata hybrid plantations into native C. c. variegata, as the barriers associated with the production and establishment of F1 hybrids have been circumvented. For the lowest risk category, Corymbia plantations may present a risk to other non-spotted gum species; however, further investigation of the particular cross-combinations is required. A list of research directions is provided to better quantify these risks. Empirical data will need to be combined within a risk assessment framework that will not only estimate the likelihood of exotic gene flow but also consider the conservation status and value of the native populations. In addition, the potential impacts of pollen flow from plantations will need to be weighed against their various economic and environmental benefits.
Level of contribution of intrinsic risk factors to the management of patients with plantar heel pain
Abstract:
Introduction: Injuries of the lower extremity are considered to have multifactorial causes, and heel pain is among the most frequent reasons for visits to health professionals. Managing these patients can be very difficult. The purpose of this research is to identify key variables that can influence foot health in patients with heel pain. Materials and method: A cross-sectional observational study was carried out with a sample of sixty-two participants recruited from the Educational Welfare Unit of the University of Malaga. Therapists, blinded for the study, recorded anthropometric data and the Foot Posture Index (FPI), while participants completed the Foot Health Status Questionnaire (FHSQ). The most significant results reveal a moderate relation between the clinical variables and the FHSQ domains, with BMI making the most significant contribution to foot health status. Conclusion: The variables that can help manage clinical subjects with heel pain are age, BMI, footwear and FPI (left foot).
Abstract:
This study evaluates the effectiveness and social implications of home monitoring of 31 infants at risk of sudden infant death syndrome (SIDS). Thirteen siblings of children who died of SIDS, nine near-miss SIDS infants and nine preterm infants with apnoea persisting beyond 40 weeks postconceptional age were monitored from a mean age of 15 days to a mean of 10 months. Chest movement detection monitors were used in 27 infants and thoracic impedance monitors in four. Genuine apnoeic episodes were reported by 21 families, and 13 infants required resuscitation. Apnoeic episodes occurred in all nine preterm infants but in only five (38%) of the siblings of SIDS victims (P<0.05). Troublesome false alarms were a major problem, occurring with 61% of the infants, and were more common with the preterm infants than with the siblings. All but two couples stated that the monitor decreased anxiety and improved their quality of life. Most parents accepted that the social restrictions imposed by the monitor were part of the caring process, but four couples were highly resentful of the changes imposed on their lifestyle. The monitors used were far from ideal, with malfunction occurring in 17, necessitating replacement in six, repair in six and cessation of monitoring in three. The parents became ingenious in modifying the monitors to their own individual requirements. Although none of these 31 ‘at risk’ infants died, the study sample was far too small to conclude whether home monitoring prevented any cases of SIDS.
Abstract:
Purpose: It is common for head and neck patients to be affected by time-trend errors as a result of weight loss during a course of radiation treatment. The objective of this planning study was to investigate the impact of weight loss on Volumetric Modulated Arc Therapy (VMAT) as well as Intensity Modulated Radiation Therapy (IMRT) for locally advanced head and neck cancer using automatic co-registration of the CBCT. Methods and Materials: A retrospective analysis of previously treated IMRT plans for 10 patients with locally advanced head and neck cancer was performed, and a VMAT plan was also produced for each patient. We calculated dose–volume histogram (DVH) indices for the spinal cord and brainstem planning at risk volumes (SC+0.5cm and BS+0.5cm PRVs, respectively), as well as the mean dose to the parotid glands. Results: The mean difference in dose to the SC+0.5cm was 1.03% and 1.27% for the IMRT and VMAT plans, respectively. For dose to the BS+0.5cm, the percentage difference was 0.63% for the IMRT plans and 0.61% for the VMAT plans. Analysis of the parotid gland doses shows that the percentage change in mean dose to the left parotid was -8.0% whereas that of the right parotid was -6.4% for the IMRT treatment plans. In the VMAT plans, the percentage changes for the left and right parotid glands were -6.6% and -6.7%, respectively. Conclusions: This study shows a clinically significant impact of weight loss on the DVH indices analysed in head and neck organs at risk. It highlights the importance of adaptive radiotherapy in head and neck patients if organ-at-risk sparing is to be maintained.
Abstract:
Objective This prospective longitudinal study aims to determine the risk factors for wandering-related adverse consequences in community-dwelling persons with mild dementia. These adverse consequences include negative outcomes of wandering (falls, fractures, and injuries) and eloping behavior. Methods We recruited 143 dyads of persons with mild dementia and their caregivers from a veterans' hospital and a memory clinic in Florida. Wandering-related adverse consequences were measured using the Revised Algase Wandering Scale – Community Version. Variables such as personality (Big Five Inventory), behavioral response to stress, gait and balance (Tinetti Gait and Balance), wayfinding ability (Wayfinding Effectiveness Scale), and neurocognitive abilities (attention, cognition, memory, language/verbal skills, and executive functioning) were also measured. Bivariate and logistic regression analyses were performed to assess the predictors of these wandering-related adverse consequences. Results A total of 49% of the study participants had falls, fractures, and injuries due to wandering behavior, and 43.7% demonstrated eloping behaviors. Persistent walking (OR = 2.6) and poor gait (OR = 0.9) were significant predictors of negative outcomes of wandering, while persistent walking (OR = 13.2) and passivity (OR = 2.55) predicted eloping behavior. However, there were no correlations between wandering-related adverse consequences and participants' characteristics (age, gender, race, ethnicity, and education), health status (Charlson comorbidity index), or neurocognitive abilities. Conclusion Our results highlight the importance of identifying at-risk individuals so that effective interventions can be developed to reduce or prevent the adverse consequences of wandering.
Abstract:
We consider an enhancement of the CreditRisk+ model that incorporates correlations between sectors. We model the sector default rates as linear combinations of a common set of independent random variables that represent macroeconomic variables or risk factors. We also derive a formula for exact value-at-risk (VaR) contributions at the obligor level.
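A minimal sketch of the construction described above: sector default-rate multipliers are built as linear combinations of a shared set of independent gamma-distributed risk factors, so sectors that load on the same factors become correlated. The factor shapes and the weight matrix below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_factors, n_sectors, n_draws = 3, 2, 100_000

# Independent gamma risk factors, each normalised to unit mean
# (shape k, scale 1/k). Shapes are made-up illustrative values.
shapes = np.array([2.0, 4.0, 8.0])
factors = rng.gamma(shapes, 1.0 / shapes, size=(n_draws, n_factors))

# Each sector's default-rate multiplier is a convex combination of the
# common factors; shared factors induce cross-sector correlation.
weights = np.array([[0.7, 0.3, 0.0],
                    [0.2, 0.3, 0.5]])   # rows sum to 1 per sector
sector_rates = factors @ weights.T      # shape (n_draws, n_sectors)

corr = np.corrcoef(sector_rates.T)[0, 1]
print(f"induced sector correlation = {corr:.2f}")
```

Because both weight rows sum to one, the simulated multipliers keep unit mean, and the overlap in factor loadings alone determines the correlation between the two sectors.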
Abstract:
To examine healthy slaughter-age cattle and sheep on-farm for the excretion of Salmonella serovars in faeces, and to identify possible risk factors using a questionnaire. The study involved 215 herds and flocks in the four eastern states of Australia, 56 with a prior history of salmonellosis. Production systems examined included pasture beef cattle, feedlot beef cattle, dairy cattle, prime lambs and mutton sheep; all animals were at slaughter age. From each herd or flock, 25 animals were sampled and the samples pooled for Salmonella culture. All Salmonella isolates were serotyped and any Salmonella Typhimurium isolates were phage typed. Questionnaires, prepared in Epi Info 6.04, were designed to identify risk factors associated with Salmonella spp. excretion, with a separate questionnaire designed for each production system. Salmonellae were identified in all production systems and were more commonly isolated from dairies and beef feedlots than from other systems. Statistical analysis revealed that dairy cattle were significantly more likely to shed Salmonella in faeces than pasture beef cattle, mutton sheep and prime lambs (P < 0.05). A wide diversity of Salmonella serovars, all of which have been isolated from humans in Australia, was identified in both cattle and sheep. Analysis of the questionnaires showed that access to new arrivals was a significant risk factor for Salmonella excretion on dairy properties. For beef feedlots, the presence of large numbers of flies in the feedlot pens or around stored manure was a significant risk factor for Salmonella excretion. Dairy cattle pose the highest risk of all the slaughter-age animals tested. Some of the identified risk factors can be overcome by improved management practices, especially in relation to hygiene.
Abstract:
Online grocery shopping has enjoyed strong growth, and it is predicted this channel will continue to grow exponentially in the coming years. While online shopping has attracted an abundance of research interest, examinations of online grocery shopping behaviour are only now emerging. Shopping online for groceries differs considerably from general online shopping due to the perishability and variability of the product and the frequency of the shopping activity. Two salient gaps underpin this research into online grocery shopping. First, this study responds to calls to investigate the online shopper's experience in the context of online purchasing frequency. Second, it examines the mediating effect of perceived risk between trust and online repurchase intention for groceries. An online survey was employed to collect data from shoppers recruited from a multi-channel grocery e-retailer's database. The survey, comprising 16 reflective validated scale items, was sent to 555 frequent and infrequent online grocery shoppers. Results show that while customer satisfaction predicts trust for both infrequent and frequent online grocery shoppers, perceived risk fully mediates the effect of trust on repurchase intentions for infrequent online grocery shoppers. Furthermore, path analysis reveals that the developed behavioural model is variant across the two groups of shoppers. Theoretically, we provide a deeper understanding of the online customer experience, while gaining insight into two shopper segments identified as being important to grocery e-retailers. For managers, this study tests an online customer behavioural model against actual purchasing behaviour and identifies the continued presence of perceived risk in grocery e-retailing regardless of purchase frequency or experience.
Abstract:
BACKGROUND: Glyphosate-resistant cotton varieties are an important tool for weed control in Australian cotton production systems. To increase the sustainability of this technology and to minimise the likelihood of resistance evolving through its use, weed scientists, together with herbicide regulators, industry representatives and the technology owners, have developed a framework that guides the use of the technology. Central to this framework are a crop management plan (CMP) and a grower accreditation course. A simulation model that takes into account the characteristics of the weed species, initial gene frequencies and any associated fitness penalties was developed to ensure that the CMP was sufficiently robust to minimise resistance risks. RESULTS: The simulations showed that, when a combination of weed control options was employed in addition to glyphosate, resistance did not evolve over the 30-year period of the simulation. CONCLUSION: These simulations underline the importance of maintaining an integrated system for weed management to prevent the evolution of glyphosate resistance, prolonging the use of glyphosate-resistant cotton.
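The paper's simulation model is not reproduced here, but the dynamic it captures can be illustrated with a hypothetical toy model: track yearly seed numbers of susceptible and resistant weeds under glyphosate selection, with and without an additional non-glyphosate control tactic. All kill rates, the fecundity and the fitness penalty below are made-up illustrative values, not parameters from the study.

```python
def simulate(years=30, seeds_s=1e6, seeds_r=1.0,
             glyphosate_kill=0.95, resistant_kill=0.05,
             other_kill=0.0, fecundity=5.0, fitness_penalty=0.1):
    """Toy yearly dynamics of susceptible (S) and resistant (R) weed seeds.

    glyphosate_kill / resistant_kill: fraction of S / R plants killed by
    glyphosate; other_kill: fraction of all plants killed by additional
    non-selective tactics; fitness_penalty: cost of the resistance allele.
    """
    for _ in range(years):
        seeds_s = seeds_s * (1 - glyphosate_kill) * (1 - other_kill) * fecundity
        seeds_r = (seeds_r * (1 - resistant_kill) * (1 - other_kill)
                   * (1 - fitness_penalty) * fecundity)
    return seeds_s, seeds_r

# Glyphosate alone: resistant seeds multiply ~4.3-fold each year and the
# resistant type comes to dominate within the 30-year horizon.
_, r_alone = simulate()
# Integrated control (80% extra kill from other tactics): resistant seed
# numbers decline every year, so resistance never establishes.
_, r_integrated = simulate(other_kill=0.8)
print(r_alone > 1e6, r_integrated < 1.0)  # prints: True True
```

The point mirrors the abstract's conclusion: tactics that also remove resistant survivors keep the resistant sub-population from compounding, even though glyphosate alone strongly selects for it.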
Abstract:
Over 1 billion ornamental fish, comprising more than 4000 freshwater and 1400 marine species, are traded internationally each year, with 8-10 million imported into Australia alone. Compared with other commodities, the pathogens and disease translocation risks associated with this pattern of trade have been poorly documented. The aim of this study was to appraise the effectiveness of risk analysis and quarantine controls as they are applied under the Sanitary and Phytosanitary (SPS) agreement in Australia. Ornamental fish originate from about 100 countries and the hazards are mostly unknown; since 2000 there have been 16-fold fewer scientific publications on ornamental fish disease than on farmed fish disease, and 470-fold fewer than on disease in terrestrial species (cattle). The import quarantine policies of a range of countries were reviewed and classified as stringent or non-stringent based on the levels of pre-border and border controls. Australia has a stringent policy which includes pre-border health certification and a mandatory quarantine period at the border of 1-3 weeks in registered quarantine premises supervised by government quarantine staff. Despite these measures there have been many disease incursions, as well as establishment of significant exotic viral, bacterial, fungal, protozoal and metazoan pathogens from ornamental fish in farmed native Australian fish and free-living introduced species. Recent examples include Megalocytivirus and an atypical strain of Aeromonas salmonicida. In 2006, there were 22 species of alien ornamental fish with established breeding populations in Australian waterways, and freshwater plants and molluscs have also been introduced, providing a direct transmission pathway for establishment of pathogens in native fish species.
Australia's stringent quarantine policies for imported ornamental fish are based on import risk analysis under the SPS agreement but have not provided an acceptable level of protection (ALOP) consistent with government objectives to prevent the introduction of pests and diseases, promote the development of future aquaculture industries, or maintain biodiversity. It is concluded that the risk analysis process described by the Office International des Epizooties (OIE) under the SPS agreement cannot be used in a meaningful way for current patterns of ornamental fish trade. Transboundary disease incursions will continue and exotic pathogens will become established in new regions as a result of the ornamental fish trade, and this will be an international phenomenon. Ornamental fish represent a special case in live animal trade where OIE guidelines for risk analysis need to be revised. Alternatively, for countries such as Australia with an implied very high ALOP, the number of species traded and the number of sources permitted need to be dramatically reduced to facilitate hazard identification, risk assessment and import quarantine controls.
Lead papers of the eleventh symposium of the International Society for Veterinary Epidemiology and Economics (ISVEE), Cairns, Australia.
Abstract:
Recent incidents of mycotoxin contamination (particularly aflatoxins and fumonisins) have demonstrated a need for an industry-wide management system to ensure Australian maize meets the requirements of all domestic users and export markets. Results of recent surveys are presented, demonstrating overall good conformity with nationally accepted industry marketing standards but with occasional samples exceeding these levels. This paper describes mycotoxin-related hazards inherent in the Australian maize production system and a methodology combining good agricultural practices and the hazard analysis critical control point framework to manage risk.