916 results for Wetness duration
Abstract:
Traffic incidents are key contributors to non-recurrent congestion, potentially generating significant delay. Factors that influence the duration of incidents are important to understand so that effective mitigation strategies can be implemented. To identify and quantify the effects of influential factors, a methodology for studying total incident duration based on historical data from an ‘integrated database’ is proposed. Incident duration models are developed for a selected freeway segment of the Southeast Queensland, Australia, network. The models include incident detection and recovery time as components of incident duration. A hazard-based duration modelling approach is applied to model incident duration as a function of a variety of factors that influence traffic incident duration. Parametric accelerated failure time survival models are developed to capture heterogeneity as a function of explanatory variables, with both fixed- and random-parameter specifications. The analysis reveals that factors affecting incident duration include incident characteristics (severity, type, injury, medical requirements, etc.), infrastructure characteristics (roadway shoulder availability), time of day, and traffic characteristics. The results indicate that durations differ distinctly by event type, thus requiring different responses to clear them effectively. Furthermore, the results highlight the presence of unobserved incident duration heterogeneity as captured by the random parameter models, suggesting that additional factors need to be considered in future modelling efforts.
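The parametric accelerated failure time (AFT) approach described in this abstract can be illustrated with a minimal sketch. The covariates and coefficient values below are hypothetical placeholders, not estimates from the study; the sketch only shows how a Weibull AFT specification turns covariates into a survival curve for incident duration (here, in minutes).

```python
import math

def weibull_aft_survival(t, covariates, betas, intercept, sigma):
    """Survival probability S(t | x) under a Weibull AFT model.

    In an AFT specification, covariates rescale time:
        log(T) = intercept + sum(beta_i * x_i) + sigma * eps,
    which yields a Weibull survival function with scale
    lam = exp(intercept + x'beta) and shape k = 1 / sigma.
    """
    lam = math.exp(intercept + sum(b * x for b, x in zip(betas, covariates)))
    k = 1.0 / sigma
    return math.exp(-((t / lam) ** k))

# Hypothetical incident: injury involved (1), shoulder available (0),
# with illustrative (not estimated) coefficients.
x = [1.0, 0.0]
betas = [0.35, -0.20]        # a positive beta lengthens duration
s30 = weibull_aft_survival(30.0, x, betas, intercept=3.2, sigma=0.8)
s60 = weibull_aft_survival(60.0, x, betas, intercept=3.2, sigma=0.8)
```

Because the survival function is decreasing in t, the probability that an incident is still active at 60 minutes is necessarily below the 30-minute value; fitting a model of this form to real data means estimating the intercept, betas, and sigma from observed durations.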
Abstract:
Background Despite the remarkable activity of artemisinin and its derivatives, monotherapy with these agents has been associated with high rates of recrudescence. The temporary arrest of the growth of ring-stage parasites (dormancy) after exposure to artemisinin drugs provides a plausible explanation for this phenomenon. Methods Ring-stage parasites of several Plasmodium falciparum lines were exposed to different doses of dihydroartemisinin (DHA) alone or in combination with mefloquine. For each regime, the proportion of recovering parasites was determined daily for 20 days. Results Parasite development was abruptly arrested after a single exposure to DHA, with some parasites being dormant for up to 20 days. Approximately 50% of dormant parasites recovered to resume growth within the first 9 days. The overall proportion of parasites recovering was dose dependent, with recovery rates ranging from 0.044% to 1.313%. Repeated treatment with DHA or with DHA in combination with mefloquine led to a delay in recovery and an ∼10-fold reduction in total recovery. Strains with different genetic backgrounds appeared to vary in their capacity to recover. Conclusions These results imply that artemisinin-induced arrest of growth occurs readily in laboratory-treated parasites and may be a key factor in P. falciparum malaria treatment failure.
Abstract:
Braking is a crucial driving task with a direct relationship with crash risk, as both excess and inadequate braking can lead to collisions. The objective of this study was to compare the braking profile of young drivers distracted by mobile phone conversations to non-distracted braking. In particular, the braking behaviour of drivers in response to a pedestrian entering a zebra crossing was examined using the CARRS-Q Advanced Driving Simulator. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free, and handheld. In addition to driving the simulator, each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The drivers were 18–26 years old and split evenly by gender. A linear mixed model analysis of braking profiles along the roadway before the pedestrian crossing revealed comparatively increased decelerations among distracted drivers, particularly during the initial 20 kph of deceleration. Drivers’ initial 20 kph deceleration time was modelled using a parametric accelerated failure time (AFT) hazard-based duration model with a Weibull distribution with clustered heterogeneity to account for the repeated measures experiment design. Factors found to significantly influence the braking task included vehicle dynamics variables like initial speed and maximum deceleration, phone condition, and driver-specific variables such as licence type, crash involvement history, and self-reported experience of using a mobile phone whilst driving. Distracted drivers on average appear to reduce the speed of their vehicle faster and more abruptly than non-distracted drivers, exhibiting comparatively excess braking and perhaps revealing risk compensation. Braking appears to be more aggressive for distracted drivers with provisional licences than for drivers with open licences.
Abrupt or excessive braking by distracted drivers might pose significant safety concerns to following vehicles in a traffic stream.
Abstract:
This study was designed to identify the neural networks underlying automatic auditory deviance detection in 10 healthy subjects using functional magnetic resonance imaging. We measured blood oxygenation level-dependent contrasts derived from the comparison of blocks of stimuli presented as a series of standard tones (50 ms duration) alone versus blocks that contained rare duration-deviant tones (100 ms) that were interspersed among a series of frequent standard tones while subjects were watching a silent movie. Possible effects of scanner noise were assessed by a “no tone” condition. In line with previous positron emission tomography and EEG source modeling studies, we found temporal lobe and prefrontal cortical activation that was associated with auditory duration mismatch processing. Data were also analyzed employing an event-related hemodynamic response model, which confirmed activation in response to duration-deviant tones bilaterally in the superior temporal gyrus and prefrontally in the right inferior and middle frontal gyri. In line with previous electrophysiological reports, mismatch activation of these brain regions was significantly correlated with age. These findings suggest a close relationship of the event-related hemodynamic response pattern with the corresponding electrophysiological activity underlying the event-related “mismatch negativity” potential, a putative measure of auditory sensory memory.
Abstract:
It is commonly accepted that regular moderate intensity physical activity reduces the risk of developing many diseases. Counterintuitively, however, evidence also exists for oxidative stress resulting from acute and strenuous exercise. Enhanced formation of reactive oxygen and nitrogen species may lead to oxidatively modified lipids, proteins and nucleic acids, and possibly disease. Currently, only a few studies have investigated the influence of exercise on DNA stability and damage, and these have yielded conflicting results, used small study groups, and differed in sample matrices, methods and result units. This is the first review to address the effect of exercise of various intensities and durations on DNA stability, focusing on human population studies. Furthermore, this article describes the principles and limitations of commonly used methods for the assessment of oxidatively modified DNA and DNA stability. This review is structured according to the type of exercise conducted (field or laboratory based) and the intensity performed (i.e. competitive ultra/endurance exercise or maximal tests until exhaustion). The findings presented here suggest that competitive ultra-endurance exercise (>4 h) does not induce persistent DNA damage. However, when considering the effects of endurance exercise (<4 h), no clear conclusions could be drawn. Laboratory studies have shown equivocal results (increased or no oxidative stress) after endurance or exhaustive exercise. To clarify which components of exercise participation (i.e. duration, intensity and training status of subjects) have an impact on DNA stability and damage, additional carefully designed studies combining the measurement of DNA damage, gene expression and DNA repair mechanisms before, during and after exercise of differing intensities and durations are required.
Abstract:
Traffic congestion has been a growing issue in many metropolitan areas during recent years, which necessitates the identification of its key contributors and the development of sustainable strategies to help decrease its adverse impacts on traffic networks. Road incidents generally, and crashes specifically, have been acknowledged as the cause of a large proportion of travel delays in urban areas and account for 25% to 60% of traffic congestion on motorways. Identifying the critical determinants of travel delays has been of significant importance to incident management systems, which constantly collect and store incident duration data. This study investigates the individual and simultaneous differential effects of the relevant determinants on motorway crash duration probabilities. In particular, it applies parametric Accelerated Failure Time (AFT) hazard-based models to develop in-depth insights into how crash-specific characteristics and the associated temporal and infrastructural determinants impact the duration. AFT models with both fixed and random parameters have been calibrated on one year of traffic crash records from two major Australian motorways in South East Queensland, and the differential effects of determinants on crash survival functions have been studied on these two motorways individually. A comprehensive spectrum of commonly used parametric fixed parameter AFT models, including the generalized gamma and generalized F families, has been compared to random parameter AFT structures in terms of goodness of fit to the duration data, and as a result, the random parameter Weibull AFT model has been selected as the most appropriate model. Significant determinants of motorway crash duration included traffic diversion requirement, crash injury type, number and type of vehicles involved in a crash, day of week and time of day, towing support requirement and damage to the infrastructure.
A major finding of this research is that the two motorways under study differ significantly in crash duration: one motorway exhibits durations that are, on average, 19% shorter than those on the other. The differential effects of explanatory variables on crash durations also differ between the two motorways. The detailed analysis presented confirms that treating the motorway network as a whole, neglecting individual differences between roads, can lead to erroneous interpretations of duration and inefficient strategies for mitigating travel delays along a particular motorway.
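In an AFT specification, a coefficient β on an indicator variable scales durations multiplicatively by exp(β), so an average difference such as 19% shorter durations corresponds to a time ratio of roughly 0.81. A small sketch of that conversion (the coefficient value here is chosen purely for illustration, not taken from the fitted model):

```python
import math

# In an AFT model the effect of a binary indicator (e.g. "motorway B")
# on expected duration is multiplicative: time_ratio = exp(beta).
beta = math.log(0.81)                    # hypothetical coefficient implying ~19% shorter
time_ratio = math.exp(beta)              # multiplicative effect on duration (0.81)
pct_change = (time_ratio - 1.0) * 100.0  # percent change in duration (about -19%)
```

This multiplicative reading is what distinguishes AFT models from proportional hazards models, where coefficients scale the hazard rather than time itself.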
Abstract:
Background Longer breastfeeding duration appears to have a protective effect against childhood obesity. This effect may be partially mediated by maternal feeding practices during the first years of life. However, the few studies that have examined links between breastfeeding duration and subsequent feeding practices have yielded conflicting results. Objective Using a large sample of first-time mothers and a newly validated, comprehensive measure of maternal feeding (the Feeding Practices and Structure Questionnaire), this study examined associations between breastfeeding duration and maternal feeding practices at child age 24 months. Methods Mothers (n = 458) enrolled in the NOURISH trial provided data on breastfeeding at child ages 4, 14 and 24 months, and on feeding practices at 24 months. Structural Equation Modelling was used to examine associations between breastfeeding duration and five non-responsive and four structure-related ‘authoritative’ feeding practices, adjusting for a range of maternal and child characteristics. Results The model showed acceptable fit (χ2/df = 1.68; RMSEA = .04; CFI = .91; TLI = .89), and longer breastfeeding duration was negatively associated with four of five non-responsive feeding practices and positively associated with three of four structure-related feeding practices. Overall, these results suggest that mothers who breastfed longer reported using more appropriate feeding practices. Conclusion These data demonstrate an association between longer breastfeeding duration and authoritative feeding practices characterised by responsiveness and structure, which may partly account for the apparent protective effect of breastfeeding on childhood obesity.
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47′S, 150°53′E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne dry matter (DM) yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne, but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and the soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, in drier seasons recharge of the dried soil profile following long-duration lucerne occurred only after 3 years. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
Sodium cyanide poison is potentially a more humane method to control wild dogs than sodium fluoroacetate (1080) poison. This study quantified the clinical signs and duration of cyanide toxicosis delivered by the M-44 ejector. The device delivered a nominal 0.88 g of sodium cyanide, which caused the animal to lose the menace reflex in a mean of 43 s, and the animal was assumed to have undergone cerebral hypoxia after the last visible breath. The mean time to cerebral hypoxia was 156 s for a vertical pull and 434 s for a side pull. The difference was possibly because some cyanide may be lost in a side pull. There were three distinct phases of cyanide toxicosis: the initial phase was characterised by head shaking, panting and salivation; the immobilisation phase by incontinence, ataxia and loss of the righting reflex; and the cerebral hypoxia phase by a tetanic seizure. Clinical signs that were exhibited in more than one phase of cyanide toxicosis included retching, agonal breathing, vocalisation, vomiting, altered levels of ocular reflex, leg paddling, tonic muscular spasms, respiratory distress and muscle fasciculations of the muzzle.
Abstract:
1. Weed eradication efforts often must be sustained for long periods owing to the existence of persistent seed banks, among other factors. Decision makers need to consider both the amount of investment required and the period over which investment must be maintained when determining whether to commit to (or continue) an eradication programme. However, a basis for estimating eradication programme duration based on simple data has been lacking. Here, we present a stochastic dynamic model that can provide such estimates. 2. The model is based upon the rates of progression of infestations from the active to the monitoring state (i.e. no plants detected for at least 12 months), rates of reversion of infestations from monitoring to the active state and the frequency distribution of time since last detection for all infestations. Isoquants that illustrate the combinations of progression and reversion parameters corresponding to eradication within different time frames are generated. 3. The model is applied to ongoing eradication programmes targeting branched broomrape Orobanche ramosa and chromolaena Chromolaena odorata. The minimum periods in which eradication could potentially be achieved were 22 and 23 years, respectively. On the basis of programme performance until 2008, however, eradication is predicted to take considerably longer for both species (on average, 62 and 248 years, respectively). Performance of the branched broomrape programme could be best improved through reducing rates of reversion to the active state; for chromolaena, boosting rates of progression to the monitoring state is more important. 4. Synthesis and applications. Our model for estimating weed eradication programme duration, which captures critical transitions between a limited number of states, is readily applicable to any weed. A particular strength of the method lies in its minimal data requirements.
These comprise estimates of maximum seed persistence and infested area, plus consistent annual records of the detection (or otherwise) of the weed in each infestation. This work provides a framework for identifying where improvements in management are needed and a basis for testing the effectiveness of alternative tactics. If adopted, our approach should help improve decision making with regard to eradication as a management strategy.
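A state-transition model of this kind lends itself to a simple Monte Carlo sketch. The transition probabilities, number of infestations and eradication criterion below are illustrative assumptions, not the published parameter estimates; the sketch only shows how progression and reversion rates jointly determine programme duration.

```python
import random

def simulate_programme(n_inf, p_prog, p_rev, clear_years, max_years=1000, seed=1):
    """Years until eradication under a two-state (active/monitoring) model.

    Each year an active infestation moves to monitoring with probability
    p_prog; a monitoring infestation reverts to active with probability
    p_rev. Eradication is declared when every infestation has stayed in
    monitoring for more than clear_years consecutive years (a proxy for
    exceeding maximum seed persistence).
    """
    rng = random.Random(seed)
    years_monitored = [0] * n_inf        # 0 means the infestation is active
    for year in range(1, max_years + 1):
        for i in range(n_inf):
            if years_monitored[i] == 0:                 # active state
                if rng.random() < p_prog:
                    years_monitored[i] = 1              # progression
            else:                                       # monitoring state
                if rng.random() < p_rev:
                    years_monitored[i] = 0              # reversion to active
                else:
                    years_monitored[i] += 1
        if all(m > clear_years for m in years_monitored):
            return year
    return max_years                                    # not eradicated in time

# Illustrative run: 20 infestations, strong progression, rare reversion.
duration = simulate_programme(n_inf=20, p_prog=0.4, p_rev=0.01, clear_years=10)
```

Raising p_rev or lowering p_prog lengthens the simulated programme sharply, which mirrors the abstract's point that reducing reversion (broomrape) or boosting progression (chromolaena) is the most effective lever depending on the programme.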
Abstract:
Two prerequisites for realistically embarking upon an eradication programme are that cost-benefit analysis favours this strategy over other management options and that sufficient resources are available to carry the programme through to completion. These are not independent criteria, but it is our view that too little attention has been paid to estimating the investment required to complete weed eradication programmes. We deal with this problem by using a two-pronged approach: 1) developing a stochastic dynamic model that provides an estimation of programme duration; and 2) estimating the inputs required to delimit a weed incursion and to prevent weed reproduction over a sufficiently long period to allow extirpation of all infestations. The model is built upon relationships that capture the time-related detection of new infested areas, rates of progression of infestations from the active to the monitoring stage, rates of reversion of infestations from the monitoring to active stage, and the frequency distribution of time since last detection for all infestations. This approach is applied to the branched broomrape (Orobanche ramosa) eradication programme currently underway in South Australia. This programme commenced in 1999 and currently 7450 ha are known to be infested with the weed. To date none of the infestations have been eradicated. Given recent (2008) levels of investment and current eradication methods, model predictions are that it would take, on average, an additional 73 years to eradicate this weed at an average additional cost (NPV) of $AU67.9m. When the model was run for circumstances in 2003 and 2006, the average programme duration and total cost (NPV) were predicted to be 159 and 94 years, and $AU91.3m and $AU72.3m, respectively. The reduction in estimated programme length and cost may represent progress towards the eradication objective, although eradication of this species still remains a long term prospect.
Abstract:
Dry seeding of aman rice can facilitate timely crop establishment and early harvest and thus help to alleviate the monga (hunger) period in the High Ganges Flood Plain of Bangladesh. Dry seeding also offers many other potential benefits, including reduced cost of crop establishment and improved soil structure for crops grown in rotation with rice. However, the optimum time for seeding in areas where farmers have access to water for supplementary irrigation has not been determined. We hypothesized that earlier sowing is safer, and that increasing seed rate mitigates the adverse effects of significant rain after sowing on establishment and crop performance. To test these hypotheses, we analyzed long term rainfall data, and conducted field experiments on the effects of sowing date (target dates of 25 May, 10 June, 25 June, and 10 July) and seed rate (20, 40, and 60 kg ha−1) on crop establishment, growth, and yield of dry seeded Binadhan-7 (short duration, 110–120 d) during the 2012 and 2013 rainy seasons. Wet soil as a result of untimely rainfall usually prevented sowing on the last two target dates in both years, but not on the first two dates. Rainfall analysis also suggested a high probability of being able to dry seed in late May/early June, and a low probability of being able to dry seed in late June/early July. Delaying sowing from 25 May/10 June to late June/early July usually resulted in 20–25% lower plant density and lower uniformity of the plant stand as a result of rain shortly after sowing. Delaying sowing also reduced crop duration, and tillering or biomass production when using a low seed rate. For the late June/early July sowings, there was a strong positive relationship between plant density and yield, but this was not the case for earlier sowings. Thus, increasing seed rate compensated for the adverse effect of untimely rains after sowing on plant density and the shorter growth duration of the late sown crops. 
The results indicate that in this region, the optimum date for sowing dry seeded rice is late May to early June with a seed rate of 40 kg ha−1. Planting can be delayed to late June/early July with no yield loss using a seed rate of 60 kg ha−1, but in many years, the soil is simply too wet to be able to dry seed at this time due to rainfall.
Abstract:
Cultural practices alter patterns of crop growth and can modify the dynamics of weed-crop competition, and hence need to be investigated to develop sustainable weed management in dry-seeded rice (DSR). Studies on weed dynamics in DSR sown at different times under two tillage systems were conducted at the Agronomic Research Farm, University of Agriculture, Faisalabad, Pakistan. A commonly grown fine rice cultivar 'Super Basmati' was sown on 15th June and 7th July of 2010 and 2011 under zero-till (ZT) and conventional tillage (CONT) and was subjected to different durations of weed competition [10, 20, 30, 40, and 50 days after sowing (DAS), and season-long competition]. Weed-free plots were maintained under each tillage system and sowing time for comparison. Grassy weeds were more abundant under ZT, while CONT had a higher relative proportion of broad-leaved weeds in terms of density and biomass. Density of sedges was 175% higher in the crop sown on the 7th July than on the 15th June. Delaying the sowing of DSR from mid June to the first week of July reduced weed density by 69 and 43%, but weed biomass remained unaffected. Tillage systems had no effect on total weed biomass. Plots subjected to season-long weed competition contained mostly grasses, while broad-leaved weeds were not observed at harvest. In the second year of the study, dominance of grassy weeds increased under both tillage systems and sowing times. Significantly less biomass (48%) of grassy weeds was observed under CONT than ZT in 2010; however, during 2011 this effect was non-significant. Trianthema portulacastrum and Dactyloctenium aegyptium were the dominant broad-leaved and grassy weeds, respectively. Cyperus rotundus was the dominant sedge, especially in the crop sown on the 7th July. Relative yield loss (RYL) ranged from 3 to 13% and 7 to 16% when weeds were allowed to compete only for 20 DAS. Under season-long weed competition, RYL ranged from 68 to 77% in 2010 and 74 to 80% in 2011.
The 15th June sowing was effective in minimizing weed proliferation and avoiding the yield penalty associated with the 7th July sowing. The results suggest that DSR in Pakistan should preferably be sown on 15th June under CONT systems, and weeds must be controlled before 20 DAS to avoid yield losses. Successful adoption of DSR in growers' fields in Pakistan will depend on whether growers can control weeds and prevent shifts in the weed population towards more intractable, difficult-to-control weeds as a consequence of DSR adoption.
Abstract:
The detailed molecular mechanisms underlying the regulation of sleep duration in mammals are still elusive. To address this challenge, we constructed a simple computational model, which recapitulates the electrophysiological characteristics of the slow-wave sleep and awake states. Comprehensive bifurcation analysis predicted that a Ca2+-dependent hyperpolarization pathway may play a role in slow-wave sleep and hence in the regulation of sleep duration. To experimentally validate the prediction, we generated and analyzed 21 knockout (KO) mice. We found that loss of Ca2+-dependent K+ channels (Kcnn2 and Kcnn3), voltage-gated Ca2+ channels (Cacna1g and Cacna1h), or Ca2+/calmodulin-dependent kinases (Camk2a and Camk2b) decreases sleep duration, while loss of the plasma membrane Ca2+ ATPase (Atp2b3) increases sleep duration. Pharmacological intervention and whole-brain imaging validated that impaired NMDA receptors reduce sleep duration and directly increase the excitability of cells. Based on these results, we propose the hypothesis that a Ca2+-dependent hyperpolarization pathway underlies the regulation of sleep duration in mammals.