Abstract:
The challenges posed by global climate change are motivating the investigation of strategies that can reduce the life cycle greenhouse gas (GHG) emissions of products and processes. While new construction materials and technologies have received significant attention, there has been limited emphasis on understanding how construction processes can best be managed to reduce GHG emissions. Unexpected disruptive events tend to adversely impact construction costs and delay project completion. They also tend to increase project GHG emissions. The objective of this paper is to investigate ways in which project GHG emissions can be reduced by appropriate management of disruptive events. First, an empirical analysis of construction data from a specific highway construction project is used to illustrate the impact of unexpected schedule delays in increasing project GHG emissions. Next, a simulation-based methodology is described to assess the effectiveness of alternative project management strategies in reducing GHG emissions. The contribution of this paper is that it explicitly considers project emissions, in addition to cost and project duration, in developing project management strategies. Practical application of the method discussed in this paper will help construction firms reduce their project emissions through strategic project management, without significant investment in new technology. In effect, this paper lays the foundation for best practices in construction management that will optimize project cost and duration while minimizing GHG emissions.
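The simulation methodology itself is not given in this abstract. As a purely illustrative Monte Carlo sketch of the idea, random disruptive events extend a project's schedule, delay days translate into extra equipment emissions, and a management strategy is scored by how much of each delay it absorbs. All parameter values and names below are invented assumptions, not figures from the paper:

```python
import random

# Illustrative assumption: emissions accrue per day of equipment operation,
# and disruptive events add delay days unless a mitigation strategy absorbs them.
BASELINE_DURATION_DAYS = 120
EMISSIONS_PER_DAY_KG = 450.0   # hypothetical fleet emission rate (kg CO2e/day)
DISRUPTION_PROB = 0.05         # chance of a disruptive event on any given day
DELAY_PER_EVENT_DAYS = 3

def simulate_project(mitigation_factor: float, trials: int = 10_000) -> float:
    """Return mean project GHG emissions (kg CO2e) over Monte Carlo trials.

    mitigation_factor: fraction of each disruption's delay that the
    management strategy absorbs (0 = no mitigation, 1 = full absorption).
    """
    total = 0.0
    for _ in range(trials):
        delay = sum(
            DELAY_PER_EVENT_DAYS * (1.0 - mitigation_factor)
            for _ in range(BASELINE_DURATION_DAYS)
            if random.random() < DISRUPTION_PROB
        )
        total += (BASELINE_DURATION_DAYS + delay) * EMISSIONS_PER_DAY_KG
    return total / trials

if __name__ == "__main__":
    for label, factor in [("do nothing", 0.0), ("re-sequence work", 0.6)]:
        print(f"{label}: {simulate_project(factor):,.0f} kg CO2e on average")
```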
Abstract:
Two experiments were conducted to evaluate the effects of body condition scores of beef calves on performance efficiency and carcass characteristics. In Experiment 1, 111 steer calves were stratified by breed and condition score (CS) and randomly allotted to 14 pens. The study was analyzed as a 2 x 3 factorial design, with two breeds (Angus and Simmental) and three initial CS (4.4, 5.1, and 5.6). In Experiment 2, 76 steer calves were allotted to six pens by CS. The resultant pens averaged 3.9, 4.5, 4.7, 5.0, 5.1, and 5.6 in CS. Calves in both studies were fed a corn-based finishing diet formulated to 13.5% crude protein. All calves were implanted with Synovex-S® initially and reimplanted with Revalor-S®. In Experiment 1, 29-day dry matter intake (lb/day) increased with CS (17.9, 18.1, and 19.1 for 4.4, 5.1, and 5.6, respectively; p < .04). Daily gain (29 days) tended to decrease with increasing CS (4.19, 3.71, and 3.26; p < .13). Days on feed decreased with increasing CS (185, 180, and 178 days; p < .07). In Experiment 2, daily gains also increased with decreasing initial CS for the first 114 days (p < .05) and tended to increase overall (p < .20). In Experiment 1, calves with lower initial CS had less external fat at slaughter (.48, .53, and .61 in. for CS 4.4, 5.1, and 5.6, respectively; p < .05). This effect was also noted by real-time ultrasound at slaughter (p < .10), as well as at 57 days (p < .06) and 148 days (p < .06). Measurements of intramuscular fat and marbling were not different in either study. These data suggest that CS of feeder calves may be a useful tool for adjusting energy requirements of calves based on body condition. Also, feeder cattle may be sorted into outcome or management groups earlier than currently practiced using body condition and/or real-time ultrasound.
Abstract:
A year-round grazing system for spring- and fall-calving cows was developed to compare animal production and performance, hay production and feeding, winter forage composition changes, and summer pasture yield and nutrient composition with those of a conventional, minimal land system. Both systems used smooth bromegrass-orchardgrass-birdsfoot trefoil (SB-O-T) pastures in the summer; during winter, the year-round system grazed corn crop residues and stockpiled grass-legume pastures, whereas the minimal land system fed hay in a drylot. The year-round grazing system utilized 1.67 acres of SB-O-T pasture per cow in the summer, compared with 3.33 acres of SB-O-T pasture per cow in the control (minimal land) system. In addition to SB-O-T pastures, the year-round grazing system utilized 2.5 acres of tall fescue-red clover (TF-RC) and 2.5 acres of smooth bromegrass-red clover (SB-RC) per cow for grazing in both mid-summer and winter for fall- and spring-calving cows, respectively. First-cutting hay was harvested from the TF-RC and SB-RC pastures, and regrowth was grazed for approximately 45 days in the summer. These pastures were then fertilized with 40 lbs N/acre and stockpiled for winter grazing. Also utilized during the winter for spring-calving cows in the year-round grazing system were corn crop residue (CCR) pastures at an allowance of 2.5 acres per cow. In the minimal land system, hay was harvested from three-fourths of the area in SB-O-T pastures and stored for feeding in a drylot through the winter. Summer grazing was managed with rotational stocking for both systems, and winter grazing of stockpiled forages and corn crop residues by year-round system cows was managed by strip-stocking. Hay was fed to maintain a body condition score of 5 on a 9-point scale for spring-calving cows in both systems. Hay was supplemented as needed to maintain a body condition score of 3 for fall-calving cows nursing calves through the winter. Although initial condition scores for cows in both systems differed at the initiation of grazing for both winter and summer, there were no significant differences (P > .05) in overall condition score changes throughout both grazing seasons. In year 1, fall-calving cows in the year-round grazing system lost more (P < .05) body weight during winter than spring-calving cows in either system. In year 2, there were no differences in weight changes over winter for any group of cows. Average daily gains of fall calves in the year-round system were 1.9 lbs/day, compared with 2.5 lbs/day for spring calves from both systems. Yearly growing animal production from pastures for both years did not differ between systems when weight gains of stockers that grazed summer pastures in the year-round grazing system were added to weight gains of suckling calves. Carcass characteristics for all calves finished in the feedlot were similar between systems. There were no significant differences in hay production between systems for year 1; however, the amounts of hay needed to maintain cows were 923, 1373, and 4732 lbs dry matter/cow for year-round fall-calving, year-round spring-calving, and minimal land spring-calving cows, respectively.
In year 2, hay production per acre in the minimal land system was greater (P < .05) than in the year-round system, but the amounts of hay required per cow were 0, 0, and 4720 lbs dry matter/cow for year-round fall-calving, year-round spring-calving, and minimal land spring-calving cows, respectively.
Evaluation of control and surveillance strategies for classical swine fever using a simulation model
Abstract:
Classical swine fever (CSF) outbreaks can cause enormous losses in naïve pig populations. How best to minimize the economic damage and the number of culled animals caused by CSF is therefore an important research area. The baseline CSF control strategy in the European Union and Switzerland consists of culling all animals in infected herds, movement restrictions for animals, materials, and people within a given distance of the infected herd, and epidemiological tracing of transmission contacts. Additional disease control measures such as pre-emptive culling or vaccination have been recommended based on the results of several simulation models; however, these models were parameterized for areas with high animal densities. The objective of this study was to explore whether pre-emptive culling and emergency vaccination should also be recommended in low- to moderate-density areas such as Switzerland. Additionally, we studied the influence of initial outbreak conditions on outbreak severity to improve the efficiency of disease prevention and surveillance. A spatial, stochastic, individual-animal-based simulation model using all registered Swiss pig premises in 2009 (n=9770) was implemented to quantify these relationships. The model simulates within-herd and between-herd transmission (direct and indirect contacts and local area spread). By varying four parameters: (a) control measures, (b) index herd type (breeding, fattening, weaning, or mixed herd), (c) detection delay for secondary cases during an outbreak, and (d) contact tracing probability, 112 distinct scenarios were simulated. To assess the impact of scenarios on outbreak severity, daily transmission rates were compared between scenarios. Compared with the baseline strategy (stamping out and movement restrictions), vaccination and pre-emptive culling reduced neither outbreak size nor duration. Outbreaks starting in a herd with weaning piglets or fattening pigs caused higher losses in terms of the number of culled premises and lasted longer than those starting in the two other index herd types. Similarly, larger transmission rates were estimated for outbreaks starting in these index herd types. A longer detection delay resulted in more culled premises and longer outbreaks, whereas better contact tracing increased the number of short outbreaks. Based on the simulation results, baseline control strategies appear sufficient to control CSF in areas of low to medium animal density. Early detection of outbreaks is crucial, and risk-based surveillance should be focused on weaning piglet and fattening pig premises.
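The Swiss model is far richer than this abstract can show. The toy sketch below, under invented parameters, only illustrates the mechanics that the abstract's findings about detection delay rest on: a premises-level stochastic transmission loop in which infected premises are stamped out once their detection delay elapses:

```python
import random

def simulate_outbreak(n_premises=9770, beta=5e-5, detection_delay=7,
                      max_days=365, seed=1):
    """Toy premises-level stochastic outbreak with stamping-out.

    beta is the daily probability that one infectious premises infects one
    susceptible premises (an illustrative value, not from the study).
    Returns (total_infected_premises, outbreak_duration_days).
    """
    rng = random.Random(seed)
    infection_days = [0]   # day on which each infected premises was infected
    culled = 0             # prefix of infection_days already stamped out
    for day in range(1, max_days + 1):
        # stamping out: cull premises whose detection delay has elapsed
        while (culled < len(infection_days)
               and day - infection_days[culled] >= detection_delay):
            culled += 1
        infectious = len(infection_days) - culled
        if infectious == 0:
            return len(infection_days), day
        susceptible = n_premises - len(infection_days)
        # each susceptible premises escapes every infectious contact, or not
        p_inf = 1.0 - (1.0 - beta) ** infectious
        new = sum(1 for _ in range(susceptible) if rng.random() < p_inf)
        infection_days.extend([day] * new)
    return len(infection_days), max_days

# Longer detection delays should yield larger, longer outbreaks on average.
for delay in (5, 10, 15):
    print(delay, simulate_outbreak(detection_delay=delay))
```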
Abstract:
Foot-and-mouth disease (FMD) is highly contagious and one of the most economically devastating diseases of cloven-hoofed animals. Science-based preparedness regarding how best to control the disease in a previously FMD-free country is therefore essential for veterinary services. The present study used a spatial, stochastic epidemic simulation model to compare the effectiveness of emergency vaccination with conventional (non-vaccination) control measures in Switzerland, a low-livestock-density country. Model results revealed that emergency vaccination with a radius of 3 km or 10 km around infected premises (IPs) did not significantly reduce either the cumulative herd incidence or the epidemic duration when started in a small epidemic situation where the number of IPs was still low. However, in a situation where the epidemic had become extensive, both the cumulative herd incidence and the epidemic duration were reduced significantly when vaccination was implemented with a radius of 10 km around IPs. The effect of different levels of conventional strategy measures was also explored for the non-vaccination strategy. Lower farmer compliance with movement restrictions and delayed culling of IPs significantly increased both the cumulative IP incidence and the epidemic duration. In countries with a livestock structure similar to Switzerland's, contingency management should therefore focus mainly on improving conventional strategies by increasing disease awareness, communication with stakeholders, and the preparedness of culling teams. However, emergency vaccination should be considered if there are reasons to believe that the epidemic may become extensive, such as when disease detection has been delayed and many IPs are discovered at the beginning of the epidemic.
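As a small illustration of the spatial operation behind the 3 km versus 10 km comparison, the sketch below selects the premises falling inside a vaccination ring around infected premises; the coordinates and herd names are hypothetical, and a real model would work with projected geographic coordinates:

```python
import math

def premises_within_radius(infected, all_premises, radius_km):
    """Return IDs of premises within radius_km of any infected premises.

    infected and all_premises map premises ID -> (x_km, y_km) on a
    planar grid (a simplification of real geographic coordinates).
    """
    ring = set()
    for pid, (px, py) in all_premises.items():
        if pid in infected:
            continue
        for ix, iy in infected.values():
            if math.hypot(px - ix, py - iy) <= radius_km:
                ring.add(pid)
                break
    return ring

# Hypothetical example: one infected premises "A" at the origin.
herds = {"A": (0.0, 0.0), "B": (2.0, 1.5), "C": (8.0, 3.0), "D": (12.0, 0.5)}
ips = {"A": herds["A"]}
print(premises_within_radius(ips, herds, 3))    # {'B'}
print(premises_within_radius(ips, herds, 10))   # {'B', 'C'}
```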
Abstract:
The neutral bis((pivaloyloxy)methyl) (PIV₂) derivatives of FdUMP, ddUMP, and AZTMP were synthesized as potential membrane-permeable prodrugs of FdUMP, ddUMP, and AZTMP. These compounds were designed to enter cells by passive diffusion and revert to the parent nucleotides after removal of the PIV groups by hydrolytic enzymes. These prodrugs were prepared by condensation of FUdR, ddU, and AZT with PIV₂ phosphate in the presence of triphenylphosphine and diethyl azodicarboxylate (Mitsunobu conditions). PIV₂-FdUMP, PIV₂-ddUMP, and PIV₂-AZTMP were stable in the pH range 1.0-4.0 (t₁/₂ > 100 h). They were also fairly stable at pH 7.4 (t₁/₂ > 40 h). In 0.05 M NaOH solution, however, they were rapidly degraded (t₁/₂ < 2 min). In the presence of hog liver carboxylate esterase, they were converted quantitatively to the corresponding phosphodiesters, PIV₁-FdUMP, PIV₁-ddUMP, and PIV₁-AZTMP; after 24 h of incubation, only trace amounts of FdUMP, ddUMP, and AZTMP (1-5%) were observed, indicating that the PIV₁ compounds were poor substrates for the enzyme. In human plasma, the PIV₂ compounds were rapidly degraded, with half-lives of less than 5 min. The rate of degradation of the PIV₂ compounds in the presence of phosphodiesterase I was the same as that in buffer controls, indicating that they were not substrates for this enzyme. In the presence of phosphodiesterase I, PIV₁-FdUMP, PIV₁-ddUMP, and PIV₁-AZTMP were converted quantitatively to FdUMP, ddUMP, and AZTMP. PIV₂-ddUMP and PIV₂-AZTMP were effective at controlling HIV type 1 infection in MT-4 and CEM tk⁻ cells in culture. Mechanistic studies demonstrated that PIV₂-ddUMP and PIV₂-AZTMP were taken up by the cells and converted to ddUTP and AZTTP, both potent inhibitors of HIV reverse transcriptase. However, a potential shortcoming of PIV₂-ddUMP and PIV₂-AZTMP as clinical therapeutic agents is that they are rapidly degraded (t₁/₂ ≈ 4 min) in human plasma by carboxylate esterases. To circumvent this limitation, chemically labile nucleotide prodrugs and liposome-encapsulated nucleotide prodrugs were investigated. In the former approach, the protective groups bis(N,N-(dimethyl)carbamoyloxymethyl) (DM₂) and bis(N-(piperidino)carbamoyloxymethyl) (DP₂) were used to synthesize DM₂-ddUMP and DP₂-ddUMP, respectively. In aqueous buffers (pH range 1.0-9.0) these compounds were degraded with half-lives of 3 to 4 h. They had similar half-lives in human plasma, demonstrating that they were resistant to esterase-mediated cleavage. However, neither compound gave rise to significant concentrations of ddUMP in CEM or CEM tk⁻ cells. In the liposome-encapsulated nucleotide prodrug approach, three different liposomal formulations of PIV₂-ddUMP (L-PIV₂-ddUMP) were investigated. The half-lives of these L-PIV₂-ddUMP preparations in human plasma were 2 h, compared with 4 min for the free drug. The preparations were more effective at controlling HIV-1 infection in human T cells in culture than free PIV₂-ddUMP. Collectively, these data indicate that PIV₂-FdUMP, PIV₂-ddUMP, and PIV₂-AZTMP are effective membrane-permeable prodrugs of FdUMP, ddUMP, and AZTMP.
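All of the stability results above are reported as half-lives; assuming simple first-order decay, these convert directly to rate constants and residual drug fractions. A minimal helper (the function names are ours, not the study's):

```python
import math

def rate_constant(half_life_h: float) -> float:
    """First-order rate constant k (1/h) from a half-life: k = ln 2 / t_1/2."""
    return math.log(2) / half_life_h

def fraction_remaining(half_life_h: float, t_h: float) -> float:
    """Fraction of prodrug left after t hours: exp(-k*t) = 2**(-t/t_1/2)."""
    return 0.5 ** (t_h / half_life_h)

# Half-lives quoted in the abstract: ~2 h for liposomal PIV2-ddUMP in plasma
# versus ~4 min (0.067 h) for the free prodrug.
for label, t_half in [("liposomal, t1/2 = 2 h", 2.0),
                      ("free, t1/2 = 4 min", 4 / 60)]:
    print(f"{label}: {fraction_remaining(t_half, 1.0):.1%} left after 1 h")
```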
Abstract:
The Sensor Node Overlay Multicast (SNOMC) protocol supports reliable, time-efficient, and energy-efficient dissemination of data from one sender node to multiple receivers, as needed for configuration, code update, and management operations in wireless sensor networks. SNOMC provides end-to-end reliability using negative acknowledgements. The mechanism is simple, easy to implement, and can significantly reduce the number of transmissions. SNOMC supports three different caching strategies: caching on each intermediate node, caching on branching nodes, or caching on the sender node only. SNOMC was evaluated in our in-house real-world testbed and compared with a number of common data dissemination protocols. It outperformed the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption.
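The abstract does not specify SNOMC's packet formats, so the sketch below is only a schematic of the two ideas it names: negative-acknowledgement-driven retransmission and caching on an intermediate node that can answer those NACKs locally. All class and field names are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Receiver:
    expected: int                          # total fragments in the data item
    got: set = field(default_factory=set)

    def accept(self, seq: int) -> None:
        self.got.add(seq)

    def nack(self) -> list:
        """Sequence numbers still missing; empty list means transfer done."""
        return [s for s in range(self.expected) if s not in self.got]

@dataclass
class CachingNode:
    cache: dict = field(default_factory=dict)  # seq -> fragment payload

    def forward(self, seq: int, payload: bytes) -> bytes:
        self.cache[seq] = payload              # cache-on-intermediate strategy
        return payload

    def answer_nack(self, missing: list) -> dict:
        """Serve retransmissions locally instead of contacting the sender."""
        return {s: self.cache[s] for s in missing if s in self.cache}

# Hypothetical run: fragment 1 is lost between the node and the receiver.
node, rx = CachingNode(), Receiver(expected=3)
for seq, data in enumerate([b"cfg-a", b"cfg-b", b"cfg-c"]):
    payload = node.forward(seq, data)
    if seq != 1:                               # simulate a radio loss
        rx.accept(seq)
print(rx.nack())                               # [1]
for seq, data in node.answer_nack(rx.nack()).items():
    rx.accept(seq)
print(rx.nack())                               # [] -> reliable delivery done
```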
Abstract:
Salinization is a soil threat that adversely affects ecosystem services and diminishes soil functions in many arid and semi-arid regions. Soil salinity management depends on a range of factors and can be complex, expensive, and time-demanding. Besides taking no action, possible management strategies include amelioration and adaptation measures. The WOCAT Technologies Questionnaire is a standardized methodology for monitoring, evaluating, and documenting sustainable land management practices through interaction with the stakeholders. Here we use WOCAT for the systematic analysis and evaluation of soil salinization amelioration measures for the RECARE project case study in Greece, the Timpaki basin, a semi-arid region in south-central Crete where the main land use is horticulture in greenhouses irrigated by groundwater. Excessive groundwater abstractions have resulted in a drop of the groundwater level in the coastal part of the aquifer, leading to seawater intrusion and in turn to soil salinization due to irrigation with brackish water. Amelioration technologies that have already been applied in the case study by the stakeholders are examined and classified according to the function they promote and/or improve. The documented technologies are evaluated for their impacts on ecosystem services, cost, and input requirements. Preliminary results show that stakeholders prefer technologies which maintain existing crop types while enhancing productivity and decreasing soil salinity, such as composting, mulching, rainwater harvesting, and seed biopriming. Further work will include result validation using qualitative approaches.
Abstract:
Soil salinity management can be complex, expensive, and time-demanding, especially in arid and semi-arid regions. Besides taking no action, possible management strategies include amelioration and adaptation measures. Here we apply the World Overview of Conservation Approaches and Technologies (WOCAT) framework for the systematic analysis, evaluation, and selection of soil salinisation amelioration technologies in close collaboration with stakeholders. The participatory approach is applied in the RECARE (Preventing and Remediating degradation of soils in Europe through Land Care) project case study of Timpaki, a semi-arid region in south-central Crete (Greece) where the main land use is horticulture in greenhouses irrigated by groundwater. Excessive groundwater abstractions have resulted in a drop of the groundwater level in the coastal part of the aquifer, leading to seawater intrusion and in turn to soil salinisation. The documented technologies are evaluated for their impacts on ecosystem services, cost, and input requirements using a participatory approach and field evaluations. Results show that stakeholders prefer technologies which maintain existing crop types while enhancing productivity and decreasing soil salinity. The evaluation concludes that rainwater harvesting is the optimal solution for direct soil salinity mitigation, as it addresses a wider range of ecosystem and human well-being benefits. Nevertheless, this merit is offset by poor financial motivation, making agronomic measures more attractive to users.
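The arithmetic of such a technology comparison can be pictured as a weighted sum of per-criterion impact scores, as in the sketch below; the scores, weights, and criteria are placeholders, not the project's actual WOCAT ratings:

```python
# Hypothetical impact scores (-3 = strongly negative .. +3 = strongly positive)
# per evaluation criterion for each documented technology.
technologies = {
    "rainwater harvesting": {"salinity": 3, "water supply": 2, "cost": -2},
    "composting":           {"salinity": 1, "water supply": 0, "cost": -1},
    "mulching":             {"salinity": 1, "water supply": 1, "cost": -1},
}
weights = {"salinity": 0.5, "water supply": 0.3, "cost": 0.2}

def weighted_score(impacts: dict) -> float:
    """Weighted-sum score across criteria (assumed aggregation rule)."""
    return sum(weights[c] * v for c, v in impacts.items())

for name, impacts in sorted(technologies.items(),
                            key=lambda kv: weighted_score(kv[1]),
                            reverse=True):
    print(f"{name}: {weighted_score(impacts):+.2f}")
```

With these invented numbers, rainwater harvesting ranks first despite its negative cost score, mirroring the trade-off the abstract describes between overall benefit and financial motivation.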
Abstract:
OBJECTIVE To determine the success of medical management of presumptive cervical disk herniation in dogs and the variables associated with treatment outcome. DESIGN Retrospective case series. ANIMALS Dogs (n=88) with presumptive cervical disk herniation. METHODS Dogs with presumptive cervical and thoracolumbar disk herniation were identified from medical records at 2 clinics, and clients were mailed a questionnaire related to the success of therapy, clinical recurrence of signs, and quality of life (QOL) as interpreted by the owner. Signalment, duration and degree of neurologic dysfunction, and medication administration were determined from medical records. RESULTS Ninety-seven percent of dogs (84/87) with complete information were described as ambulatory at initial evaluation. Successful treatment was reported for 48.9% of dogs, with 33% having recurrence of clinical signs and 18.1% having therapeutic failure. Bivariable logistic regression showed that non-steroidal anti-inflammatory drug (NSAID) administration was associated with success (P=.035; odds ratio [OR]=2.52). Duration of cage rest and glucocorticoid administration were not significantly associated with success or QOL. Dogs with less severe neurologic dysfunction were more likely to have a successful outcome (OR=2.56), but this association was not significant (P=.051). CONCLUSIONS Medical management can lead to an acceptable outcome in many dogs with presumptive cervical disk herniation. Based on these data, NSAIDs should be considered as part of the therapeutic regimen. Cage rest duration and glucocorticoid administration do not appear to benefit these dogs, but this finding should be interpreted cautiously because of the retrospective data collection and the use of client self-administered questionnaire follow-up. CLINICAL RELEVANCE These results provide insight into the success of medical management for presumptive cervical disk herniation in dogs and may allow for refinement of treatment protocols.
Abstract:
OBJECTIVE To determine the success of medical management of presumptive thoracolumbar disk herniation in dogs and the variables associated with treatment outcome. STUDY DESIGN Retrospective case series. ANIMALS Dogs (n=223) with presumptive thoracolumbar disk herniation. METHODS Medical records from 2 clinics were used to identify affected dogs, and owners were mailed a questionnaire about success of therapy, recurrence of clinical signs, and quality of life (QOL) as interpreted by the owner. Signalment, duration and degree of neurologic dysfunction, and medication administration were determined from medical records. RESULTS Eighty-three percent of dogs (185/223) were ambulatory at initial evaluation. Successful treatment was reported for 54.7% of dogs, with 30.9% having recurrence of clinical signs and 14.4% classified as therapeutic failures. In bivariable logistic regression, glucocorticoid administration was negatively associated with success (P=.008; odds ratio [OR]=.48) and QOL scores (P=.004; OR=.48). The duration of cage rest was not significantly associated with success or QOL. Nonambulatory dogs were more likely to have lower QOL scores (P=.01; OR=2.34). CONCLUSIONS Medical management can lead to an acceptable outcome in many dogs with presumptive thoracolumbar disk herniation. Cage rest duration does not seem to affect outcome, and glucocorticoids may negatively impact success and QOL. The conclusions in this report should be interpreted cautiously because of the retrospective data collection and the use of client self-administered questionnaire follow-up. CLINICAL RELEVANCE These results provide insight into the success of medical management for presumptive thoracolumbar disk herniation in dogs and may allow for refinement of treatment protocols.
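Both of the preceding studies report bivariable logistic regression odds ratios; an OR is simply exp(beta) for the fitted coefficient. Below is a sketch on synthetic data with statsmodels; the data are fabricated so that the true OR is about 0.5, echoing the direction reported above, and nothing here reproduces the studies' records:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 223
# Synthetic predictor: 1 if the dog received glucocorticoids, else 0.
gluco = rng.integers(0, 2, size=n)
# Synthetic outcome generated so glucocorticoids roughly halve the odds
# of treatment success (true OR ~ 0.5).
logit_p = 0.3 + np.log(0.5) * gluco
success = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(gluco.astype(float))
fit = sm.Logit(success.astype(float), X).fit(disp=False)
print("odds ratio:", np.exp(fit.params[1]))   # ~0.5 on average
print("p-value:   ", fit.pvalues[1])
```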
Abstract:
Background. The increasing prevalence of overweight among youth in the United States and the parallel rise in related medical comorbidities have led to a growing need for efficient weight-management interventions. Purpose. The aim of this study was to evaluate the effects of the Choosing Health and Sensible Exercise (C.H.A.S.E.) childhood obesity prevention program on Body Mass Index (BMI), physical activity, and dietary behaviors. Methods. This study utilized de-identified data collected during the fall 2006 session of the C.H.A.S.E. program. A total of 65 students at Woodview Elementary School and Deepwater Elementary School participated in this intervention. The C.H.A.S.E. program is a 10-week obesity prevention program that focuses on nutrition and physical activity education. Height and weight data and a health behavior survey were collected during the first and last weeks of the intervention. Paired t-tests were used to determine statistically significant differences between pre- and post-intervention measurements. One-way analysis of variance was used to adjust for potential confounders, such as gender, age, BMI category ("normal weight", "at risk of overweight", or "overweight"), and self-reported weight loss goals. Data were analyzed using STATA, v. 9.2. Results. A significant decrease in mean BMI (p < 0.05) was found after the 10-week intervention. While the results were statistically significant for the group as a whole, changes in BMI were not significant when stratified by age, sex, or ethnicity. The mean overall scores for the behavior survey did not change significantly pre- and post-intervention; however, significant differences were found in the dietary intention scale, indicating that students were more likely to intend to make healthier food choices (p < 0.05). No statistically significant decreases in BMI were found when stratified by baseline BMI-for-age percentiles or baseline weight loss efforts (p > 0.05). Conclusion. The results of this evaluation provide information that will be useful in planning and implementing effective childhood obesity interventions in the future. Changes in self-reported dietary intentions and BMI show that the C.H.A.S.E. program is capable of modifying food choice selection and decreasing BMI. Results from the behavior questionnaire indicate that students in the intervention program were making changes in a positive direction. Future implementations of the C.H.A.S.E. program, as well as other childhood obesity interventions, may want to incorporate additional strategies to increase knowledge and other behavioral constructs associated with decreased BMI. In addition, obesity prevention programs may want to increase parental involvement and increase the dose or intensity of the intervention.
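The pre/post BMI comparison described here is a paired t-test. A minimal sketch on fabricated numbers shows the computation (the study itself used STATA; these values are not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Fabricated pre/post BMI values for 65 students with a small mean decrease.
bmi_pre = rng.normal(loc=19.5, scale=3.0, size=65)
bmi_post = bmi_pre + rng.normal(loc=-0.15, scale=0.4, size=65)

t_stat, p_value = stats.ttest_rel(bmi_pre, bmi_post)
print(f"mean change: {np.mean(bmi_post - bmi_pre):+.2f} BMI units")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```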
Abstract:
Genetics education for physicians has been a popular publication topic in the United States and in Europe for over 20 years. Decreasing numbers of medical genetics professionals and an increasing volume of genetic information have created a dire need for increased genetics training in medical school and in clinical practice. This study aimed to assess how well pediatrics-focused primary care physicians apply their general genetics knowledge to clinical genetic testing, using scenario-based questions. We chose to focus specifically on knowledge of the diagnostic applicability of chromosomal microarray (CMA) technology in pediatrics because of its recent recommendation by the International Standard Cytogenomic Array (ISCA) Consortium as a first-tier genetic test for individuals with developmental disabilities and/or congenital anomalies. Proficiency in ordering baseline genetic testing was evaluated for eighty-one respondents from four pediatrics-focused residencies (categorical pediatrics, pediatric neurology, internal medicine/pediatrics, and family practice) at two large residency programs in Houston, Texas. Similar to other studies, we found an overall deficit of genetic testing knowledge, especially among family practice residents. Interestingly, residents who elected to complete a genetics rotation in medical school scored significantly better than expected, as well as better than residents who did not elect to complete a genetics rotation. We suspect that this insufficient knowledge of a baseline genetics work-up is leading to redundant (e.g., concurrent karyotype and CMA) and incorrect (e.g., ordering CMA to detect achondroplasia) genetic testing and is contributing to rising health care costs in the United States. Our results provide specific teaching points upon which medical schools can focus education about clinical genetic testing and suggest that increased collaboration between primary care physicians and genetics professionals could benefit patient health care overall.
Abstract:
A review of literature related to appointment-keeping served as the basis for the development of an organizational paradigm for the study of appointment-keeping in the Beta-Blocker Heart Attack Trial (BHAT). Features of the organizational environment, demographic characteristics of BHAT enrollees, organizational structure and processes, and previous organizational performance were measured to provide exploratory information on the appointment-keeping behavior of 3,837 participants enrolled at thirty-two Clinical Centers. Results suggest that the social context of individual behavior is an important consideration for understanding patient compliance. In particular, previous organizational performance, as measured by the attainment of recruitment goals, and the ability to utilize resources had particularly strong bivariate associations with appointment-keeping. Implications for future theory development, research, and practice are discussed, as is a suggestion for the development of multidisciplinary research efforts conducted within Centers for the study and application of adherence behaviors.