885 results for Project 2005-001-C: Delivery and Management of Built Assets
Abstract:
This paper proposes a more profound discussion of the philosophical underpinnings of sustainability than currently exists in the MOT literature and considers their influence on the construction of theories of green operations and technology management. Ultimately, it also debates the link between theory and practice in this subject area. The paper is derived from insights gained in three research projects completed during the past twelve years, primarily involving the first author. From 2000 to 2002, an investigation using scenario analysis, aimed at reducing atmospheric pollution in urban centres by substituting natural gas for petrol and diesel, provided the first set of insights about public policy, environmental impacts, investment analysis, and technological feasibility. The second research project, from 2003 to 2005, using a survey questionnaire, was aimed at improving environmental performance in livestock farming and explored the issues of green supply chain scope, environmental strategy, and priorities. Finally, the third project, from 2006 to 2011, investigated environmental decisions in manufacturing organisations through case study research and examined the underlying sustainability drivers and decision-making processes. By integrating the findings and conclusions from these projects, the link between the philosophy, theory, and practice of green operations and technology management is debated. The findings from all these studies show that the philosophical debate seems to have had little influence on theory building so far. For instance, although ‘sustainable development’ emphasises ‘meeting the needs of current and future generations’, no theory links essentiality and environmental impacts. Likewise, there is a weak link between theory and the practical issues of green operations and technology management.
For example, the well-known ‘life-cycle analysis’ has little application in many cases because the life cycle of products these days is dispersed within global production and consumption systems, with different stakeholders at each life-cycle stage. The results from this paper are relevant to public policy making and to corporate environmental strategy and decision making. Most past and current studies on the subject of green operations and sustainability management deal with only a single sustainability dimension at any one time. The value and originality of this paper lie in its integration of the philosophy, theory, and practice of green technology and operations management.
Abstract:
We report the comparative proteomic and antivenomic characterization of the venoms of the subspecies cascavella and collilineatus of the Brazilian tropical rattlesnake Crotalus durissus. The venom proteomes of C. d. collilineatus and C. d. cascavella comprise proteins in the range of 4-115 kDa belonging to 9 and 8 toxin families, respectively. Collilineatus and cascavella venoms contain 20-25 main toxins belonging to the following protein families: disintegrin, PLA2, serine proteinase, cysteine-rich secretory protein (CRISP), vascular endothelial growth factor-like (VEGF), L-amino acid oxidase, C-type lectin-like, and snake venom metalloproteinase (SVMP). As judged by reverse-phase HPLC and mass spectrometry, cascavella and collilineatus share about 90% of their venom proteome. However, the relative abundance of the toxin families differs between the two C. durissus subspecies venoms. The most notable difference is the presence of the myotoxin crotamine in some C. d. collilineatus specimens (averaging 20.8% of the total proteins of pooled venom), which is absent in the venom of C. d. cascavella. On the other hand, the neurotoxic PLA2 crotoxin represents the most abundant protein in both C. durissus venoms, comprising 67.4% of the toxin proteome in C. d. collilineatus and 72.5% in C. d. cascavella. Myotoxic PLA2s are also present in the two venoms, albeit in different relative concentrations (18.1% in C. d. cascavella vs. 4.6% in C. d. collilineatus). The venom composition accounts for the clinical manifestations caused by C. durissus envenomations: systemic neurotoxicity, myalgic symptoms, and coagulation disturbances, frequently accompanied by myoglobinuria and acute renal failure. The overall compositions of the C. d. cascavella and collilineatus venoms closely resemble that of C. d. terrificus, supporting the view that these taxa can be considered geographical variations of the same species. Pooled venom from adult C. d. cascavella and neonate C. d.
terrificus lacks crotamine, whereas this skeletal muscle membrane-depolarizing myotoxin accounts for approximately 20% of the total toxins of venom pooled from C. d. collilineatus and C. d. terrificus from Southern Brazil. The possible relevance of the observed venom variability among the tropical rattlesnake subspecies was assessed by antivenomics using anti-crotalic antivenoms produced at Instituto Butantan and Instituto Vital Brazil. The results revealed that both antivenoms exhibit impaired immunoreactivity towards crotamine and display restricted (approximately 60%) recognition of PLA2 molecules (crotoxin and D49-myotoxins) from C. d. cascavella and C. d. terrificus venoms. This poor reactivity of the antivenoms may be due to a combination of factors: on the one hand, an inappropriate choice of the mixture of venoms for immunization and, on the other, the documented low immunogenicity of PLA2 molecules. C. durissus causes most of the lethal snakebite accidents in Brazil. The implications of the geographic variation in venom composition for the treatment of bites by different C. durissus subspecies populations are discussed. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
This book chapter represents a synthesis of the work which started in my PhD and which has been the conceptual basis for all of my research since 1993. The chapter presents a method that scientists and managers can use to select the type of remotely sensed data best suited to their information needs in a mapping, monitoring or modelling application. The work draws on results from several of my ARC projects, CRC Rainforest and Coastal projects, and the theses of P. Scarth, K. Joyce and C. Roelfsema.
Abstract:
Solid-state 13C nuclear magnetic resonance (NMR) with cross-polarisation (CP) and magic-angle spinning (MAS) was used to: (a) examine the changes in carbon (C) composition of windrowed harvest residues during the first 3 years of hoop pine plantations in subtropical Australia; and (b) assess the impacts of windrowed harvest residues on soil organic matter (SOM) composition and quality in the 0-10 cm soil layer. Harvest residues were collected from 0-, 1-, 2- and 3-year-old windrows of ca. 2.5 m width (15 m apart for the 0-, 1- and 2-year-old sites and 10 m apart for the 3-year-old site). Soils from the 0-10 cm layer were collected from the 1-, 2- and 3-year-old sites. The 13C NMR spectra of the harvest residues indicated the presence of lignin in the hoop pine wood, foliage and newly incorporated organic matter (NIOM). Condensed tannin structures were found in the decay-resistant bark, small wood and foliage, but were absent in other residue components and in SOM. The NMR spectra of small wood samples contained condensed tannin structures because the outer layer of bark was not removed. NIOM showed a shift from foliage-like structures (celluloses) to lignin-type structures, indicating an incorporation of woody residues from the decomposing harvest residues. Suberins were also present in the small wood, foliage and bark. The 13C CP NMR spectra of SOM indicated that, in areas where windrows were present, SOM did not show compositional changes. However, an increase in SOM quality under the windrows in the second year after their formation, as characterised by the alkyl C/O-alkyl C (A/O-A) ratio, was mainly due to inputs from the decomposition of the labile, readily available components of the windrowed harvest residues. (C) 2002 Published by Elsevier Science B.V.
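The alkyl C/O-alkyl C (A/O-A) ratio used above as an index of SOM quality is simply the ratio of the integrated signal in the alkyl-C region of the 13C spectrum to that in the O-alkyl-C region. A minimal sketch, in which the function name and all integral values are illustrative assumptions rather than data from this study:

```python
# A/O-A ratio from 13C CP/MAS NMR spectral region integrals.
# Conventional chemical-shift windows: alkyl C ~0-45 ppm, O-alkyl C ~45-110 ppm.
# A higher ratio generally indicates a more decomposed organic matter pool.

def a_oa_ratio(alkyl_integral, o_alkyl_integral):
    """Alkyl C / O-alkyl C ratio, a common index of SOM decomposition."""
    return alkyl_integral / o_alkyl_integral

# Illustrative region integrals (% of total spectral signal), not study data.
year1 = a_oa_ratio(alkyl_integral=18.0, o_alkyl_integral=52.0)
year2 = a_oa_ratio(alkyl_integral=24.0, o_alkyl_integral=46.0)
print(f"A/O-A year 1: {year1:.2f}, year 2: {year2:.2f}")
```

A shift in this ratio between sampling years is the kind of change the abstract attributes to inputs from decomposing windrow residues.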
Abstract:
The evolution of new technology and its increasing use have for some years been making the existence of informal learning more and more visible, especially among young and older adults in both Higher Education and workplace contexts. However, the nature of formal and non-formal, course-based approaches to learning has made it hard to accommodate these informal processes satisfactorily, and although technology brings us nearer to a solution, one has not yet been achieved. The TRAILER project aims to address this problem by developing a tool for the management of competences and skills acquired through informal learning experiences, both from the perspective of the user and that of the institution or company. This paper describes the main lines of research and development of this project.
Abstract:
Hepatitis C virus (HCV) and human immunodeficiency virus (HIV) share the same transmission mechanisms. The prevalence of HCV in the HIV-infected population varies from region to region throughout the world, depending on the prevailing factors of exposure to both viruses. Co-infection with HIV accelerates the progression of the disease caused by HCV, appears to worsen the progression of the HIV infection, and increases HCV transmission. Therefore, the clinical management and treatment of HCV is a priority in medical facilities that receive HIV-infected patients. Clinical management of these patients involves specific diagnostic procedures and appropriately trained medical staff. The indication for treatment should meet specific clinical and laboratory criteria. A number of drugs are currently available to treat hepatitis C in co-infected patients.
Abstract:
This paper presents a framework of competences developed for Industrial Engineering and Management that can be used as a tool for curriculum analysis and design, including the teaching and learning processes as well as the alignment of the curriculum with the professional profile. The framework was applied to the Industrial Engineering and Management program at the University of Minho (UMinho), Portugal, and it provides an overview of the connection between IEM knowledge areas and the competences defined in its curriculum. The framework of competences was developed through a process of analysis using a combination of methods and sources for data collection. The framework was developed according to four main steps: 1) characterization of IEM knowledge areas; 2) definition of IEM competences; 3) survey; 4) application of the framework to the IEM curriculum. The findings showed that the framework is useful for building an integrated vision of the curriculum. The most visible aspect in the learning outcomes of the IEM program is the lack of balance between technical and transversal competences. There was almost no reference to transversal competences, and those that do appear are concentrated mainly in Project-Based Learning courses. The framework presented in this paper provides a contribution to the definition of the IEM professional profile through a set of competences which need to be explored further. In addition, it may be a relevant tool for IEM curriculum analysis and a contribution to bridging the gap between universities and companies.
Abstract:
The State of Iowa currently has approximately 69,000 miles of unpaved secondary roads. Due to the low traffic count on these unpaved roads, paving with asphalt or Portland cement concrete is not economical. Therefore, to reduce dust production, dust suppressants have been used for decades. This study was conducted to evaluate the effectiveness of several widely used dust suppressants through quantitative field testing on two of Iowa's most widely used secondary road surface treatments: crushed limestone rock and alluvial sand/gravel. These commercially available dust suppressants included lignin sulfonate, calcium chloride, and soybean oil soapstock. The suppressants were applied to 1000 ft test sections on four unpaved roads in Story County, Iowa. To duplicate field conditions, the suppressants were applied as a surface spray once in early June and again in late August or early September. The four unpaved roads included two with crushed limestone rock and two with alluvial sand/gravel surface treatments, as well as high and low traffic counts. The effectiveness of the dust suppressants was evaluated by comparing the dust produced on treated and untreated test sections. Dust collection was scheduled for 1, 2, 4, 6, and 8 weeks after each application, for a total testing period of 16 weeks. Results of a cost analysis between annual dust suppressant application and biennial aggregate replacement indicated that the costs of the dust suppressant, its transportation, and its application were relatively high compared to those of the two aggregate types. Therefore, biennial aggregate replacement is considered more economical than annual dust suppressant application, even though annual dust suppressant application reduced road maintenance costs by 75%.
Results of the dust collection indicated that the lignin sulfonate suppressant outperformed calcium chloride and soybean oil soapstock on all four unpaved roads; that the effect of the suppressants on the alluvial sand/gravel surface treatment was less than that on the crushed limestone rock; that the residual effects of all the products persisted reasonably well after blading; and that the combination of alluvial sand/gravel surface treatment and high traffic count caused dust reduction to decrease dramatically.
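The cost comparison above can be sketched as an annualized-cost calculation. All dollar figures below are illustrative assumptions, not values from the study; the sketch only shows why a cheaper-per-event treatment on a two-year cycle can beat an annual treatment even when the latter cuts maintenance costs by 75%:

```python
# Hypothetical comparison: annual dust-suppressant application vs.
# biennial aggregate replacement on one stretch of unpaved road.
# All dollar amounts are illustrative assumptions, not study data.

def annualized_cost(cost_per_event, years_between_events):
    """Spread a recurring cost over its cycle length to get a per-year figure."""
    return cost_per_event / years_between_events

suppressant_cost = 6000.0     # assumed cost per application (product + transport + labor)
aggregate_cost = 4000.0       # assumed cost per aggregate replacement, every 2 years
maintenance_cost = 2000.0     # assumed annual maintenance (blading etc.) without suppressant
maintenance_reduction = 0.75  # the study's reported 75% maintenance-cost reduction

with_suppressant = (annualized_cost(suppressant_cost, 1)
                    + maintenance_cost * (1 - maintenance_reduction))
with_aggregate = annualized_cost(aggregate_cost, 2) + maintenance_cost

print(f"annual cost with suppressant: ${with_suppressant:,.0f}")
print(f"annual cost with aggregate replacement: ${with_aggregate:,.0f}")
```

Under these assumed unit costs, the suppressant's high product and application cost outweighs its maintenance savings, matching the study's conclusion that biennial aggregate replacement is the more economical option.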
Abstract:
OBJECTIVE: To compare the management of invasive candidiasis between infectious disease and critical care specialists. DESIGN AND SETTING: Clinical case scenarios of invasive candidiasis were presented during interactive sessions at national specialty meetings. Participants responded to questions using an anonymous electronic voting system. PATIENTS AND PARTICIPANTS: Sixty-five infectious disease and 51 critical care physicians in Switzerland. RESULTS: Critical care specialists were more likely to ask advice from a colleague with expertise in the field of fungal infections to treat Candida glabrata (19.5% vs. 3.5%) and C. krusei (36.4% vs. 3.3%) candidemia. Most participants reported that they would change or remove a central venous catheter in the presence of candidemia, but 77.1% of critical care specialists would start concomitant antifungal treatment, compared to only 50% of infectious disease specialists. Similarly, more critical care specialists would start antifungal prophylaxis when Candida spp. are isolated from the peritoneal fluid at the time of surgery for peritonitis resulting from bowel perforation (22.2% vs. 7.2%). The two groups equally considered Candida spp. as pathogens in tertiary peritonitis, but critical care specialists would more frequently use amphotericin B than fluconazole, caspofungin, or voriconazole. In mechanically ventilated patients, the isolation of 10^4 Candida spp. from a bronchoalveolar lavage was considered a colonizing organism by 94.9% of infectious disease specialists, compared to 46.8% of critical care specialists, with a marked difference in the use of antifungal agents (5.1% vs. 51%). CONCLUSIONS: These data highlight differences between management approaches for candidiasis in two groups of specialists, particularly in the reported use of antifungals.
Abstract:
BACKGROUND: Little is known about time trends, predictors, and consequences of changes made to antiretroviral therapy (ART) regimens early after patients initially start treatment. METHODS: We compared the incidence of, reasons for, and predictors of treatment change within 1 year after starting combination ART (cART), as well as virological and immunological outcomes at 1 year, among 1866 patients from the Swiss HIV Cohort Study who initiated cART during 2000-2001, 2002-2003, or 2004-2005. RESULTS: The durability of initial regimens did not improve over time (P = .15): 48.8% of 625 patients during 2000-2001, 43.8% of 607 during 2002-2003, and 44.3% of 634 during 2004-2005 changed cART within 1 year; reasons for change included intolerance (51.1% of all patients), patient wish (15.4%), physician decision (14.8%), and virological failure (7.1%). An increased probability of treatment change was associated with larger CD4+ cell counts, larger human immunodeficiency virus type 1 (HIV-1) RNA loads, and receipt of regimens that contained stavudine or indinavir/ritonavir, but a decreased probability was associated with receipt of regimens that contained tenofovir. Treatment discontinuation was associated with larger CD4+ cell counts, current use of injection drugs, and receipt of regimens that contained nevirapine. One-year outcomes improved between 2000-2001 and 2004-2005: 84.5% and 92.7% of patients, respectively, reached HIV-1 RNA loads of <50 copies/mL and achieved median increases in CD4+ cell counts of 157.5 and 197.5 cells/microL, respectively (P < .001 for all comparisons). CONCLUSIONS: Virological and immunological outcomes of initial treatments improved between 2000-2001 and 2004-2005, irrespective of uniformly high rates of early changes in treatment across the 3 study intervals.
Abstract:
BACKGROUND: People with neurological disease have a much higher risk of both faecal incontinence and constipation than the general population. There is often a fine line between the two conditions, with any management intended to ameliorate one risking precipitating the other. Bowel problems are observed to be the cause of much anxiety and may reduce quality of life in these people. Current bowel management is largely empirical with a limited research base. OBJECTIVES: To determine the effects of management strategies for faecal incontinence and constipation in people with neurological diseases affecting the central nervous system. SEARCH STRATEGY: We searched the Cochrane Incontinence Group Specialised Trials Register (searched 26 January 2005), the Cochrane Central Register of Controlled Trials (Issue 2, 2005), MEDLINE (January 1966 to May 2005), EMBASE (January 1998 to May 2005) and all reference lists of relevant articles. SELECTION CRITERIA: All randomised or quasi-randomised trials evaluating any types of conservative or surgical measure for the management of faecal incontinence and constipation in people with neurological diseases were selected. Specific therapies for the treatment of neurological diseases that indirectly affect bowel dysfunction were also considered. DATA COLLECTION AND ANALYSIS: Two reviewers assessed the methodological quality of eligible trials and two reviewers independently extracted data from included trials using a range of pre-specified outcome measures. MAIN RESULTS: Ten trials were identified by the search strategy, most were small and of poor quality. Oral medications for constipation were the subject of four trials. Cisapride does not seem to have clinically useful effects in people with spinal cord injuries (three trials). Psyllium was associated with increased stool frequency in people with Parkinson's disease but did not alter colonic transit time (one trial). 
Prucalopride, an enterokinetic, did not demonstrate obvious benefits in this patient group (one study). Some rectal preparations to initiate defaecation produced faster results than others (one trial). Different time schedules for administration of rectal medication may produce different bowel responses (one trial). Mechanical evacuation may be more effective than oral or rectal medication (one trial). One-off educational interventions from nurses appear to benefit patients. The clinical significance of any of these results is difficult to interpret. AUTHORS' CONCLUSIONS: There is still remarkably little research on this common and, to patients, very significant condition. It is not possible to draw any recommendations for bowel care in people with neurological diseases from the trials included in this review. Bowel management for these people must remain empirical until well-designed controlled trials with adequate numbers and clinically relevant outcome measures become available.
Abstract:
BACKGROUND: Adverse effects of combination antiretroviral therapy (CART) commonly result in treatment modification and poor adherence. METHODS: We investigated predictors of toxicity-related treatment modification during the first year of CART in 1318 antiretroviral-naive human immunodeficiency virus (HIV)-infected individuals from the Swiss HIV Cohort Study who began treatment between January 1, 2005, and June 30, 2008. RESULTS: The total rate of treatment modification was 41.5 (95% confidence interval [CI], 37.6-45.8) per 100 person-years. Of these, switches or discontinuations because of drug toxicity occurred at a rate of 22.4 (95% CI, 19.5-25.6) per 100 person-years. The most frequent toxic effects were gastrointestinal tract intolerance (28.9%), hypersensitivity (18.3%), central nervous system adverse events (17.3%), and hepatic events (11.5%). In the multivariate analysis, combined zidovudine and lamivudine (hazard ratio [HR], 2.71 [95% CI, 1.95-3.83]; P < .001), nevirapine (1.95 [1.01-3.81]; P = .050), comedication for an opportunistic infection (2.24 [1.19-4.21]; P = .01), advanced age (1.21 [1.03-1.40] per 10-year increase; P = .02), female sex (1.68 [1.14-2.48]; P = .009), nonwhite ethnicity (1.71 [1.18-2.47]; P = .005), higher baseline CD4 cell count (1.19 [1.10-1.28] per 100/microL increase; P < .001), and HIV-RNA of more than 5.0 log10 copies/mL (1.47 [1.10-1.97]; P = .009) were associated with higher rates of treatment modification. Almost 90% of individuals with treatment-limiting toxic effects were switched to a new regimen, and 85% achieved virologic suppression to less than 50 copies/mL at 12 months compared with 87% of those continuing CART (P = .56). CONCLUSIONS: Drug toxicity remains a frequent reason for treatment modification; however, it does not affect treatment success. Close monitoring and management of adverse effects and drug-drug interactions are crucial for the durability of CART.
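The rates quoted above are incidence rates per 100 person-years. As a minimal sketch of how such a rate is computed (the event and follow-up counts below are hypothetical, chosen only so that the result matches the reported overall rate of 41.5):

```python
# Incidence rate per 100 person-years (illustrative counts, not study data).

def rate_per_100py(events, person_years):
    """Events divided by total follow-up time, scaled to 100 person-years."""
    return events / person_years * 100

# e.g. 83 treatment modifications observed over 200 person-years of follow-up
print(f"{rate_per_100py(events=83, person_years=200.0):.1f}")  # prints 41.5
```

Expressing rates per 100 person-years, as this abstract does, lets cohorts with different follow-up durations be compared on a common scale.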