925 results for "Palisade grass"
Abstract:
The impact of the invasive bank vole (Myodes glareolus) and greater white-toothed shrew (Crocidura russula) on indigenous Irish small mammals varies with season and habitat. We caught bank voles in deciduous woodland, young coniferous plantations and open habitats such as rank grass. The greater white-toothed shrew was absent from deciduous woods and plantations but did use open habitats with low-level cover, in addition to field margins. Numbers of both invasive species in field margins during summer were higher than in the previous spring. The indigenous wood mouse (Apodemus sylvaticus) and pygmy shrew (Sorex minutus) differed in the degree of their negative response to the invasive species. Wood mice in hedgerows with bank voles had reduced recruitment and lower peak abundance; this effect was less extreme where both invasive species were present. Wood mouse numbers along field margins and in open habitats were significantly depressed by the presence of the bank vole, with no such effect in deciduous woodland or coniferous plantations. Summer recruitment in pygmy shrews was reduced in hedgerows with bank voles. Where the greater white-toothed shrew was present, the pygmy shrew was entirely absent from field margins. Species replacement due to invasive small mammals is occurring in their major habitat, i.e. field margins and open habitats with good ground cover. The pygmy shrew will probably disappear from these habitats throughout Ireland. Wood mice, and possibly pygmy shrews, may survive in deciduous woodland and conifer plantations. Mitigation of the impacts of invasive species should include expansion of woodland, in which native species can survive.
Abstract:
Understanding the dietary consumption and selection of wild populations of generalist herbivores is hampered by a complex array of factors. Here, we determine the influence of habitat, season, and animal density, sex, and age on the diet consumption and selection of 426 red deer (Cervus elaphus scoticus) culled in Fiordland National Park, New Zealand. Our site differs from those of studies elsewhere both in habitat (evergreen angiosperm-dominated forests) and in the intensity of hunting pressure. We predicted that deer would not consume forage in proportion to its relative availability, and that dietary consumption would change among and within years in response to hunting pressures that would also limit opportunities for age and sex segregation. Using canonical correspondence analysis, we evaluated the relative importance of different drivers of variation in diet consumption assessed from gut contents and related these to available forage in the environment. We found that altitude explained the largest proportion of variation in diet consumption, reflecting the ability of deer to alter their consumption and selection in relation to their foraging grounds. Grasses formed a high proportion of the diet, even for deer culled several kilometres from the alpine grasslands. In the winter months, when the alpine grasslands were largely inaccessible, less grass was eaten and deer resorted to woody plants that were avoided in the summer months. Surprisingly, there were no significant dietary differences between adults and juveniles and only subtle differences between the sexes. Sex-based differences in diet consumption are commonly observed in ungulate species, and we suggest that they may have been reduced in our study area owing to decreased heterogeneity in available forage as the diversity of palatable species declined under high deer browsing pressure, or by intense hunting pressure.
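For readers unfamiliar with the ordination step, the sketch below shows how canonical correspondence analysis can be computed from scratch in Python. It is a minimal illustration under invented assumptions, not the authors' analysis: the diet matrix, the predictors (altitude, season) and all numbers are made up.

```python
# Minimal canonical correspondence analysis (CCA) sketch in NumPy.
# Assumes a sites-by-diet-items abundance matrix Y and a sites-by-predictors
# matrix X; all data here are synthetic and purely illustrative.
import numpy as np

def cca(Y, X):
    """Return canonical eigenvalues and site scores (ter Braak-style CCA)."""
    Y = np.asarray(Y, float)
    X = np.asarray(X, float)
    P = Y / Y.sum()                                # proportions
    r = P.sum(axis=1)                              # site (row) weights
    c = P.sum(axis=0)                              # species (column) weights
    # Chi-square standardised residuals
    Q = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    # Weighted least-squares projection of Q onto the predictor space
    Xc = X - np.average(X, axis=0, weights=r)      # centre X with row weights
    W = np.sqrt(r)[:, None] * Xc                   # weighted design matrix
    B, _, _, _ = np.linalg.lstsq(W, Q, rcond=None)
    Q_fit = W @ B                                  # constrained part of Q
    U, s, Vt = np.linalg.svd(Q_fit, full_matrices=False)
    eigvals = s ** 2                               # canonical eigenvalues
    site_scores = U[:, : X.shape[1]] / np.sqrt(r)[:, None]
    return eigvals[: X.shape[1]], site_scores

# Example: 30 rumen samples, 8 forage classes, 2 environmental predictors
rng = np.random.default_rng(0)
Y = rng.poisson(5, size=(30, 8))
X = np.column_stack([rng.uniform(0, 1500, 30),     # altitude (m)
                     rng.integers(0, 2, 30)])      # season (0 summer, 1 winter)
eig, scores = cca(Y, X)
print("canonical eigenvalues:", np.round(eig, 4))
```

The share of total inertia captured by the canonical eigenvalues indicates how much dietary variation the constraining variables explain, which is the quantity the study uses to rank drivers such as altitude.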
Abstract:
Drawing on the ‘from below’ perspective which has emerged in transitional justice scholarship and practice over the past two decades, this article critically examines the dealing-with-the-past debate in Northern Ireland. The paper begins by offering an outline of the from-below perspective in the context of post-conflict or post-authoritarian societies which are struggling to come to terms with past violence and human rights abuses. Having provided some of the legal and political background to the most recent efforts to deal with the past in Northern Ireland, it then critically examines the relevant past-related provisions of the Stormont House Agreement, namely the institutions which are designed to facilitate ‘justice’, truth recovery and the establishment of an Oral History Archive. Drawing from the political science and social movement literature on lobbying and the ways in which interest groups may seek to influence policy, the paper then explores the efforts of the authors and others to contribute to the broader public debate, including through drafting and circulating a ‘Model Bill’ on dealing with the past (reproduced elsewhere in this issue) as a counterweight to the legislation which is required from the British government to implement the Stormont House Agreement. The authors argue that the combination of technical capacity, grass-roots credibility and ‘international-savvy’ local solutions offers a framework for praxis from below in other contexts where activists are struggling to extend ownership of transitional justice beyond political elites.
Keywords: transitional justice; from below; dealing with the past; legislation; truth recovery; prosecutions; oral history
Abstract:
Recent epidemics of acute asthma have caused speculation that, if their causes were known, early warnings might be feasible. In particular, some epidemics seemed to be associated with thunderstorms. We wondered what risk factors predicting epidemics could be identified. Daily asthma admission counts during 1987-1994, for two age groups (0-14 yrs and ≥15 yrs), were measured using the Hospital Episodes System (HES). Epidemics were defined as combinations of date, age group and English Regional Health Authority (RHA) with exceptionally high asthma admission counts compared to the predictions of a log-linear autoregression model. They were compared with control days 1 week before and afterwards with regard to seven meteorological variables and 5-day average pollen counts for four species. Fifty-six asthma epidemics were identified. The mean density of sferics (lightning flashes), temperature and rainfall on epidemic days were greater than on control days. High sferics densities were overrepresented in epidemics. Simultaneously high sferics and grass pollen further increased the probability of an epidemic, but only to 15% (95% confidence interval 2-45%). Two thirds of epidemics were not preceded by thunderstorms. Thunderstorms and high grass pollen levels precede asthma epidemics more often than expected by chance. However, most epidemics are not associated with thunderstorms or unusual weather conditions, and most thunderstorms, even following high grass pollen levels, do not precede epidemics. An early warning system based on the indicators examined here would, therefore, detect few epidemics and generate an unacceptably high rate of false alarms.
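The epidemic definition rests on comparing observed counts with the predictions of a log-linear autoregression. Below is a minimal sketch of that idea, assuming a Poisson GLM with a lagged-count term, seasonal harmonics, and a 99.9th-percentile exceedance threshold; these are illustrative choices, not the paper's exact specification.

```python
# Flag "epidemic" days: counts far above a log-linear autoregression's
# predictions. Data are synthetic, with one planted outbreak day.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import poisson

rng = np.random.default_rng(1)
n = 365 * 2
days = pd.date_range("1987-01-01", periods=n, freq="D")
y = rng.poisson(20 + 5 * np.sin(2 * np.pi * days.dayofyear / 365))
y[400] += 60                                   # planted epidemic day

df = pd.DataFrame({"y": y}, index=days)
df["log_lag1"] = np.log(df["y"].shift(1).clip(lower=1))   # autoregressive term
df["sin"] = np.sin(2 * np.pi * df.index.dayofyear / 365)  # seasonal harmonics
df["cos"] = np.cos(2 * np.pi * df.index.dayofyear / 365)
df = df.dropna()

X = sm.add_constant(df[["log_lag1", "sin", "cos"]])
model = sm.GLM(df["y"], X, family=sm.families.Poisson()).fit()
mu = model.predict(X)

# Epidemic day: observed count exceeds the 99.9th percentile of Poisson(mu)
threshold = poisson.ppf(0.999, mu)
print(df.index[df["y"] > threshold])
```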
Abstract:
Background A large epidemic of asthma occurred following a thunderstorm in southern and central England on 24/25 June 1994, and a collaborative study group was formed. Objectives To describe the epidemic and the meteorological, aerobiological and other environmental characteristics associated with it. Methods Collation of data from the Meteorological Office, the Pollen Research Unit, the Department of the Environment's Automatic Urban Network, from health surveillance by the Department of Health and the National Poisons Unit, from clinical experience in general practice and hospitals, and from an immunological study of some of the affected cases from north-east London. Results The thunderstorm was a Mesoscale Convective System, an unusual and large form of storm with several centres and severe wind gusts. It occurred shortly after the peak grass pollen concentration in the London area. A sudden and extensive epidemic developed within about an hour, affecting possibly several thousand patients. Emergency services were stretched, but the epidemic did not last long. Cases had high serum levels of IgE antibody to mixed grass pollen. Conclusion This study supports the view that patients with specific IgE to grass pollen are at risk of thunderstorm-related asthma. The details of the causal pathway from storm to asthma attack are not clear. Case-control and time series studies are being carried out.
Abstract:
The main aim of this study was to analyse the temporal and spatial variations in grass (Poaceae) pollen counts (2005–2011) recorded in Évora (Portugal), Badajoz (Spain) and Worcester (UK). Weekly average data were examined using nonparametric statistics to compare differences between places. On average, Évora recorded the earliest start dates of the Poaceae pollen seasons and Worcester the latest. The intensity of the Poaceae pollen season varied between sites, with Worcester usually recording the least and Évora the most grass pollen in a season. Mean durations of grass pollen seasons were 77 days in Évora, 78 days in Badajoz and 59 days in Worcester. Overall, longer Poaceae pollen seasons coincided with earlier pollen season start dates. Weekly pollen data, from March to September, from the three pollen-monitoring stations studied were compared. The best-fitting and most statistically significant correlations were obtained by moving Worcester data backward by 4 weeks (Évora, r = 0.810, p < 0.001) and 5 weeks (Badajoz, r = 0.849, p < 0.001). Weekly data from Worcester therefore followed a similar pattern to that of Badajoz and Évora but at a distance of more than 1,500 km and 4–5 weeks later. The sum of pollen recorded in a season was compared with monthly rainfall between January and May. The strongest positive relationship between season intensity and rainfall was between the annual sum of Poaceae pollen recorded in the season at Badajoz and Évora and total rainfall during January and February. Winter rainfall noticeably affects the intensity of Poaceae pollen seasons in Mediterranean areas, but this was not as important in Worcester.
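The lag comparison amounts to shifting one site's weekly series against another and picking the lag with the strongest correlation. A small sketch with synthetic series follows; the peak positions are chosen so the best lag comes out near 4 weeks, echoing the Worcester-Évora result, but none of the numbers are from the study.

```python
# Find the weekly lag that maximises the correlation between two pollen
# series. Both series are synthetic Gaussian-shaped seasons plus noise.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
weeks = np.arange(30)                     # roughly March-September
evora = 100 * np.exp(-0.5 * ((weeks - 10) / 4) ** 2) + rng.normal(0, 5, 30)
worcester = 60 * np.exp(-0.5 * ((weeks - 14) / 4) ** 2) + rng.normal(0, 5, 30)

s_evora = pd.Series(evora)
s_worc = pd.Series(worcester)

# Shifting Worcester "backward" by k weeks aligns its week t+k with
# Evora's week t; pandas drops the unmatched NaN pairs automatically.
best_lag, best_r = max(
    ((lag, s_evora.corr(s_worc.shift(-lag))) for lag in range(9)),
    key=lambda t: t[1],
)
print(f"best lag: {best_lag} weeks, r = {best_r:.3f}")
```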
Abstract:
The work presented here focuses on determining the construction costs of small- and medium-diameter high-density polyethylene (HDPE) pipelines for basic sanitation, based on the methodology described in the book Custos de Construção e Exploração – Volume 9 of the series Gestão de Sistemas de Saneamento Básico, by Lencastre et al. (1994). This methodology was applied to the project-management procedures, and to that end unit costs were estimated for several groups of works. According to Lencastre et al. (1994), “these groups cover earthworks, piping, fittings and the corresponding operating devices, paving, and the construction site, the site component including the ancillary works associated with the job.” The costs were obtained by analysing budgets from several sanitation works put out to public tender in recent years. To turn this methodology into an effective tool, spreadsheets were built that yield realistic estimates of the execution costs of a given work at stages prior to design development, namely when preparing a system master plan or when drawing up economic and financial feasibility studies, that is, even before any preliminary sizing of the system components exists. Another technique applied to the input data was “Robust Data Analysis” (Pestana, 1992). This methodology allowed the data to be examined in greater detail before hypotheses were formulated for the risk analysis. The main idea is a highly flexible examination of the data, often even before comparing them with a probabilistic model. For a large data set, this technique made it possible to analyse the spread of the values found for the various groups of works referred to above. With the data collected and processed, a risk analysis methodology was then applied by means of Monte Carlo simulation. This risk analysis was carried out with a software tool from Palisade, @Risk, available in the Department of Civil Engineering. This quantitative risk analysis technique expresses the uncertainty in the input data through the probability distributions provided by the software. To put the methodology into practice, the spreadsheets prepared following the approach proposed in Lencastre et al. (1994) were used. The preparation and analysis of these estimates can support decisions on the feasibility of the works to be carried out, particularly with regard to economic aspects, allowing a well-founded decision analysis of the investments.
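The @Risk workflow described above can be approximated with open-source tools: represent uncertain unit costs as probability distributions and propagate them through the cost estimate by Monte Carlo simulation. The work groups, quantities, and triangular-distribution parameters below are illustrative assumptions, not values from the study.

```python
# Monte Carlo cost estimate for an HDPE pipeline: uncertain unit costs are
# drawn from triangular distributions and summed; all figures are invented.
import numpy as np

rng = np.random.default_rng(3)
n_sim = 100_000

# Unit costs (EUR/m) as triangular distributions: (min, mode, max)
unit_costs = {
    "earthworks":  (12.0, 18.0, 30.0),
    "piping_HDPE": (25.0, 32.0, 45.0),
    "fittings":    (3.0,  5.0,  9.0),
    "paving":      (10.0, 15.0, 25.0),
}
length_m = 2_500           # assumed pipeline length
site_overhead = 0.08       # site installation as a fraction of direct cost

direct = sum(
    rng.triangular(lo, mode, hi, n_sim)
    for lo, mode, hi in unit_costs.values()
) * length_m
total = direct * (1 + site_overhead)

p5, p50, p95 = np.percentile(total, [5, 50, 95])
print(f"median cost: {p50:,.0f} EUR (90% interval: {p5:,.0f} - {p95:,.0f})")
```

The percentile band is the kind of output @Risk reports; it lets a feasibility study state not just a point estimate but the cost level that will be respected with, say, 95% confidence.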
Abstract:
C4 photosynthesis is an adaptation derived from the more common C3 photosynthetic pathway that confers higher productivity under warm temperatures and low atmospheric CO2 concentrations [1, 2]. C4 evolution has been seen as a consequence of past atmospheric CO2 decline, such as the abrupt CO2 fall 32-25 million years ago (Mya) [3-6]. This relationship has never been tested rigorously, mainly because of a lack of accurate estimates of divergence times for the different C4 lineages [3]. In this study, we inferred a large phylogenetic tree for the grass family and estimated, through Bayesian molecular dating, the ages of the 17 to 18 independent grass C4 lineages. The first transition from C3 to C4 photosynthesis occurred in the Chloridoideae subfamily, 32.0-25.0 Mya. The link between CO2 decrease and the transition to C4 photosynthesis was tested by a novel maximum likelihood approach. We showed that a model incorporating atmospheric CO2 levels was significantly better than the null model, supporting the importance of CO2 decline in the evolution of C4 photosynthesis. This finding is relevant for understanding the origin of C4 photosynthesis in grasses, which is one of the most successful ecological and evolutionary innovations in plant history.
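One way such a test could be framed (the abstract does not give the authors' exact model) is to treat the inferred C4 origin times as an inhomogeneous Poisson process whose rate depends on atmospheric CO2, and compare it against a constant-rate null by a likelihood-ratio test. The sketch below uses invented origin times and a toy CO2 curve.

```python
# Likelihood-ratio test: does a CO2-dependent origination rate fit the C4
# origin times better than a constant rate? All inputs are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

T = 60.0                                    # window: 60 Mya to the present
grid = np.linspace(0.0, T, 601)             # time before present (Mya)
co2 = 400.0 + 10.0 * grid                   # toy CO2 curve, higher in the past
co2 = (co2 - co2.mean()) / co2.std()        # standardise the covariate

# Hypothetical origin times (Mya) for the independent C4 lineages
origins = np.array([30.0, 28.5, 26.0, 24.0, 22.5, 20.0, 18.0, 15.0, 14.0,
                    12.5, 11.0, 10.0, 9.0, 8.0, 7.0, 6.5, 5.0])

def neg_loglik(params, use_co2):
    a = params[0]
    b = params[1] if use_co2 else 0.0
    lam = np.exp(a + b * co2)               # origination rate per Myr
    integral = np.trapz(lam, grid)          # expected number of origins
    lam_at = np.exp(a + b * np.interp(origins, grid, co2))
    return integral - np.log(lam_at).sum()  # negative Poisson log-likelihood

null = minimize(neg_loglik, x0=[0.0], args=(False,), method="Nelder-Mead")
alt = minimize(neg_loglik, x0=[0.0, 0.0], args=(True,), method="Nelder-Mead")
lrt = 2.0 * (null.fun - alt.fun)
print(f"LRT statistic = {lrt:.2f}, p = {chi2.sf(lrt, df=1):.4f}")
```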
Abstract:
The evolution of grasses using C4 photosynthesis and their sudden rise to ecological dominance 3 to 8 million years ago is among the most dramatic examples of biome assembly in the geological record. A growing body of work suggests that the patterns and drivers of C4 grassland expansion were considerably more complex than originally assumed. Previous research has benefited substantially from dialog between geologists and ecologists, but current research must now integrate fully with phylogenetics. A synthesis of grass evolutionary biology with grassland ecosystem science will further our knowledge of the evolution of traits that promote dominance in grassland systems and will provide a new context in which to evaluate the relative importance of C4 photosynthesis in transforming ecosystems across large regions of Earth.
Abstract:
Background/Objectives: There is strong evidence for the beneficial effects of perioperative nutrition in patients undergoing major surgery. We aimed to evaluate implementation of current guidelines in Switzerland and Austria. Subjects/Methods: A survey was conducted in 173 Swiss and Austrian surgical departments. We inquired about nutritional screening, perioperative nutrition and their estimated clinical significance. Results: The overall response rate was 55%, with 69% (54/78) of departments responding in Switzerland and 44% (42/95) in Austria. Most centres were aware of reduced complications (80%) and shorter hospital stay (59%). However, only 20% of them implemented routine nutritional screening. Non-compliance was attributed to financial (49%) and logistic (33%) restrictions. Screening was mainly performed in the outpatient clinic (52%) or during admission (54%). The nutritional risk score was applied by only 14%; instead, various clinical (78%) and laboratory (56%) parameters were used. The indication for perioperative nutrition was based on preoperative screening in 49%. Although 23% used preoperative nutrition, 68% applied nutritional support both pre- and postoperatively. Preoperative nutritional treatment ranged from 3 days (33%) to 5 days (31%) and even 7 days (20%). Conclusions: Although malnutrition is a well-recognised risk factor for poor postoperative outcome, surgeons remain reluctant to implement routine screening and nutritional support according to evidence-based guidelines.
Abstract:
Introduction. Preoperative malnutrition is a major risk factor for increased postoperative morbidity and mortality. The definition and diagnosis of malnutrition and its treatment are still subjects of controversy. Furthermore, the practical implementation of nutrition-related guidelines is unknown. Methods. A review of the available literature and of current guidelines on perioperative nutrition was conducted. We focused on nutritional screening and perioperative nutrition in patients undergoing digestive surgery, and we assessed the translation of recent guidelines into clinical practice. Results and Conclusions. Malnutrition is a well-recognized risk factor for poor postoperative outcome. The prevalence of malnutrition depends largely on its definition; about 40% of patients undergoing major surgery fulfil current diagnostic criteria of being at nutritional risk. The Nutritional Risk Score is a pragmatic and validated tool to identify patients who should benefit from nutritional support. Adequate nutritional intervention entails fewer (infectious) complications, a shorter hospital stay, and lower costs. Preoperative oral supplementation for a minimum of five days is preferable; depending on the patient and the type of surgery, immune-enhancing formulas are recommended. However, surgeons' compliance with evidence-based guidelines remains poor, and efforts are necessary to implement routine nutritional screening and nutritional support.
Abstract:
Objective: To assess whether screening programs and treatment of preoperative malnutrition have been implemented into surgical practice to decrease morbidity. There is strong evidence that postoperative morbidity can be minimized by identifying and treating patients at nutritional risk early, before major surgery. The validated nutritional risk score (NRS) is recommended by the European Society of Parenteral and Enteral Nutrition for nutritional screening. It remains unclear whether routine preoperative nutritional assessment and perioperative nutrition are widely implemented. Methods: A survey was conducted in 173 Swiss and Austrian surgical departments. Implementation of nutritional screening, perioperative nutrition, and their estimated impact on clinical outcome were assessed. Non-responders were repeatedly contacted by the authors. Results: The overall response rate was 55%, whereby 69% (54/78) of Swiss and 44% (42/95) of Austrian centers responded. Although 80% and 59% of the responding centers are aware of a reduced complication rate and a shortened hospital stay, respectively, only 20% of them implemented routine nutritional screening. Financial (49%) and logistic (33%) restrictions are the predominant reasons against routine clinical use. Screening is mainly performed either in the outpatient clinic (52%) or during admission (54%). The NRS is used by only 14%. Instead, various clinical parameters (78%), e.g. BMI, and laboratory findings (56%), e.g. albumin, are used. The indication for perioperative nutrition is based on preoperative screening in 49%. While 23% use preoperative nutrition, 68% apply nutritional support both pre- and postoperatively. Preoperative nutritional treatment ranged from three days (33%) to five days (31%) and even seven days (20%). Conclusion: Although malnutrition is well recognized as a major risk factor for increased postoperative morbidity, the majority of surgeons are reluctant to implement routine screening and nutritional support. Where nutritional assessment is performed, local institutional screening parameters are still preferred. It remains difficult to overcome traditions and to change surgeons' minds.
Abstract:
Background: Patients undergoing major gastrointestinal surgery are at increased risk of developing complications. The use of immunonutrition (IN) in such patients is not widespread because the available data are heterogeneous, and some show contradictory results with regard to complications, mortality and length of hospital stay. Methods: Randomized controlled trials (RCTs) published between January 1985 and September 2009 that assessed the clinical impact of perioperative enteral IN in major elective gastrointestinal surgery were included in a meta-analysis. Results: Twenty-one RCTs enrolling a total of 2730 patients were included in the meta-analysis. Twelve were considered high-quality studies. The included studies showed significant heterogeneity with respect to patients, control groups, and timing and duration of IN, which limited group analysis. IN significantly reduced overall complications when used before surgery (odds ratio (OR) 0.48, 95 per cent confidence interval (c.i.) 0.34 to 0.69), both before and after operation (OR 0.39, 0.28 to 0.54) or after surgery (OR 0.46, 0.25 to 0.84). For these three timings of IN administration, the ORs of postoperative infection were 0.36 (0.24 to 0.56), 0.41 (0.28 to 0.58) and 0.53 (0.40 to 0.71) respectively. Use of IN led to a shorter hospital stay: mean difference -2.12 (95 per cent c.i. -2.97 to -1.26) days. The beneficial effects of IN were confirmed when low-quality trials were excluded. Perioperative IN had no influence on mortality (OR 0.90, 0.46 to 1.76). Conclusion: Perioperative enteral IN decreases morbidity and hospital stay but not mortality after major gastrointestinal surgery; its routine use can be recommended.
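Pooled odds ratios of the kind reported above are typically obtained by inverse-variance weighting of per-study log odds ratios, often with a random-effects model to absorb between-study heterogeneity. A minimal sketch using the DerSimonian-Laird estimator and three invented trials (not data from this meta-analysis):

```python
# Random-effects (DerSimonian-Laird) pooling of odds ratios from 2x2 tables.
import numpy as np

# Per-study data: (events, total) in the IN group, (events, total) in control
studies = [
    (12, 100, 25, 100),
    (8,  60,  15, 58),
    (20, 150, 33, 145),
]

log_or, var = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    log_or.append(np.log((a * d) / (b * c)))     # log odds ratio
    var.append(1/a + 1/b + 1/c + 1/d)            # its approximate variance
log_or, var = np.array(log_or), np.array(var)

# Fixed-effect weights, then DerSimonian-Laird between-study variance tau^2
w = 1 / var
mu_fe = (w * log_or).sum() / w.sum()
Q = (w * (log_or - mu_fe) ** 2).sum()            # heterogeneity statistic
tau2 = max(0.0, (Q - (len(studies) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

# Random-effects pooled estimate and 95% confidence interval
w_re = 1 / (var + tau2)
mu_re = (w_re * log_or).sum() / w_re.sum()
se = 1 / np.sqrt(w_re.sum())
lo, hi = mu_re - 1.96 * se, mu_re + 1.96 * se
print(f"pooled OR = {np.exp(mu_re):.2f} "
      f"(95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f})")
```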
Abstract:
BACKGROUND: The aim of the current study was to assess whether widely used nutritional parameters are correlated with the nutritional risk score (NRS-2002) in identifying postoperative morbidity, and to evaluate the role of nutritionists in nutritional assessment. METHODS: A randomized trial on preoperative nutritional interventions (NCT00512213) provided the study cohort of 152 patients at nutritional risk (NRS-2002 ≥3) with comprehensive phenotyping, including diverse nutritional parameters (n=17) elaborated by nutritional specialists and potential demographic and surgical confounders (n=5). Risk factors for overall, severe (Dindo-Clavien 3-5) and infectious complications were identified by univariate analysis; parameters with P<0.20 were then entered in a multiple logistic regression model. RESULTS: The final analysis included 140 patients with complete datasets. Of these, 61 patients (43.6%) were overweight, and 72 patients (51.4%) experienced at least one complication of any degree of severity. Univariate analysis identified a correlation between few (≤3) active co-morbidities (OR=4.94; 95% CI: 1.47-16.56, p=0.01) and overall complications. Patients screened as malnourished by nutritional specialists presented fewer overall complications than those not malnourished (OR=0.47; 95% CI: 0.22-0.97, p=0.043). Severe postoperative complications occurred more often in patients with low lean body mass (OR=1.06; 95% CI: 1-1.12, p=0.028). Few (≤3) active co-morbidities (OR=8.8; 95% CI: 1.12-68.99, p=0.008) were related to postoperative infections. Patients screened as malnourished by nutritional specialists presented fewer infectious complications (OR=0.28; 95% CI: 0.1-0.78, p=0.014) than those not malnourished. Multivariate analysis identified few co-morbidities (OR=6.33; 95% CI: 1.75-22.84, p=0.005), low weight loss (OR=1.08; 95% CI: 1.02-1.14, p=0.006) and low hemoglobin concentration (OR=2.84; 95% CI: 1.22-6.59, p=0.021) as independent risk factors for overall postoperative complications. Compliance with nutritional supplements (OR=0.37; 95% CI: 0.14-0.97, p=0.041) and supplementation of malnourished patients as assessed by nutritional specialists (OR=0.24; 95% CI: 0.08-0.69, p=0.009) were independently associated with decreased infectious complications. CONCLUSIONS: Nutritional support based upon NRS-2002 screening might result in overnutrition, with potentially deleterious clinical consequences. We emphasize the importance of a detailed assessment of nutritional status by a dedicated specialist before deciding on early nutritional intervention for patients with an initial NRS-2002 score of ≥3.
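The two-step analysis (univariate screening at p < 0.20, then multiple logistic regression on the retained predictors) is straightforward to reproduce. The sketch below uses synthetic data and illustrative variable names, not the trial's dataset.

```python
# Univariate logistic screening followed by multiple logistic regression.
# All data and predictor names are synthetic stand-ins for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 140
df = pd.DataFrame({
    "few_comorbidities": rng.integers(0, 2, n),
    "weight_loss_pct":   rng.normal(5, 3, n),
    "hemoglobin_low":    rng.integers(0, 2, n),
    "bmi":               rng.normal(26, 4, n),
})
# Plant a known signal so the screening has something to find
logit_true = -1.5 + 1.2 * df["few_comorbidities"] + 0.1 * df["weight_loss_pct"]
df["complication"] = rng.binomial(1, 1 / (1 + np.exp(-logit_true)))

# Step 1: univariate screening, keep predictors with p < 0.20
keep = []
for col in ["few_comorbidities", "weight_loss_pct", "hemoglobin_low", "bmi"]:
    m = sm.Logit(df["complication"], sm.add_constant(df[col])).fit(disp=0)
    if m.pvalues[col] < 0.20:
        keep.append(col)

# Step 2: multiple logistic regression on the retained predictors
final = sm.Logit(df["complication"], sm.add_constant(df[keep])).fit(disp=0)
print(np.exp(final.params))   # odds ratios (const = baseline odds)
print(final.pvalues)
```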