941 results for wildlife damage management
Abstract:
Forest management not only affects biodiversity but may also alter ecosystem processes mediated by organisms, such as herbivory, the removal of plant biomass by plant-eating insects and other arthropod groups. Aiming to reveal general relationships between forest management and herbivory, we investigated aboveground arthropod herbivory in 105 plots dominated by European beech in three regions of Germany, in the sun-exposed canopy of mature beech trees and on beech saplings in the understorey. We separately assessed damage by different guilds of herbivores, i.e. chewing, sucking and scraping herbivores, gall-forming insects and mites, and leaf-mining insects. We asked whether herbivory differs among forest management regimes (unmanaged, uneven-aged managed, even-aged managed) and among age classes within even-aged forests. We further tested for consistency of relationships between regions, strata and herbivore guilds. On average, almost 80% of beech leaves showed herbivory damage, and about 6% of leaf area was consumed. Chewing damage was most common, whereas leaf sucking and scraping damage were very rare. Damage was generally greater in the canopy than in the understorey, in particular for chewing and scraping damage and the occurrence of mines. There was little difference in herbivory among differently managed forests, and the effects of management on damage differed among regions, strata and damage types. Covariates such as wood volume, tree density and plant diversity weakly influenced herbivory, and effects differed between herbivory types. We conclude that, despite the relatively low number of species attacking beech, arthropod herbivory on beech is generally high. We further conclude that responses of herbivory to forest management are multifaceted and that environmental factors such as forest structure variables, which affect microclimatic conditions in particular, are more likely to explain the variability in herbivory among beech forest plots.
Abstract:
Human–wildlife conflict is emerging as an important topic in conservation. Carnivores and birds of prey are responsible for most conflicts with livestock and game, but since the mid-1990s a new conflict has been emerging in south-west Europe: the presumed killing of livestock by griffon vultures Gyps fulvus. A lack of scientific data and magnification of the problem by the media are increasing alarm amongst the public, and political pressures to implement management decisions have not been based on scientific evidence. We compiled information on 1,793 complaints about attacks by griffon vultures on livestock, lodged with Spanish authorities from 1996 to 2010. Spain is home to the majority (95%) of griffon vultures and other scavengers in the European Union. Most of the cases occurred in areas of high livestock density, affected principally sheep (49%) and cows (31%), and were associated with spring birthing times (April–June). On average, 69% of the complaints made annually were rejected because of a lack of evidence about whether the animal was alive before being eaten. The total economic cost of compensation was EUR 278,590 from 2004 to 2010. We discuss possible ways to mitigate this emerging human–wildlife conflict. These need to include the participation of livestock farmers, authorities, scientists and conservation groups.
Abstract:
INTRODUCTION: To present the accuracy of reduction, complications and results two years after open reduction and internal fixation of displaced acetabular fractures involving the anterior column (AC) through the Pararectus approach. Frequencies of conversion to total hip replacement in the early follow-up, the clinical outcome in preserved hips, and the need for an extension of the approach (1st window of the ilioinguinal approach) are compared with the literature on the modified Stoppa approach. METHODS: Forty-eight patients (mean age 62 years, range: 16–98; 41 male) with displaced acetabular fractures involving the AC (AC: n = 9; transverse fracture: n = 2; AC and hemitransverse: n = 24; both-column: n = 13) were treated between 12/2009 and 12/2011 using the Pararectus approach. Surgical data and accuracy of reduction (using computed tomography) were assessed. Patients were routinely followed up at eight weeks and at 6, 12 and 24 months postoperatively. Failure was defined as the need for total hip arthroplasty. Twenty-four months postoperatively the outcome was rated according to Matta. RESULTS: In four patients there were four intraoperative complications (minor vascular damage in two, small perforations of the peritoneum in two), which were managed intraoperatively. Fracture reduction showed statistically significant decreases (mean ± SD, pre- vs. postoperative, in mm) in "step-offs" (2.6 ± 1.9 vs. 0.1 ± 0.3, p < 0.001) and "gaps" (11.2 ± 6.8 vs. 0.7 ± 0.9, p < 0.001). Accuracy of reduction was "anatomical" in 45 patients and "imperfect" in three. Five (13%) of 38 available patients required a total hip arthroplasty. Of 33 patients with a preserved hip, the clinical outcome was graded as "excellent" in 13 and "good" in 20; radiographically, 27 were graded as "excellent", four as "good" and two as "fair". An extension of the approach was infrequently needed (1st window of the ilioinguinal approach in 2%, mini-incision at the iliac crest in 21%). CONCLUSION: In the treatment of acetabular fractures involving the anterior column, the Pararectus approach allowed anatomical restoration with minimal access morbidity. Results obtained with the Pararectus approach after two years at least parallel those reported after use of the modified Stoppa approach. In contrast to the modified Stoppa approach, a relevant extension of the Pararectus approach was rarely necessary.
Abstract:
Most forests are exposed to anthropogenic management activities that affect tree species composition and natural ecosystem processes. Changes in ecosystem processes such as herbivory depend on management intensity and on regional environmental conditions and species pools. Whereas the influence of specific forest management measures has already been addressed for different herbivore taxa at a local scale, studies considering the effects of different aspects of forest management across regions are rare. We assessed the influence of tree species composition and intensity of harvesting activities on arthropod herbivores and herbivore-related damage to beech trees, Fagus sylvatica, in 48 forest plots in three regions of Germany. We found that herbivore abundance and damage to beech trees differed between regions and that, despite these regional differences, the density of tree-associated arthropod taxa and herbivore damage were consistently affected by tree species composition and harvest intensity. Specifically, overall herbivore damage to beech trees increased with increasing dominance of beech trees, suggesting the action of associational resistance processes, and decreased with harvest intensity. The density of leaf chewers and mines was positively related to leaf damage, and several arthropod groups responded to beech dominance and harvest intensity. The distribution of damage patterns was consistent with a vertical shift of herbivores to higher crown layers during the season and with higher beech dominance. By linking quantitative data on arthropod herbivore abundance and herbivory with tree species composition and harvesting activity in a wide variety of beech forests, our study helps to better understand the influence of forest management on interactions between a naturally dominant deciduous forest tree and arthropod herbivores.
Abstract:
The approach presented here describes a model for a rule-based expert system that calculates the temporal variability of wet snow avalanche release, under the assumption of avalanche triggering without loading by new snow. The knowledge base of the model was created from investigations of the system behaviour of wet snow avalanches in the Italian Ortles Alps and is represented by a fuzzy-logic rule base. Input parameters of the expert system are numerical and linguistic variables: measurable meteorological and topographical factors and observable characteristics of the snow cover. The output of the inference method is the quantified release disposition for wet snow avalanches. By combining topographical parameters with the spatial interpolation of the calculated release disposition, a hazard index map is dynamically generated. Furthermore, the spatial and temporal variability of the damage potential on roads exposed to wet snow avalanches can be quantified, expressed as the number of persons at risk. The application of the rule base to the available data in the study area generated plausible results. The study demonstrates the potential of expert systems and fuzzy logic in the field of natural hazard monitoring and risk management.
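A minimal sketch of the kind of fuzzy rule-based inference such an expert system relies on is given below. The input variables (air temperature, snowpack wetness), the membership functions and the rule weights are illustrative assumptions, not the rule base calibrated for the Ortles Alps study area.

```python
# Minimal Mamdani-style fuzzy inference sketch for a release-disposition index.
# Inputs, membership functions and rule weights are illustrative assumptions,
# not the rule base used in the study.

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def release_disposition(air_temp_c, snow_wetness):
    """Return a disposition index in [0, 1] from two fuzzified inputs."""
    # Fuzzify the inputs (membership grades for 'warm' and 'wet').
    warm = tri(air_temp_c, -2.0, 4.0, 10.0)
    wet = tri(snow_wetness, 0.2, 0.6, 1.0)

    # Rule base: each rule maps an antecedent strength to an output level.
    rules = [
        (min(warm, wet), 0.9),         # warm AND wet  -> high disposition
        (min(warm, 1 - wet), 0.5),     # warm AND dry  -> medium
        (min(1 - warm, wet), 0.4),     # cold AND wet  -> medium-low
        (min(1 - warm, 1 - wet), 0.1), # cold AND dry  -> low
    ]

    # Defuzzify with a weighted average of the singleton output levels.
    total = sum(w for w, _ in rules)
    return sum(w * level for w, level in rules) / total if total else 0.0

print(release_disposition(air_temp_c=6.0, snow_wetness=0.7))
```

The weighted-average defuzzification keeps the sketch short; the actual expert system combines many more meteorological, topographical and snow-cover variables.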
Abstract:
When determining risk related to natural hazard processes, many studies neglect investigation of the damage potential or are limited to the assessment of immobile values such as buildings. However, persons as well as mobile values form an essential part of the damage potential. Knowledge of the maximum number of exposed persons in an endangered area is of great importance for elaborating evacuation plans and immediate measures in case of catastrophe. In addition, motor vehicles can also be severely damaged, as has been shown by the analysis of avalanche events. With the timely removal of mobile values as a preventive measure, this kind of damage can be minimised. This study presents a method for recording the maximum number of exposed persons and for the monetary assessment of motor vehicles in the municipality of Galtür (Tyrol, Austria). Moreover, general developments of the damage potential due to significant socio-economic changes since the mid-twentieth century are pointed out for the study area. The present situation of the maximum number of persons and mobile values in the official avalanche hazard zones of the municipality is described in detail. Information on the number of persons is derived from census data, tourism and employment statistics. During the winter months, a significant increase in the number of persons, overlaid by strong short-term fluctuations, can be noted. These changes result from a higher demand for tourism-related manpower as well as from varying occupancy rates. The number of motor vehicles in endangered areas is closely associated with the number of exposed persons. The potential number of motor vehicles is investigated by means of mapping, statistics on the stock of motor vehicles and the density distribution. Diurnal and seasonal fluctuations of the investigated damage potential are pointed out. Recording the number of persons and mobile values in endangered areas is vital for any disaster management.
Abstract:
The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, and with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted by modelling the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yielded a value for the fatality risk within the observed time frame for the studied road segment. The long-term fatality risk due to snow avalanches, as well as the short-term fatality risk, was compared with the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analysis of short-term avalanche-induced fatality risk revealed risk peaks that were 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the sum of precipitation within three days is a simplified model; thus, further research is needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may contribute a conceptual step towards risk-based decision-making in risk management.
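The scenario-based risk idea described above can be sketched as the product of an avalanche occurrence probability, the number of persons exposed on the road segment, and a lethality factor. The functional forms and parameter values below are illustrative assumptions, not the statistical relationships calibrated in the study.

```python
# Back-of-the-envelope sketch of the short-term fatality-risk idea: an
# occurrence probability derived from 3-day new-snow height is combined with
# the number of persons exposed on the road segment. The probability model,
# lethality factor and parameter values are illustrative assumptions only.
import math

def avalanche_probability(new_snow_3d_cm, scale_cm=60.0):
    """Toy occurrence probability increasing with 3-day new-snow height."""
    return 1.0 - math.exp(-new_snow_3d_cm / scale_cm)

def persons_at_risk(traffic_per_hour, segment_length_km, speed_kmh, occupancy=1.6):
    """Mean number of persons on the endangered segment at any instant."""
    vehicles_on_segment = traffic_per_hour * segment_length_km / speed_kmh
    return vehicles_on_segment * occupancy

def short_term_fatality_risk(new_snow_3d_cm, traffic_per_hour,
                             segment_length_km=0.5, speed_kmh=60.0,
                             lethality=0.2):
    """Expected fatalities for the scenario (occurrence x exposure x lethality)."""
    return (avalanche_probability(new_snow_3d_cm)
            * persons_at_risk(traffic_per_hour, segment_length_km, speed_kmh)
            * lethality)

# A high hazard level combined with a high traffic density produces a risk peak.
print(short_term_fatality_risk(new_snow_3d_cm=80, traffic_per_hour=400))
```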
Abstract:
Keel bone damage (KBD) is a critical issue facing the laying hen industry today, as a result of the likely pain leading to compromised welfare and the potential for reduced productivity. Recent reports suggest that damage, while highly variable and likely dependent on a host of factors, extends to all systems (including battery cages, furnished cages, and non-cage systems), genetic lines, and management styles. Despite the extent of the problem, the research community remains uncertain as to the causes and influencing factors of KBD. Although progress has been made investigating these factors, the overall effort is hindered by several issues related to the assessment of KBD, including the quality of and variation in the methods used between research groups. These issues prevent effective comparison of studies and create difficulties in identifying the presence of damage, leading to poor accuracy and reliability. The current manuscript seeks to resolve these issues by offering precise definitions for types of KBD, reviewing methods for assessment, and providing recommendations that can improve the accuracy and reliability of those assessments.
Abstract:
Keel bone damage (KBD) is a critical issue facing the contemporary laying hen industry due to the likely pain leading to compromised welfare and reduced productivity. Recent reports suggest that KBD, while highly variable and likely dependent on a host of factors, extends to all housing systems (including traditional battery cages, furnished cages and non-cage systems), genetic lines, and management styles. Despite the extent of the problem, the research community remains uncertain as to the causes and influencing factors of KBD. To combat these issues, the current review was produced following discussions from the 1st International Keel Bone Damage Workshop held in Switzerland in April 2014. This exercise sought to assess current knowledge, foster novel collaborations, propose unique methodologies and highlight the key areas where innovative research is needed. The following paper is based on the content of those discussions and presents nine recommendations for future research efforts.
Abstract:
In 2000, according to the World Health Organization, at least 171 million people, 2.8% of the population worldwide, suffered from diabetes. The Centers for Disease Control and Prevention has defined it as an epidemic disease. Its incidence is increasing rapidly, and it is estimated that by 2030 this number will almost double. Diabetes mellitus occurs throughout the world but is more common (especially type 2) in the more developed countries. Diabetes is a chronic condition that occurs when the pancreas does not secrete enough insulin or when the body does not effectively use the insulin produced. Insulin is a hormone that regulates blood sugar. The effect of uncontrolled diabetes is hyperglycaemia (raised blood sugar), which eventually causes serious damage to many organs and systems, especially the nerves and blood vessels. Type 2 diabetes (the most common type) is strongly associated with older age and with obesity or overweight. Promoting a healthy lifestyle helps patients to improve their quality of life and, in many cases, to avoid complications related to the disease. This paper describes an iPhone-based application for the self-management of type 2 diabetic patients, which allows them to improve their lifestyle through healthy diet, physical activity and education.
Abstract:
After the 2010 Haiti earthquake, which hit Port-au-Prince, the capital city of Haiti, a multidisciplinary working group of specialists (seismologists, geologists, engineers and architects) from different Spanish universities and from Haiti joined efforts under the SISMO-HAITI project (financed by the Universidad Politecnica de Madrid), with one objective: the evaluation of seismic hazard and risk in Haiti and its application to seismic design, urban planning, emergency and resource management. In this paper, as a first step towards structural damage estimation for future earthquakes in the country, a calibration of damage functions has been carried out by means of a two-stage procedure. After compiling a database of damage observed in the city after the earthquake, the exposure model (building stock) was classified and, through an iterative two-step calibration process, a specific set of damage functions for the country is proposed. Additionally, Next Generation Attenuation (NGA) models and Vs30 models have been analysed to choose the most appropriate for seismic risk estimation in the city. Finally, in a subsequent paper, these functions will be used to estimate a seismic risk scenario for a future earthquake.
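As a hedged illustration of what calibrating a damage function against observed damage can look like, the sketch below fits a lognormal function to hypothetical post-event observations by a simple grid search. The functional form, the intensity measure and the data points are assumptions made for illustration only, not the SISMO-HAITI building-stock classes or the two-step procedure used in the paper.

```python
# Illustrative calibration sketch: adjust the parameters of a lognormal damage
# function so that predicted damage matches hypothetical post-event observations.
# Not the SISMO-HAITI data or procedure.
import math

def lognormal_cdf(x, median, beta):
    """Probability of exceeding the damage state at ground-motion intensity x."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

def calibrate(observations, medians, betas):
    """Grid-search the (median, beta) pair minimising the squared misfit."""
    best, best_err = None, float("inf")
    for m in medians:
        for b in betas:
            err = sum((lognormal_cdf(pga, m, b) - ratio) ** 2
                      for pga, ratio in observations)
            if err < best_err:
                best, best_err = (m, b), err
    return best

# (intensity in g, observed fraction of buildings exceeding the damage state)
obs = [(0.2, 0.15), (0.3, 0.35), (0.4, 0.55), (0.5, 0.70)]
medians = [0.25 + 0.05 * i for i in range(10)]  # candidate medians, 0.25 to 0.70 g
betas = [0.3 + 0.1 * i for i in range(6)]       # candidate dispersions, 0.3 to 0.8
print(calibrate(obs, medians, betas))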
Abstract:
SRAM-based Field-Programmable Gate Arrays (FPGAs) are built on Static RAM (SRAM) configuration memory. They present a number of features that make them very convenient for building complex embedded systems. First of all, they benefit from low Non-Recurrent Engineering (NRE) costs, as the logic and routing elements are pre-implemented (the user design defines their connections). Also, as opposed to other FPGA technologies, they can be reconfigured (even in the field) an unlimited number of times. Moreover, Xilinx SRAM-based FPGAs feature Dynamic Partial Reconfiguration (DPR), which allows the FPGA to be partially reconfigured without disrupting the application. Finally, they feature high logic density, high processing capability and a rich set of hard macros. However, one limitation of this technology is its susceptibility to ionizing radiation, which increases with technology scaling (smaller geometries, lower voltages and higher frequencies). This is a first-order concern for applications in harsh radiation environments and with high dependability requirements. Ionizing radiation leads to long-term degradation as well as instantaneous faults, which can in turn be reversible or produce irreversible damage. In SRAM-based FPGAs, radiation-induced faults can appear at two architectural layers, which are physically overlaid on the silicon die. The Application Layer (or A-Layer) contains the user-defined hardware, and the Configuration Layer (or C-Layer) contains the (volatile) configuration memory and its support circuitry. Faults at either layer can imply a system failure, which may be more or less tolerable depending on the dependability requirements. In the general case, such faults must be managed in some way. This thesis is about managing SRAM-based FPGA faults at system level, in the context of autonomous and dependable embedded systems operating in a radiative environment. The focus is mainly on space applications, but the same principles can be applied to ground applications; the main differences between them are the radiation level and the possibility for maintenance. The different techniques for A-Layer and C-Layer fault management are classified and their implications for system dependability are assessed. Several architectures are proposed, both for single-layer and dual-layer Fault Managers. For the latter, a novel, flexible and versatile architecture is proposed: it manages both layers concurrently in a coordinated way and allows balancing redundancy level and dependability. For the purpose of validating dynamic fault management techniques, two different solutions were developed. The first is a simulation framework for C-Layer Fault Managers, based on SystemC as the modeling language and event-driven simulator. This framework and its associated methodology allow exploring the Fault Manager design space, decoupling its design from the development of the target FPGA. The framework includes models for both the FPGA C-Layer and the Fault Manager, which can interact at different abstraction levels (at configuration frame level and at JTAG or SelectMAP physical level). The framework is configurable, scalable and versatile, and includes fault injection capabilities. Simulation results for some scenarios are presented and discussed. The second is a validation platform for Xilinx Virtex FPGA Fault Managers. The hardware platform hosts three Xilinx Virtex-4 FX12 FPGA Modules and two general-purpose 32-bit Microcontroller Unit (MCU) Modules. The MCU Modules allow prototyping software-based C-Layer and A-Layer Fault Managers. Each FPGA Module implements one A-Layer Ethernet link (through an Ethernet switch) with one of the MCU Modules, and one C-Layer JTAG link with the other. In addition, both MCU Modules exchange commands and data over an internal UART link. As in the simulation framework, fault injection capabilities are implemented. Test results for some scenarios are also presented and discussed. In summary, this thesis covers the whole process from describing the problem of radiation-induced faults in SRAM-based FPGAs, through identifying and classifying fault management techniques and proposing Fault Manager architectures, to finally validating them by simulation and test. The proposed future work is mainly related to the implementation of radiation-hardened System Fault Managers.
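As a conceptual illustration of C-Layer fault handling, the sketch below models the configuration memory as a set of frames, injects random single-bit upsets, and repairs them by readback scrubbing against a golden copy. It is a plain-Python toy under those assumptions, not the SystemC framework, the Fault Manager architectures, or the Xilinx configuration interfaces described in the thesis.

```python
# Toy model of C-Layer fault handling by readback scrubbing: random bit upsets
# are injected into a frame-based configuration memory and a fault manager
# detects and repairs them against a golden reference. Illustrative only.
import random

FRAMES, BITS_PER_FRAME = 16, 32

def make_config():
    """Golden configuration: one integer bitmask per frame."""
    return [random.getrandbits(BITS_PER_FRAME) for _ in range(FRAMES)]

def inject_upset(config):
    """Flip one random configuration bit (a single-event upset)."""
    frame = random.randrange(FRAMES)
    config[frame] ^= 1 << random.randrange(BITS_PER_FRAME)
    return frame

def scrub(config, golden):
    """Read back every frame, compare with the golden copy, rewrite on mismatch."""
    repaired = []
    for i, (live, ref) in enumerate(zip(config, golden)):
        if live != ref:
            config[i] = ref
            repaired.append(i)
    return repaired

golden = make_config()
live = list(golden)
hit = inject_upset(live)
print("upset in frame", hit, "- repaired frames:", scrub(live, golden))
```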
Abstract:
The proposed Endangered Species Act listing of the gopher tortoise has the potential to impact the military mission at installations in the southeastern United States. Candidate Conservation Agreements with the U.S. Fish and Wildlife Service could be a tool to promote conservation and potentially preclude listing. This project identified military activities that could be affected and determined that military natural resources managers are unsure if such an agreement would prevent impacts to the military mission or impose the same restrictions as federal listing. This project found that if a gopher tortoise Candidate Conservation Agreement can be developed such that it benefits the species as well as the military, it should be used as a model for other species.