988 results for soil data requirements


Relevance:

80.00%

Publisher:

Abstract:

RESUMEN Support for species selection in vegetation restoration in Spain over the last 40 years has relied mainly on species distribution models, also called ecological niche models, which estimate the probability of a species' presence as a function of physical environmental conditions (climate, soil, etc.). This thesis seeks to improve the predictive performance of such models by introducing methodological proposals adapted to the data currently available in Spain and focused on the use of the models for species selection. Data at a spatial resolution suited to the scale of vegetation restoration projects are not always available. However, low-spatial-resolution data do exist for almost every plant species present in Spain. A recalibration method is proposed that updates a low-spatial-resolution logistic regression model with a new high-spatial-resolution sample. The method yields predictions of acceptable quality from relatively small samples (25 presences of the species), compared with the much larger samples (more than 100 presences) required by a conventional modelling strategy that does not use the previous model. The choice of statistical method can decisively affect the predictive performance of the models, which is why method comparisons have received much attention in the last decade. Previous studies regarded logistic regression as inferior to more modern techniques such as maximum entropy. The results of this thesis show that the observed difference arises because maximum entropy models include regularization techniques and the version of logistic regression used in the comparisons did not.
Once regularization is incorporated into logistic regression through penalization, the differences in predictive performance disappear. Penalized logistic regression is therefore a further option for fitting species distribution models, on a par with the best-performing modern methods such as maximum entropy. Species distribution models often omit soil variables because direct measurements of soil physical or chemical properties are rarely available. Incorporating low-spatial-resolution data from national or continental soil maps could be an alternative. The results of this thesis suggest that high-spatial-resolution species distribution models achieve a slight but statistically significant improvement in predictive performance when soil variables derived from low-resolution maps are added. Validation is one of the fundamental stages in the development of any empirical model, including species distribution models. Models are usually validated by evaluating predictive performance species by species, that is, by comparing the observed presence or absence of a species at a set of sites with the model's predictions. This kind of evaluation does not answer a key question in vegetation restoration: which are the n most suitable species for the site to be restored? An evaluation method adapted to this question is proposed, which estimates the ability of a set of models to discriminate between the species present at, and absent from, a given site. The method has been successfully applied to the validation of 188 distribution models of woody species aimed at species selection for vegetation restoration in Spain.
The proposed methodological improvements enhance the predictive performance of species distribution models applied to species selection in vegetation restoration, and extend the number of species for which a decision-support model can be provided. SUMMARY During the last 40 years, decision support tools for plant species selection in ecological restoration in Spain have been based on species distribution models (also called ecological niche models), which estimate the probability of occurrence of a species as a function of environmental predictors (e.g., climate, soil). In this Thesis some methodological improvements are proposed to contribute to a better predictive performance of such models, given the data currently available in Spain and focusing on the application of the models to the selection of species for ecological restoration. Fine-grained species distribution data are required to train models for use at the scale of ecological restoration projects, but such data are not always available for every species. On the other hand, coarse-grained data are available for almost every species in Spain. A recalibration method is proposed that updates a coarse-grained logistic regression model using a new fine-grained updating sample. The method achieves acceptable predictive performance with a reasonably small updating sample (25 occurrences of the species), in contrast with the much larger samples (more than 100 occurrences) required by a conventional modelling approach that discards the coarse-grained data. The choice of statistical method may have a dramatic effect on model performance, so comparisons of methods have received much interest in the last decade. Previous studies have shown a poorer performance of logistic regression compared to novel methods such as maximum entropy models.
The results of this Thesis show that the observed difference is caused by the fact that maximum entropy models include regularization techniques and the versions of logistic regression compared with them do not. Once regularization has been added to logistic regression through a penalization procedure, the differences in model performance disappear. Therefore, penalized logistic regression may be considered one of the best-performing methods for modelling species distributions. Usually, species distribution models do not consider soil-related predictors because direct measurements of chemical or physical soil properties are often lacking. The inclusion of coarse-grained soil data from national or continental soil maps could be a reasonable alternative. The results of this Thesis suggest that model performance increases slightly after including soil predictors from coarse-grained soil maps. Model validation is a key stage of the development of empirical models such as species distribution models. The usual way of validating is based on evaluating model performance for each species separately, i.e., comparing observed species presence or absence to predicted probabilities at a set of sites. This kind of evaluation is not informative for a common question in ecological restoration projects: which n species are the most suitable for the environment of the site to be restored? A method has been proposed to address this question that estimates the ability of a set of models to discriminate among present and absent species at an evaluation site. The method has been successfully applied to the validation of 188 species distribution models used to support decisions on species selection for ecological restoration in Spain. The proposed methodological approaches improve the predictive performance of the models applied to species selection in ecological restoration and increase the number of species for which a decision-support model can be fitted.
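The role of regularization described above can be illustrated with a minimal sketch of L2-penalized logistic regression fitted by gradient descent. The data, learning rate, and penalty values below are invented for illustration and are not taken from the thesis:

```python
import math

def fit_penalized_logreg(xs, ys, lam=0.01, lr=0.1, epochs=2000):
    """Fit P(presence | x) = sigmoid(w*x + b) by gradient descent,
    minimizing mean log-loss plus an L2 penalty (lam/2) * w**2."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * (gw / n + lam * w)  # the penalty shrinks w toward 0
        b -= lr * (gb / n)            # intercept left unpenalized
    return w, b

def predict(w, b, x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# Hypothetical 1-D predictor (e.g. a climate gradient): absences at
# low values, presences at high values.
xs = [-3.0, -2.5, -2.0, 2.0, 2.5, 3.0]
ys = [0, 0, 0, 1, 1, 1]
w_light, b_light = fit_penalized_logreg(xs, ys, lam=0.01)
w_heavy, b_heavy = fit_penalized_logreg(xs, ys, lam=5.0)
```

A heavier penalty shrinks the slope, trading fit on the training sample for generalization, which is the same mechanism maximum entropy models apply internally through their regularization terms.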


This work falls within the INTEGRATE and EURECA projects, whose objective is the development of a semantic interoperability layer enabling the integration of clinical care and research data, providing a common platform that can be deployed at different clinical institutions and that facilitates the exchange of information among them. In this way, the improvement of clinical practice is promoted through cooperation between research institutions with shared goals. The projects make use of existing clinical standards and vocabularies, such as HL7 and SNOMED, adapting them to the particular needs of the data handled in INTEGRATE and EURECA. Clinical data are represented so that each concept used is unique, avoiding ambiguity and supporting the idea of a common platform. The student has been part of a team in the Biomedical Informatics Group at UPM, which in turn participates as a partner in the European projects named above. The tool developed aims to homogenize the information stored in the projects' databases using the normalization mechanisms provided by the SNOMED CT medical vocabulary. The normalized databases will be the ones queried through services provided in the interoperability layer, since they contain more precise and complete information than the non-normalized databases. The work was carried out between 12 September 2014, when the training and information-gathering stage began, and 5 January 2015, when the writing of the report was completed. The life cycle followed was the waterfall model, in which a task does not begin until the immediately preceding stage has been completed and validated.
However, not all tasks followed this model, since the project report was written in parallel with the remaining tasks. The total number of hours devoted to this final-year project is 324. The tasks carried out, and the time devoted to each, are detailed below:
- Training. Gathering and studying the information needed to implement the tool [30 hours].
- Requirements specification. Documenting the requirements the tool must meet [20 hours].
- Design. Design decisions for the tool [35 hours].
- Implementation. Development of the tool's code [80 hours].
- Testing. Validation of the tool, both standalone and integrated into the INTEGRATE and EURECA projects [70 hours].
- Debugging. Error correction and improvements to the tool [45 hours].
- Report writing. Writing of the final project report [44 hours].
---ABSTRACT---
This project belongs to the semantic interoperability layer developed in the European projects INTEGRATE and EURECA, which aims to provide a platform to promote the interchange of medical information from clinical trials to clinical institutions. Thus, research institutions may cooperate to enhance clinical practice. Different health standards and clinical terminologies have been used in both the INTEGRATE and EURECA projects, e.g. HL7 and SNOMED CT. These tools have been adapted to the projects' data requirements. Clinical data are represented by unique concepts, avoiding ambiguity problems. The student has been working in the Biomedical Informatics Group at UPM, a partner of the INTEGRATE and EURECA projects. The tool developed aims to perform homogenization tasks over the information stored in the databases of the projects, through the normalized representation provided by the SNOMED CT terminology.
The data queries are executed against the normalized version of the databases, since the information retrieved is more complete than from the non-normalized databases. The project ran from September 12th, 2014, when the initiation stage began, to January 5th, 2015, when the final report was finished. The waterfall model for software development was followed during the working process: a phase may not start before the previous one has finished and been validated, except for the final report, which was written in parallel with the other phases. The tasks carried out, and the time devoted to each, are detailed as follows:
- Training. Gathering the information necessary to develop the tool [30 hours].
- Software requirements specification. Requirements the tool must fulfil [20 hours].
- Design. Decisions on the design of the tool [35 hours].
- Implementation. Tool development [80 hours].
- Testing. Tool evaluation within the framework of the INTEGRATE and EURECA projects [70 hours].
- Debugging. Error correction and efficiency improvements [45 hours].
- Documentation. Preparation of the final report [44 hours].
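The homogenization step described above is, at its core, a mapping from locally recorded terms to unique SNOMED CT concepts. The sketch below is a much-simplified, hypothetical stand-in for the tool: the local terms and the in-memory map are invented for illustration, and a real deployment would resolve terms against the SNOMED CT release files or a terminology server rather than a hand-written dictionary:

```python
# Hypothetical local-term -> SNOMED CT concept map. The mappings are
# illustrative only; real ones come from curated terminology resources.
LOCAL_TO_SNOMED = {
    "hta": ("38341003", "Hypertensive disorder"),
    "dm tipo 2": ("44054006", "Diabetes mellitus type 2"),
    "iam": ("22298006", "Myocardial infarction"),
}

def normalize_term(raw_term):
    """Return (concept_id, preferred_term) for a free-text local term,
    or None when no mapping exists (a candidate for manual curation)."""
    key = " ".join(raw_term.lower().split())  # trim and collapse spaces
    return LOCAL_TO_SNOMED.get(key)
```

Representing every record through a single concept identifier is what makes later cross-institution queries unambiguous: two databases that used different local abbreviations end up storing the same code.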


This study aims to assess the performance of multi-layer urban canopy parameterizations implemented in the mesoscale WRF model, in order to understand their potential to improve the description of energy fluxes and wind fields over the city of Madrid. It was found that the Building Energy Model (BEP+BEM) parameterization yielded better results than the standard bulk scheme implemented in the Noah LSM, but very close to those of the Building Effect Parameterization (BEP). The latter was deemed the best option, since its data requirements and CPU time were smaller. Two annual runs were made to feed the CMAQ chemical-transport model and assess the impact of this choice on routine air quality modelling activities.
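For reference, the urban schemes being compared are selected in WRF through the `sf_urban_physics` switch in `namelist.input`. The fragment below is a generic illustration of that switch (one value per nested domain), not the study's actual configuration:

```
&physics
 sf_urban_physics = 2, 2, 2,   ! 0 = bulk (Noah LSM only), 1 = single-layer UCM,
                               ! 2 = BEP, 3 = BEP+BEM; one entry per domain
/
```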


The location and density of biologically useful energy sources on Mars will limit the biomass, spatial distribution, and organism size of any biota. Subsurface Martian organisms could be supplied with a large energy flux from the oxidation of photochemically produced atmospheric H2 and CO diffusing into the regolith. However, surface abundance measurements of these gases demonstrate that no more than a few percent of this available flux is actually being consumed, suggesting that biological activity driven by atmospheric H2 and CO is limited in the top few hundred meters of the subsurface. This is significant because the available but unused energy is extremely large: for organisms at 30-m depth, it is 2,000 times previous estimates of hydrothermal and chemical weathering energy and far exceeds the energy derivable from other atmospheric gases. This also implies that the apparent scarcity of life on Mars is not attributable to lack of energy. Instead, the availability of liquid water may be a more important factor limiting biological activity because the photochemical energy flux can only penetrate to 100- to 1,000-m depth, where most H2O is probably frozen. Because both atmospheric and Viking lander soil data provide little evidence for biological activity, the detection of short-lived trace gases will probably be a better indicator of any extant Martian life.


Introduction. This Policy Brief follows up on the DIA-CORE Policy Brief on “Assessing costs and benefits of deploying renewables”, dated 26 September 2014, which highlighted the complexities of making a comprehensive and appropriate assessment of the costs and benefits of an increased use of renewable energy sources (RES). It distinguished system-related, distributional and macro-economic effects, and looked at the related data requirements, which need to be comprehensive and standardised. This DIA-CORE Policy Brief uses the tools proposed in the previous Policy Brief to estimate the effects on Member States of reaching the EU-wide RES target of 27% of the EU’s energy consumption by 2030. This allows some conclusions to be drawn on the differentiated impacts across Member States and the potential implications for an effort-sharing approach. It also assesses whether a higher ambition level could be beneficial. The paper further takes into account the implications of national policy frameworks and highlights the importance of reforms to reduce the costs of RES adoption.


Systems biology is based on computational modelling and simulation of large networks of interacting components. Models may be intended to capture processes, mechanisms, components and interactions at different levels of fidelity. Input data are often large and geographically dispersed, and may require the computation to be moved to the data, not vice versa. In addition, complex system-level problems require collaboration across institutions and disciplines. Grid computing can offer robust, scalable solutions for distributed data, computation and expertise. We illustrate the range of computational and data requirements in systems biology with three case studies: one requiring large computation but small data (orthologue mapping in comparative genomics), a second involving complex terabyte-scale data (the Visible Cell project) and a third that is both computationally and data-intensive (simulations at multiple temporal and spatial scales). Authentication, authorisation and audit systems do not currently scale well and may present bottlenecks for distributed collaboration, particularly where outcomes may be commercialised. Challenges remain in providing lightweight standards to facilitate the penetration of robust, scalable grid-type computing into diverse user communities to meet the evolving demands of systems biology.


INTAMAP is a web processing service for the automatic interpolation of measured point data. The requirements were (i) using open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an open source solution. The system couples the 52°North web processing service, accepting data in the form of an Observations and Measurements (O&M) document, with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a new markup language for encoding uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropies and extreme values. In the light of the INTAMAP experience, we discuss the lessons learnt.
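To show the basic shape of the problem the service solves, estimating a value at an unsampled location from surrounding observations, here is a deliberately simple interpolator. Inverse distance weighting is not INTAMAP's method (which is geostatistical and runs in R) and yields no uncertainty estimate; it is only a sketch of spatial interpolation at its most minimal:

```python
def idw(points, target, power=2.0):
    """Inverse-distance-weighted estimate at `target`.
    points: list of ((x, y), value) observations."""
    num = den = 0.0
    for (x, y), v in points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v  # target coincides with an observation
        w = d2 ** (-power / 2.0)  # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den
```

INTAMAP instead fits a geostatistical model automatically, which is what allows it to attach an error distribution (encoded in UncertML) to each prediction rather than a single point estimate.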


The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation, the only technique able to accurately replicate the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predict system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and both speed and accuracy of evaluation. To provide an effective control mechanism, a hierarchical or multi-level modelling procedure has therefore been developed, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick and easy, and allows models to expand in line with design developments. However, current approaches to computer simulation are wholly inadequate to support such hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterised by a requirement for very specialist expertise, a lengthy model development phase and correspondingly high expenditure, resulting in very little, and rather inappropriate, use of the technique. Simulation, when used, is generally applied only to check or verify a final design proposal; rarely is its full potential utilised to aid, support or complement the manufacturing system design procedure.
To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted, as such systems require no specialist expertise and instead facilitate quick and easy model creation, execution and modification through simple data inputs. Previous generic simulators have tended to be too restricted, lacking the flexibility needed to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.


Soil erosion is one of the most pressing issues facing developing countries. The need for soil erosion assessment is paramount, as a successful and productive agricultural base is necessary for economic growth and stability. In Ghana, a country with an expanding population and high potential for economic growth, agriculture is an important resource; however, most crop production is restricted to low-technology shifting-cultivation agriculture. The high-intensity seasonal rainfall coincides with the early growing period of many of the crops, meaning that plots are very susceptible to erosion, especially on steep-sided valleys in the region south of Lake Volta. This research investigated the processes of soil erosion by rainfall with the aim of producing a sediment yield model for a small semi-agricultural catchment in rural Ghana. Various types of modelling techniques were considered to discover those most applicable to the sub-tropical environment of Southern Ghana. Once an appropriate model had been developed and calibrated, the aim was to look at how to scale the model up, using sub-catchments, to calculate sedimentation rates of Lake Volta. An experimental catchment was located in Ghana, south-west of Lake Volta, where data on rainstorms and the associated streamflow, sediment loads and soils (moisture content, classification and particle size distribution) were collected to calibrate the model. Additional data were obtained from the Soil Research Institute in Ghana to explore calibration of the Universal Soil Loss Equation (USLE; Wischmeier and Smith, 1978) for Ghanaian soils and environment. It was shown that the USLE could be successfully converted to provide meaningful soil loss estimates in the Ghanaian environment. However, due to experimental difficulties, the proposed theory and methodology of the sediment yield model could only be tested in principle.
Future work may include validation of the model and subsequent scaling up to estimate sedimentation rates in Lake Volta.
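The USLE referred to above is a simple multiplicative model, A = R · K · LS · C · P: soil loss per unit area as the product of rainfall erosivity, soil erodibility, combined slope length and steepness, cover, and support-practice factors. A direct transcription follows; the factor values are invented for illustration only (deriving calibrated Ghanaian values is precisely what the thesis addresses):

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation (Wischmeier and Smith, 1978):
      A = R * K * LS * C * P
    R  - rainfall-runoff erosivity factor
    K  - soil erodibility factor
    LS - combined slope length and steepness factor
    C  - cover and management factor
    P  - support practice factor
    Units of A follow the units chosen for R and K."""
    return R * K * LS * C * P

# Illustrative (invented) values for a steep, partly covered plot:
loss = usle_soil_loss(R=300.0, K=0.3, LS=1.2, C=0.5, P=1.0)
```

Because the model is purely multiplicative, each factor acts as an independent scaling of the loss, which is why recalibrating individual factors (e.g. K for Ghanaian soils) is tractable.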


INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. Requirements were (i) using open standards for spatial data such as developed in the context of the Open Geospatial Consortium (OGC), (ii) using a suitable environment for statistical modelling and computation, and (iii) producing an integrated, open source solution. The system couples an open-source Web Processing Service (developed by 52°North), accepting data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard) with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.


This dissertation delivers a framework to diagnose the bull-whip effect (BWE) in supply chains and then identify methods to minimize it. Such a framework is needed because, despite the significant amount of literature discussing the bull-whip effect, many companies continue to experience the wide variations in demand that are indicative of it. While the theory and knowledge of the bull-whip effect are well established, there is still no engineering framework and method to systematically identify the problem, diagnose its causes and identify remedies. The present work seeks to fill this gap by providing a holistic, systems perspective on bull-whip identification and diagnosis. The framework employs the SCOR reference model to examine the supply chain processes with a baseline measure of demand amplification. Research on the supply chain's structural and behavioral features is then conducted by means of the system dynamics modeling method. The diagnostic framework, called the Demand Amplification Protocol (DAMP), relies not only on the improvement of existing methods but also contributes original developments introduced to accomplish successful diagnosis. DAMP contributes a comprehensive methodology that captures the dynamic complexities of supply chain processes. The method also contributes a BWE measurement approach that is suitable for actual supply chains because of its low data requirements, and introduces a BWE scorecard relating established causes to a central BWE metric. In addition, the dissertation makes a methodological contribution to the analysis of system dynamics models with a technique for statistical screening called SS-Opt, which determines the inputs with the greatest impact on the bull-whip effect by means of perturbation analysis and subsequent multivariate optimization.
The dissertation describes the implementation of the DAMP framework in an actual case study that presents the approach, analysis, results and conclusions. The case study suggests that a solution balancing costs and demand amplification can better serve both the firms' and the supply chain's interests. Insights point to supplier network redesign, postponement in manufacturing operations and collaborative forecasting agreements with main distributors.
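The central BWE metric mentioned above can be illustrated with the common variance-ratio definition of the bull-whip effect, BWE = Var(orders) / Var(demand), where values above 1 indicate amplification moving upstream. The dissertation's own scorecard metric may differ in detail, so treat this as a generic sketch with invented series:

```python
from statistics import pvariance

def bullwhip_ratio(demand, orders):
    """Var(orders placed upstream) / Var(demand received).
    A ratio above 1 means this echelon amplifies demand variability."""
    return pvariance(orders) / pvariance(demand)

# Illustrative series: a retailer facing mild demand swings but
# placing strongly over-corrected orders on its supplier.
demand = [10, 11, 9, 10, 10, 11, 9, 10]
orders = [8, 14, 5, 12, 9, 15, 4, 13]
```

The appeal of this measure for real supply chains is its low data requirement: only two time series per echelon, with no need for a full process model.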


Requirements for space-based monitoring of permafrost features had already been defined in the IGOS Cryosphere Theme Report at the start of the IPY in 2007 (IGOS, 2007). The WMO Polar Space Task Group (PSTG, http://www.wmo.int/pages/prog/sat/pstg_en.php) identified the need to review the requirements for permafrost monitoring and to update them in 2013. Relevant surveys with a focus on satellite data are already available from the ESA DUE Permafrost user requirements survey (2009), the United States National Research Council (2014) and the ESA - CliC - IPA - GTN-P workshop in February 2014. These reports have been reviewed, specific needs discussed within the community, and a white paper submitted to the WMO PSTG. Acquisition requirements for monitoring, especially of terrain changes (incl. rock glaciers and coastal erosion) and lakes (extent, ice properties, etc.), have been specified with respect to current satellite missions. About 50 locations ('cold spots') where permafrost (Arctic and Antarctic) in situ monitoring has been taking place for many years, or where field stations are currently established, have been identified. These sites have been proposed to the WMO Polar Space Task Group as focus areas for future monitoring by high-resolution satellite data. The specifications of these sites, including metadata on site instrumentation, have been published as a supplement to the white paper (Bartsch et al. 2014, doi:10.1594/PANGAEA.847003). The representativeness of the 'cold spots' around the Arctic was subsequently assessed based on a landscape-units product developed as part of the FP7 project PAGE21. The ESA DUE Permafrost service was utilised to produce a pan-Arctic database (25 km, 2000-2014) comprising mean annual surface temperature, annual and summer amplitude of surface temperature, and mean summer (July-August) surface temperature.
Surface-status (frozen/unfrozen) products have also been derived from the ESA DUE Permafrost service, including the length of the unfrozen period, the first unfrozen day and the first frozen day. In addition, SAR (ENVISAT ASAR GM) statistics as well as topographic parameters have been considered. The circumpolar datasets have been assessed for redundancy in their information content, and 12 distinct units could be derived. The landscape units reveal similarities between the North Slope of Alaska and the region from the Yamal Peninsula to the Yenisei estuary. Northern Canada is characterized by the same landscape units as western Siberia, and north-eastern Canada shows similarities to the Laptev coast region. This paper presents the results of this assessment, formulates recommendations for extending the in situ monitoring networks, and categorizes the sites by satellite data requirements (specifically the Sentinels) with respect to landscape type and related processes.
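Deriving distinct landscape units from circumpolar grids is, in essence, a clustering of cells by their climate and surface variables. The study's actual method is not specified here, so the sketch below is only a toy stand-in: a 1-D k-means over a single invented variable, where the real product combined several variables and far more cells:

```python
def kmeans_1d(values, k, iters=50):
    """Minimal 1-D k-means with deterministic quantile initialisation:
    group grid cells by one variable (e.g. mean annual surface
    temperature) into k units; returns the sorted cluster centres."""
    vs = sorted(values)
    centers = [vs[(len(vs) - 1) * j // (k - 1)] for j in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: abs(v - centers[j]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Invented mean annual surface temperatures (deg C) for nine cells
# falling into three obvious groups:
cells = [-10.1, -9.9, -10.0, 0.1, -0.1, 0.0, 9.9, 10.1, 10.0]
units = kmeans_1d(cells, k=3)
```

Once cells are grouped into units, asking whether the ~50 'cold spots' cover all units is a straightforward membership check, which is the logic behind the representativeness assessment.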


This research was funded by the Cambridge Conservation Initiative Collaborative Fund for Conservation, and we thank its major sponsor Arcadia. We thank J. Bruinsma for the provision of demand data, the CEH for the provision of soil data and J. Spencer for invaluable discussions. A.L. was supported by a Gates Cambridge Scholarship. T.B., K.G. and J.P. acknowledge BBSRC funding through grant BBS/E/C/00005198.


Antillean manatees (Trichechus manatus manatus) were heavily hunted in the past throughout the Wider Caribbean Region (WCR), and are currently listed as endangered on the IUCN Red List of Threatened Species. In most WCR countries, including Haiti and the Dominican Republic, remaining manatee populations are believed to be small and declining, but current information is needed on their status, distribution, and local threats to the species.

To assess the past and current distribution and conservation status of the Antillean manatee in Hispaniola, I conducted a systematic review of documentary archives dating from the pre-Columbian era to 2013. I then surveyed more than 670 artisanal fishers from Haiti and the Dominican Republic in 2013-2014 using a standardized questionnaire. Finally, to identify important areas for manatees in the Dominican Republic, I developed a country-wide ensemble model of manatee distribution, and compared modeled hotspots with those identified by fishers.

Manatees were historically abundant in Hispaniola, but were hunted for their meat and became relatively rare by the end of the 19th century. The use of manatee body parts diversified with time to include their oil, skin, and bones. Traditional uses for folk medicine and handcrafts persist today in coastal communities in the Dominican Republic. Most threats to Antillean manatees in Hispaniola are anthropogenic in nature, and most mortality is caused by fisheries. I estimated a minimum island-wide annual mortality of approximately 20 animals. To understand the impact of this level of mortality, and to provide a baseline for measuring the success of future conservation actions, the Dominican Republic and Haiti should work together to obtain a reliable estimate of the current population size of manatees in Hispaniola.

In Haiti, the survey of fishers showed a wider distribution range of the species than suggested by the documentary archive review: fishers reported recent manatee sightings in seven of nine coastal departments, and three manatee hotspot areas were identified on the north, central, and south coasts. Thus, the contracted manatee distribution range suggested by the documentary archive review likely reflects a lack of research in Haiti. Both the review and the interviews agreed that manatees no longer occupy freshwater habitats in the country. In general, more dedicated manatee studies are needed in Haiti, employing aerial, land, or boat surveys.

In the Dominican Republic, the documentary archive review and the survey of fishers showed that manatees still occur throughout the country, and occasionally occupy freshwater habitats. Monte Cristi province on the north coast, and Barahona province on the south coast, were identified as focal areas. Sighting reports of manatees decreased from Monte Cristi eastwards to the adjacent province in the Dominican Republic, and westwards into Haiti. Along the north coast of Haiti, the number of manatee sighting and capture reports decreased with increasing distance from Monte Cristi province. There was good agreement among the modeled manatee hotspots, hotspots identified by fishers, and hotspots identified during previous dedicated manatee studies. The concordance of these results suggests that the distribution and patterns of habitat use of manatees in the Dominican Republic have not changed dramatically in over 30 years, and that the remaining manatees exhibit some degree of site fidelity. The ensemble modeling approach used in the present study produced accurate and detailed maps of manatee distribution with minimum data requirements. This modeling strategy is replicable and readily transferable to other countries in the Caribbean or elsewhere with limited data on a species of interest.
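
The core idea of an ensemble distribution model, as described above, is to average occurrence probabilities from several component models per grid cell rather than relying on a single method. The sketch below illustrates that averaging step only; it is not the thesis code, and the predictor names (depth, distance to seagrass, sea-surface temperature) and all coefficients are hypothetical.

```python
# Illustrative sketch of ensemble averaging for a species-distribution model.
# Component models and their coefficients are made up for demonstration.
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

# Each component model maps (depth_m, seagrass_km, sst_c) to a probability
# of manatee presence; in practice these would be fitted models
# (e.g. GLM, random forest, MaxEnt), not hand-written formulas.
def model_a(depth, seagrass, sst):
    return logistic(1.5 - 0.2 * depth - 0.8 * seagrass)

def model_b(depth, seagrass, sst):
    return logistic(2.0 - 0.1 * depth - 1.0 * seagrass + 0.05 * (sst - 27.0))

def model_c(depth, seagrass, sst):
    return logistic(1.0 - 0.3 * depth - 0.5 * seagrass)

def ensemble(depth, seagrass, sst):
    """Unweighted mean of component-model probabilities for one grid cell."""
    ps = [m(depth, seagrass, sst) for m in (model_a, model_b, model_c)]
    return sum(ps) / len(ps)

# A shallow cell near seagrass scores higher than a deep offshore cell.
nearshore = ensemble(depth=2.0, seagrass=0.5, sst=28.0)
offshore = ensemble(depth=30.0, seagrass=10.0, sst=27.0)
print(round(nearshore, 3), round(offshore, 3))
```

Cells whose ensemble probability exceeds a chosen threshold can then be mapped as candidate hotspots and compared with hotspots reported by fishers.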

The intrinsic value of manatees was stronger for artisanal fishers in the Dominican Republic than in Haiti, and most Dominican fishers showed a positive attitude towards manatee conservation. The Dominican Republic is an upper middle income country with a high Human Development Index. It possesses a legal framework that specifically protects manatees, and has a greater number of marine protected areas, more dedicated manatee studies, and more manatee education and awareness campaigns than Haiti. The constant presence of manatees in specific coastal segments of the Dominican Republic, the perceived decline in the number of manatee captures, and a more conservation-minded public, offer hope for manatee conservation, as non-consumptive uses of manatees become more popular. I recommend a series of conservation actions in the Dominican Republic, including: reducing risks to manatees from harmful fishing gear and watercraft at confirmed manatee hotspots; providing economic alternatives for displaced fishers, and developing responsible ecotourism ventures for manatee watching; improving law enforcement to reduce fisheries-related manatee deaths, stop the illegal trade in manatee body parts, and better protect manatee habitat; and continuing education and awareness campaigns for coastal communities near manatee hotspots.

In contrast, most fishers in Haiti continue to value manatees as a source of food and income, and showed a generally negative attitude towards manatee conservation. Haiti is a low income country with a low Human Development Index. Only a single dedicated manatee study has been conducted in Haiti, and manatees are not officially protected. Positive initiatives for manatees in Haiti include: protected areas declared in 2013 and 2014 that enclose two of the manatee hotspots identified in the present study; and local organizations that are currently working on coastal and marine environmental issues, including research and education on marine mammals. Future conservation efforts for manatees in Haiti should focus on addressing poverty and providing viable economic alternatives for coastal communities. I recommend a community partnership approach for manatee conservation, paired with education and awareness campaigns to inform coastal communities about the conservation situation of manatees in Haiti, and to help change their perceived value. Haiti should also provide legal protection for manatees and their habitat.

Relevance: 80.00%

Abstract:

The distribution, abundance, behaviour, and morphology of marine species is affected by spatial variability in the wave environment. Maps of wave metrics (e.g. significant wave height Hs, peak energy wave period Tp, and benthic wave orbital velocity URMS) are therefore useful for predictive ecological models of marine species and ecosystems. A number of techniques are available to generate maps of wave metrics, with varying levels of complexity in terms of input data requirements, operator knowledge, and computation time. Relatively simple "fetch-based" models are generated using geographic information system (GIS) layers of bathymetry and dominant wind speed and direction. More complex, but computationally expensive, "process-based" models are generated using numerical models such as the Simulating Waves Nearshore (SWAN) model. We generated maps of wave metrics based on both fetch-based and process-based models and asked whether predictive performance in models of benthic marine habitats differed. Predictive models of seagrass distribution for Moreton Bay, Southeast Queensland, and Lizard Island, Great Barrier Reef, Australia, were generated using maps based on each type of wave model. For Lizard Island, performance of the process-based wave maps was significantly better for describing the presence of seagrass, based on Hs, Tp, and URMS. Conversely, for the predictive model of seagrass in Moreton Bay, based on benthic light availability and Hs, there was no difference in performance using the maps of the different wave metrics. For predictive models where wave metrics are the dominant factor determining ecological processes, it is recommended that process-based models be used. Our results suggest that for models where wave metrics provide secondarily useful information, either fetch- or process-based models may be equally useful.
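
To make the URMS metric concrete: given Hs, Tp, and water depth for a grid cell, near-bed orbital velocity can be derived from standard linear wave theory by first solving the dispersion relation for the wavenumber. The sketch below is a generic textbook formulation, not the code used in the study; a monochromatic wave of height Hs and period Tp is assumed, with URMS taken as the bottom orbital amplitude divided by the square root of two.

```python
# Minimal sketch: near-bed RMS orbital velocity from linear wave theory.
# Assumes a single (monochromatic) wave of height Hs (m) and period Tp (s)
# in water of depth h (m). Not the study's actual wave-model code.
import math

G = 9.81  # gravitational acceleration, m/s^2

def wavenumber(T, h, tol=1e-10):
    """Solve the dispersion relation w^2 = g*k*tanh(k*h) for k
    using a damped fixed-point iteration."""
    w = 2.0 * math.pi / T      # angular frequency
    k = w * w / G              # deep-water first guess
    for _ in range(200):
        k_next = 0.5 * (k + w * w / (G * math.tanh(k * h)))
        if abs(k_next - k) < tol:
            k = k_next
            break
        k = k_next
    return k

def orbital_velocity_rms(Hs, Tp, h):
    """RMS near-bed orbital velocity (m/s): URMS = Ub / sqrt(2), where
    Ub = pi*Hs / (Tp * sinh(k*h)) is the bottom orbital amplitude."""
    k = wavenumber(Tp, h)
    ub = math.pi * Hs / (Tp * math.sinh(k * h))
    return ub / math.sqrt(2.0)

# Shallower water yields a larger near-bed orbital velocity for the same wave,
# which is why URMS maps track bathymetry so closely.
print(orbital_velocity_rms(Hs=1.0, Tp=8.0, h=5.0))
print(orbital_velocity_rms(Hs=1.0, Tp=8.0, h=30.0))
```

A process-based model such as SWAN computes the full wave spectrum per cell, but the bottom-orbital-velocity step applied to its output is conceptually the same calculation.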