963 results for Fish populations -- Data processing
Abstract:
Implementation of the GEOSS/GMES initiative requires the creation and integration of service providers, most of which deliver geospatial data output from a Grid system to the interactive user. This paper considers approaches to integrating the DOS centres (service providers) used in the Ukrainian segment of GEOSS/GMES and suggests template solutions for geospatial data visualization subsystems. The developed patterns are implemented in the DOS centre of the Space Research Institute of the National Academy of Sciences of Ukraine and the National Space Agency of Ukraine (NASU-NSAU).
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms, and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing may take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is expensive hardware-based processing such as field-programmable gate arrays (FPGAs). More recently, however, graphics processing unit (GPU) based data processing methods have been developed to reduce data processing and rendering times.
These processing techniques include standard processing methods, which comprise a set of algorithms to process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
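The standard FD-OCT processing chain summarised above (raw interference spectra transformed into depth profiles, or A-scans) can be sketched in a few lines. This is an illustrative simplification, not the thesis's GPU implementation, and the signal parameters are invented for the demonstration:

```python
import numpy as np

def process_a_scan(spectrum, background):
    """Minimal FD-OCT A-scan sketch: subtract the DC background,
    apodise with a Hann window, and FFT to obtain the depth profile."""
    signal = spectrum - background              # remove the DC term
    signal = signal * np.hanning(len(signal))   # suppress FFT side lobes
    a_scan = np.abs(np.fft.fft(signal))         # magnitude = depth profile
    return a_scan[: len(a_scan) // 2]           # keep positive depths only

# Simulate a single reflector: a cosine fringe on a flat background.
n = 1024
k = np.arange(n)
depth_bin = 100                        # hypothetical reflector position
background = np.ones(n)
spectrum = background + 0.5 * np.cos(2 * np.pi * depth_bin * k / n)
a_scan = process_a_scan(spectrum, background)
print(int(np.argmax(a_scan)))          # peak appears at the reflector depth
```

GPU acceleration of the kind described above applies this same subtract-window-FFT pipeline to many spectra in parallel, which is why throughput can reach the camera capture rate.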
Abstract:
Data processing services for the Meteosat geostationary satellite are presented. The implemented services correspond to different levels of remote-sensing data processing, including noise reduction at the preprocessing level, cloud mask extraction at the low level, and fractal dimension estimation at the high level. The cloud mask is obtained as a result of Markovian segmentation of infrared data. To overcome the high computational complexity of Markovian segmentation, a parallel algorithm was developed. The fractal dimension of Meteosat data is estimated using fractional Brownian motion models.
Abstract:
As massive data sets become increasingly available, people face the problem of how to effectively process and understand them. Traditional sequential computing models are giving way to parallel and distributed computing models, such as MapReduce, due both to the large size of the data sets and to their high dimensionality. This dissertation, in the same direction as other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques for data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions in the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
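The MapReduce-style matrix multiplication underlying such scaled-up factorizations can be illustrated with a toy in-memory map/shuffle/reduce simulation. This is a sketch only; the dissertation's actual partitioning strategies are not reproduced here, and the matrices are invented for the example:

```python
from collections import defaultdict

# Sparse matrices as {(row, col): value} dictionaries.
A = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 4.0}
B = {(0, 0): 5.0, (0, 1): 6.0, (1, 0): 7.0, (1, 1): 8.0}

def map_phase(A, B, n_cols):
    """Mapper: emit each partial product a[i,k] * b[k,j],
    keyed by the output cell (i, j) it contributes to."""
    for (i, k), a in A.items():
        for j in range(n_cols):
            if (k, j) in B:
                yield (i, j), a * B[(k, j)]

def reduce_phase(pairs):
    """Shuffle groups pairs by key; the reducer sums partial products."""
    cells = defaultdict(float)
    for key, value in pairs:
        cells[key] += value
    return dict(cells)

C = reduce_phase(map_phase(A, B, 2))   # C = A x B
print(C[(0, 0)], C[(1, 1)])            # prints 19.0 50.0
```

In a real MapReduce job the mapper and reducer run on distributed partitions of A and B; keying partial products by output cell is what lets the framework's shuffle do the summation grouping.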
Abstract:
Acknowledgments The authors wish to thank the crews, fishermen and scientists who conducted the various surveys from which data were obtained, and Mark Belchier and Simeon Hill for their contributions. This work was supported by the Government of South Georgia and the South Sandwich Islands. Additional logistical support was provided by the South Atlantic Environmental Research Institute, with thanks to Paul Brickle. Thanks to Stephen Smith of Fisheries and Oceans Canada (DFO) for help in constructing bootstrap confidence limits. Paul Fernandes receives funding from the MASTS pooling initiative (The Marine Alliance for Science and Technology for Scotland), and their support is gratefully acknowledged. MASTS is funded by the Scottish Funding Council (grant reference HR09011) and contributing institutions. We also wish to thank two anonymous referees for their helpful suggestions on earlier versions of this manuscript.
Abstract:
Four marine fish species are among the most important on the world market: cod, salmon, tuna, and sea bass. While the supply of North American and European markets for two of these species - Atlantic salmon and European sea bass - mainly comes from fish farming, Atlantic cod and tunas are mainly caught from wild stocks. We address the question of what the status of these wild stocks will be in the mid-term future, specifically in the year 2048. Whereas the effects of climate change and ecological driving forces on fish stocks have already gained much attention, our prime interest is in studying the effects of changing economic drivers, as well as the impact of variable management effectiveness. Using a process-based ecological-economic multispecies optimization model, we assess future stock status under different scenarios of change. We simulate (i) technological progress in fishing, (ii) increasing demand for fish, and (iii) increasing supply of farmed fish, as well as the interplay of these driving forces under different scenarios of (limited) fishery management effectiveness. We find that economic change has a substantial effect on fish populations. Increasing aquaculture production can dampen fishing pressure on wild stocks, but this effect is likely to be overwhelmed by increasing demand and technological progress, both of which increase fishing pressure. The only way to avoid collapse of the majority of stocks is institutional change that improves management effectiveness significantly above the current state. We conclude that full recognition of the economic drivers of change will be needed to successfully develop integrated ecosystem management and to sustain wild fish stocks until 2048 and beyond.
Abstract:
The generation of heterogeneous big data sources with ever-increasing volumes, velocities and veracities over the last few years has inspired the data science and research community to address the challenge of extracting knowledge from big data. Such a wealth of generated data across the board can be intelligently exploited to advance our knowledge about our environment, public health, critical infrastructure and security. In recent years we have developed generic approaches to process such big data at multiple levels for advancing decision support. These specifically concern data processing with semantic harmonisation, low-level fusion, analytics, knowledge modelling with high-level fusion, and reasoning. Such approaches are introduced and presented in the context of the TRIDEC project results on critical oil and gas industry drilling operations, and of the ongoing large eVacuate project on critical crowd behaviour detection in confined spaces.
Abstract:
The Data Processing Department of the ISHC has developed coding forms for the data to be entered into the program. The Highway Planning and Programming and the Design Departments are responsible for coding and submitting the necessary data forms to Data Processing for noise prediction on the highway sections.
Abstract:
The purpose of the project is to demonstrate how the restoration of riverine habitat and connectivity benefits native biodiversity, and to promote the importance of a healthy river system for native fish and the greater river catchment. The goal is to restore native fish populations to 60% of pre-European settlement levels and to improve aquatic health within the Reach.
Abstract:
Repeatability of behavioural and physiological traits is increasingly a focus for animal researchers, for which fish have become important models. Almost all of this work has been done in the context of evolutionary ecology, with few explicit attempts to apply repeatability and context dependency of trait variation toward understanding conservation-related issues. Here, we review work examining the degree to which repeatability of traits (such as boldness, swimming performance, metabolic rate and stress responsiveness) is context dependent. We review methods for quantifying repeatability (distinguishing between within-context and across-context repeatability) and confounding factors that may be especially problematic when attempting to measure repeatability in wild fish. Environmental factors such as temperature, food availability, oxygen availability, hypercapnia, flow regime and pollutants all appear to alter trait repeatability in fishes. This suggests that anthropogenic environmental change could alter evolutionary trajectories by changing which individuals achieve the greatest fitness in a given set of conditions. Gaining a greater understanding of these effects will be crucial for our ability to forecast the effects of gradual environmental change, such as climate change and ocean acidification, the study of which is currently limited by our ability to examine trait changes over relatively short time scales. Also discussed are situations in which recent advances in technologies associated with electronic tags (biotelemetry and biologging) and respirometry will help facilitate increased quantification of repeatability for physiological and integrative traits, which so far lag behind measures of repeatability of behavioural traits.
Abstract:
By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as "networks on wheels", can greatly enhance traffic safety, traffic efficiency and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, impose critical challenges of high efficiency and reliability for the implementation of VANETs. This dissertation is motivated by the great application potential of VANETs in the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this dissertation research targets enhancing traffic safety and traffic efficiency, as well as developing novel commercial applications based on VANETs, along four aspects: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to reliably notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel promising applications that exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The relative position based message dissemination (RPB-MD) scheme is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone of relevance under varying traffic density. Given the large volume of vehicular sensor data available in VANETs, the compressive sampling based data collection (CS-DC) scheme is proposed to efficiently collect spatially relevant data at a large scale, especially in dense traffic.
In addition, with novel and efficient solutions proposed for the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of novel promising applications, this dissertation helps push VANETs further towards the stage of massive deployment.
Abstract:
The World Health Organization (WHO) MONICA Project is a 10-year study monitoring trends and determinants of cardiovascular disease in geographically defined populations. Data were collected from over 100 000 randomly selected participants in two risk factor surveys conducted approximately 5 years apart in 38 populations using standardized protocols. The net effects of changes in the risk factor levels were estimated using risk scores derived from longitudinal studies in the Nordic countries. The prevalence of cigarette smoking decreased among men in most populations, but the trends for women varied. The prevalence of hypertension declined in two-thirds of the populations. Changes in the prevalence of raised total cholesterol were small but highly correlated between the genders (r = 0.8). The prevalence of obesity increased in three-quarters of the populations for men and in more than half of the populations for women. In almost half of the populations there were statistically significant declines in the estimated coronary risk for both men and women, although for Beijing the risk score increased significantly for both genders. The net effect of the changes in the risk factor levels in the 1980s in most of the study populations of the WHO MONICA Project is that the rates of coronary disease are predicted to decline in the 1990s.
Abstract:
Carbon and nitrogen stable isotope ratios were measured in 157 fish bone collagen samples from 15 different archaeological sites in Belgium, ranging in age from the 3rd to the 18th century AD. Due to diagenetic contamination from the burial environment, only 63 specimens produced results with suitable C:N ratios (2.9–3.6). The selected bones encompass a wide spectrum of freshwater, brackish, and marine taxa (N = 18), and this is reflected in the δ13C results (−28.2‰ to −12.9‰). The freshwater fish have δ13C values that range from −28.2‰ to −20.2‰, while the marine fish cluster between −15.4‰ and −13.0‰. Eel, a catadromous species (mostly living in freshwater but migrating into the sea to spawn), plots between −24.1‰ and −17.7‰, and the anadromous fish (living in marine environments but migrating into freshwater to spawn) show a mix of freshwater and marine isotopic signatures. The δ15N results also span a large range (7.2‰ to 16.7‰), indicating that these fish were feeding at many different trophic levels in these diverse aquatic environments. The aims of this research are the isotopic characterization of archaeological fish species (ecology, trophic level, migration patterns) and the determination of intra-species variation within and between fish populations differing in time and location. Given the previous lack of archaeological fish isotope data from Northern Europe, and Belgium in particular, these results serve as an important ecological backdrop for future isotopic reconstruction of the diet of human populations dating from the historical period (1st and 2nd millennium AD), for which there is zooarchaeological and historical evidence of an increased consumption of marine fish.
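As a toy illustration of how the reported δ13C ranges separate habitats, the following hypothetical classifier encodes the ranges quoted above. The thresholds come straight from the abstract, but the rule itself is our simplification for illustration, not the authors' method:

```python
def classify_habitat(d13c):
    """Hypothetical rule of thumb from the reported ranges: freshwater
    fish fall between -28.2 and -20.2 per mil d13C, marine fish between
    -15.4 and -13.0 per mil; intermediate values are consistent with
    brackish or migratory (e.g. catadromous eel) signatures."""
    if -28.2 <= d13c <= -20.2:
        return "freshwater"
    if -15.4 <= d13c <= -13.0:
        return "marine"
    return "mixed/migratory"

print(classify_habitat(-25.0))  # freshwater
print(classify_habitat(-14.0))  # marine
print(classify_habitat(-18.0))  # mixed/migratory (e.g. within the eel range)
```

In practice such assignments would also draw on δ15N, species identification and C:N quality screening rather than a single threshold.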
Abstract:
The Corumbataí river is one of the main tributaries of the right bank of the Piracicaba river, which is itself a tributary of the Tietê river. The Corumbataí river belongs to the Paraná river basin and is regionally important not only for its good water quality but also for harbouring rare elements in the local landscape. This study aimed to characterize the fish assemblages of the Corumbataí river and to provide data contributing to an assessment of its environmental quality. In the Corumbataí basin, 4 main rivers were sampled, each with 3 collection sites. Twenty-four samples were collected from March to July and from September to December 2001. Biotic data were evaluated using diversity measures. An ANCOVA linear model was used to test the hypothesis of spatio-temporal variation in the fish assemblages, with species richness as the response variable, stream order as a factor, and temperature and the natural logarithm of the number of individuals as covariates. This analysis showed a spatio-temporal variation corroborated by concepts exhaustively discussed in the literature, such as the species-area relationship and the river continuum concept. Data from the Ribeirão Claro river showed a different pattern compared with the other rivers. This difference was probably due to human interference and attests to the process of aquatic habitat fragmentation that may have led to the isolation of local fish populations.
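The abstract above evaluates biotic data with diversity measures; a minimal sketch of one standard such measure, the Shannon index, is shown below. The choice of this particular index is our assumption for illustration, as the abstract does not name which measures were used:

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i), computed from
    per-species abundance counts in one assemblage sample."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# A perfectly even sample of 4 species attains the maximum H' = ln(4).
print(round(shannon_diversity([10, 10, 10, 10]), 4))  # 1.3863
# A sample dominated by one species scores much lower.
print(round(shannon_diversity([37, 1, 1, 1]), 4))
```

Such indices summarise both species richness and evenness, which is why they pair naturally with the richness-based ANCOVA described in the abstract.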
Abstract:
Until mid-2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, such as a large cloud-dependent offset occurring at northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before the operational switch-on of SGP 3.0 and the public release of upgraded SCIAMACHY NO2 data, we investigated the accuracy of the algorithm transfer: (a) by checking the consistency of SGP 3.0 with the prototype algorithms; and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement over the previous processor, IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although their quantitative impact on SGP 3.0 vertical columns is not significant.