963 results for Fish populations -- Data processing


Relevance:

100.00%

Publisher:

Abstract:

Accurate assessments of fish populations are often limited by re-observation or recapture events. Since the early 1990s, passive integrated transponders (PIT tags) have been used to understand the biology of many fish species. Until recently, PIT applications in small streams have been limited to physical recapture events. To maximize recapture probability, we constructed PIT antenna arrays in small streams to remotely detect individual fish. Experiences from two different laboratories (three case studies) allowed us to develop a unified approach to applying PIT technology for enhancing data assessments. Information on equipment, its installation, tag considerations, and array construction is provided. Theoretical and practical definitions are introduced to standardize metrics for assessing detection efficiency. We demonstrate how certain conditions (stream discharge, vibration, and ambient radio frequency noise) affect the detection efficiency and suggest that by monitoring these conditions, expectations of efficiency can be modified. We emphasize the importance of consistently estimating detection efficiency for fisheries applications.
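In this practical sense, detection efficiency at an antenna can be estimated as the fraction of tags known to have passed that were actually read. A minimal sketch (the function name and tag codes are illustrative, not from the study):

```python
def detection_efficiency(known_passes, detections):
    """Practical detection efficiency: unique tags detected at an antenna
    divided by unique tags known to have passed it (e.g. confirmed by a
    physical recapture or a second array downstream)."""
    known = set(known_passes)
    detected = set(detections) & known  # ignore reads of tags not known to pass
    if not known:
        return 0.0
    return len(detected) / len(known)

# Tags confirmed to have passed the array, and the array's read log
passed = ["3D9.1", "3D9.2", "3D9.3", "3D9.4"]
reads = ["3D9.1", "3D9.3", "3D9.3", "999.9"]  # 999.9 never passed here
print(detection_efficiency(passed, reads))  # 0.5
```

Monitoring covariates such as discharge or radio-frequency noise alongside this metric is what allows efficiency expectations to be adjusted over time.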

Abstract:

In 2014, UniDive (The University of Queensland Underwater Club) conducted an ecological assessment of the Point Lookout dive sites for comparison with similar surveys conducted in 2001. Involvement in the project was voluntary. Members of UniDive who were marine experts conducted training for other club members who had no, or limited, experience in identifying marine organisms and mapping habitats. Since the 2001 detailed baseline study, no similar seasonal survey has been conducted. The 2014 data are particularly important given that numerous changes have taken place in relation to the management of, and potential impacts on, these reef sites. In 2009, Moreton Bay Marine Park was re-zoned, and Flat Rock was converted to a marine national park zone (green zone) with no fishing or anchoring. In 2012, four permanent moorings were installed at Flat Rock. Additionally, the entire area was exposed to the potential effects of the 2011 and 2013 Queensland floods, including flood plumes which carried large quantities of sediment into Moreton Bay and surrounding waters. The population of South East Queensland increased from 2.49 million in 2001 to 3.18 million in 2011 (BITRE, 2013). This rapidly expanding coastal population has increased the frequency and intensity of both commercial and recreational activities around the Point Lookout dive sites (EPA, 2008). The methodology used for the PLEA project was based on the 2001 survey protocols, Reef Check Australia protocols and CoralWatch methods. This hybrid methodology was used to monitor substrate and benthos, invertebrates, fish, and reef health impacts. Additional analyses were conducted with georeferenced photo transects. The PLEA marine surveys were conducted over six weekends in 2014, totaling 535 dives and 376 hours underwater. Two training weekends (February and March) were attended by 44 divers, whilst biological surveys were conducted on seasonal weekends (February, May, July and October).
Three reefs were surveyed, with two semi-permanent transects at Flat Rock, two at Shag Rock, and one at Manta Ray Bommie. Each transect was sampled once every survey weekend, with the transect tapes deployed at a depth of 10 m below chart datum. Fish populations were assessed using a visual census along 3 x 20 m transects; each transect was 5 m wide (2.5 m either side of the transect tape), 5 m high and 20 m in length. Fish families and species were chosen that are commonly targeted by recreational or commercial fishers, or targeted by aquarium collectors, and that were easily identified by their body shape. Rare or otherwise unusual species were also recorded. Target invertebrate populations and target reef health impacts were each assessed using a visual census along the same 3 x 20 m transects, each 5 m wide (2.5 m either side of the transect tape) and 20 m in length, with the surveying diver following a 'U-shaped' search pattern covering 2.5 m on either side of the transect tape. Substrate surveys were conducted using the point sampling method, enabling percentage cover of substrate types and benthic organisms to be calculated. The substrate or benthos under the transect line was identified at 0.5 m intervals, with a 5 m gap between each of the three 20 m segments. Categories recorded included various growth forms of hard and soft coral, key species/growth forms of algae, other living organisms (e.g. sponges), recently killed coral, and non-living substrate types (e.g. bare rock, sand, rubble, silt/clay).
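The point-sampling substrate method above reduces to counting category hits along the tape. A minimal sketch of the percentage-cover calculation (category names and counts are illustrative):

```python
from collections import Counter

def percent_cover(points):
    """Percentage cover from point-intercept records: the category found
    under the tape at each 0.5 m interval, as a share of all points."""
    counts = Counter(points)
    total = len(points)
    return {cat: 100.0 * n / total for cat, n in counts.items()}

# Hypothetical 20 m segment sampled at 0.5 m intervals (40 points)
segment = ["hard coral"] * 10 + ["soft coral"] * 4 + ["sand"] * 16 + ["rock"] * 10
cover = percent_cover(segment)
print(cover["hard coral"])  # 25.0
```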

Abstract:

Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions: Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. Methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
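The two sensitivity tests described, exclusion and quality weighting, can be sketched as follows (the estimates and scores are illustrative; this is not the MONICA scoring itself):

```python
def weighted_mean(values, quality_scores):
    """Sensitivity test by quality weighting: each population's estimate
    contributes in proportion to its data-quality score."""
    total_w = sum(quality_scores)
    return sum(v * w for v, w in zip(values, quality_scores)) / total_w

def exclude_low_quality(values, quality_scores, threshold):
    """Sensitivity test by exclusion: drop populations scoring below threshold."""
    kept = [v for v, w in zip(values, quality_scores) if w >= threshold]
    return sum(kept) / len(kept)

rates = [5.2, 6.1, 4.8]   # hypothetical risk-factor estimates per population
scores = [1.0, 0.5, 1.0]  # hypothetical quality scores
print(weighted_mean(rates, scores))
print(exclude_low_quality(rates, scores, 0.8))  # mean of 5.2 and 4.8 = 5.0
```

Comparing the two results against the unweighted mean indicates how sensitive a conclusion is to the low-quality populations.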

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as “histogram binning” inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with this dilemma for many years, resorting either to hardware approaches that are rather costly, with inherent calibration and noise effects, or to software techniques based on filtering the binning effect but without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed.
This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to real-time implementation using lookup tables, a task that is also introduced in this dissertation.
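The patented method itself is not reproduced here, but a generic sketch of accumulating a linear histogram into log-scaled output channels via a lookup table, with each channel's counts split in a non-integer fashion between the two adjacent output channels, illustrates the idea (channel counts and ranges are illustrative):

```python
import math

def build_lut(n_in, n_out, max_val):
    """Lookup table mapping each linear input channel to a fractional
    (non-integer) position on a log-scaled output axis."""
    lut = []
    for ch in range(n_in):
        x = max(ch, 1)  # avoid log(0) for the first channel
        pos = (n_out - 1) * math.log(x) / math.log(max_val)
        lut.append(min(pos, n_out - 1))
    return lut

def accumulate(counts, lut, n_out):
    """Spread each input channel's counts over the two nearest output
    channels in proportion to the fractional position, so the output
    histogram is accumulated in a non-integer, multi-channel fashion."""
    out = [0.0] * n_out
    for ch, c in enumerate(counts):
        pos = lut[ch]
        lo = int(pos)
        frac = pos - lo
        out[lo] += c * (1.0 - frac)
        if lo + 1 < n_out:
            out[lo + 1] += c * frac
    return out

lut = build_lut(1024, 256, 1023)
hist = [0] * 1024
hist[100] = 50                  # 50 events in linear channel 100
out = accumulate(hist, lut, 256)
print(sum(out))                 # ~50.0: total counts are preserved
```

Because the mapping is precomputed as a table, the per-event work at run time is two additions, which is what makes a real-time implementation feasible.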


Abstract:

The advancement of GPS technology has made it possible to use GPS devices not only as orientation and navigation tools but also as tools to track spatiotemporal information. GPS tracking data can be broadly applied in location-based services, such as the spatial distribution of the economy, transportation routing and planning, traffic management and environmental control. Knowledge of how to process the data from a standard GPS device is therefore crucial for further use. Previous studies have addressed individual data processing issues in isolation. This paper, by contrast, aims to outline a general procedure for processing GPS tracking data. The procedure is illustrated step by step using real-world GPS data of car movements in Borlänge, in central Sweden.
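One typical step in such a procedure is removing fixes that imply physically impossible speeds. A minimal sketch (the 70 m/s threshold and the coordinates are illustrative, not from the paper):

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (p[0], p[1], q[0], q[1]))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def drop_speed_outliers(track, max_speed_ms=70.0):
    """Cleaning step: drop fixes implying an impossible speed relative to
    the previous accepted fix (typical GPS multipath/jump errors)."""
    if not track:
        return []
    clean = [track[0]]
    for t, lat, lon in track[1:]:
        t0, lat0, lon0 = clean[-1]
        dt = t - t0
        if dt <= 0:
            continue  # duplicate or out-of-order timestamp
        if haversine_m((lat0, lon0), (lat, lon)) / dt <= max_speed_ms:
            clean.append((t, lat, lon))
    return clean

# Hypothetical track near Borlänge: the third fix jumps ~66 km in 10 s
track = [(0, 60.48, 15.42), (10, 60.4805, 15.4205),
         (20, 61.0, 16.0), (30, 60.481, 15.421)]
print(len(drop_speed_outliers(track)))  # 3: the jump is removed
```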

Abstract:

Endogenous and environmental variables are fundamental in explaining variations in fish condition. Based on more than 20 yr of fish weight and length data, relative condition indices were computed for anchovy and sardine caught in the Gulf of Lions. Classification and regression trees (CART) were used to identify endogenous factors affecting fish condition, and to group years of similar condition. Both species showed a similar annual cycle with condition being minimal in February and maximal in July. CART identified 3 groups of years where the fish populations generally showed poor, average and good condition and within which condition differed between age classes but not according to sex. In particular, during the period of poor condition (mostly recent years), sardines older than 1 yr appeared to be more strongly affected than younger individuals. Time-series were analyzed using generalized linear models (GLMs) to examine the effects of oceanographic abiotic (temperature, Western Mediterranean Oscillation [WeMO] and Rhone outflow) and biotic (chlorophyll a and 6 plankton classes) factors on fish condition. The selected models explained 48 and 35% of the variance of anchovy and sardine condition, respectively. Sardine condition was negatively related to temperature but positively related to the WeMO and mesozooplankton and diatom concentrations. A positive effect of mesozooplankton and Rhone runoff on anchovy condition was detected. The importance of increasing temperatures and reduced water mixing in the NW Mediterranean Sea, affecting planktonic productivity and thus fish condition by bottom-up control processes, was highlighted by these results. Changes in plankton quality, quantity and phenology could lead to insufficient or inadequate food supply for both species.
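A relative condition index of the kind used here compares observed weight to the weight predicted by the population's length-weight relationship (Le Cren's Kn). A minimal sketch with synthetic data (the coefficients are illustrative, not the study's):

```python
import math

def fit_length_weight(lengths, weights):
    """Fit W = a * L**b by least squares on log-transformed data
    (log W = log a + b log L)."""
    xs = [math.log(l) for l in lengths]
    ys = [math.log(w) for w in weights]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

def relative_condition(length, weight, a, b):
    """Le Cren's relative condition: observed weight over the weight
    predicted for that length; 1.0 means average condition."""
    return weight / (a * length ** b)

# Hypothetical anchovy data following W = 0.005 * L**3 exactly
L = [10.0, 12.0, 14.0, 16.0]
W = [0.005 * l ** 3 for l in L]
a, b = fit_length_weight(L, W)
print(round(relative_condition(13.0, 0.005 * 13.0 ** 3, a, b), 3))  # 1.0
```

Values above 1.0 for a given month or year group indicate fish heavier than expected for their length, which is what the February minimum and July maximum describe.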

Abstract:

Climate change challenges the capacity of fishes to thrive in their habitat. However, through phenotypic diversity, they demonstrate remarkable resilience to deteriorating conditions. In fish populations, inter-individual variation in a number of fitness-determining physiological traits, including cardiac performance, is classically observed. Information about the cellular bases of inter-individual variability in cardiac performance is scarce, including the possible contribution of excitation-contraction (EC) coupling. This study aimed at providing insight into EC coupling-related Ca2+ response and thermal plasticity in the European sea bass (Dicentrarchus labrax). A cell population approach was used to lay the methodological basis for identifying the cellular determinants of cardiac performance. Fish were acclimated at 12 and 22 °C, and changes in intracellular calcium concentration ([Ca2+]i) following KCl stimulation were measured using Fura-2 at a test temperature of 12 or 22 °C. The increase in [Ca2+]i resulted primarily from extracellular Ca2+ entry, but sarcoplasmic reticulum stores were also shown to be involved. As previously reported in sea bass, a modest effect of adrenaline was observed. Moreover, although the response appeared relatively insensitive to an acute temperature change, a difference in Ca2+ response was observed between 12 °C- and 22 °C-acclimated fish. In particular, a greater increase in [Ca2+]i at a high level of adrenaline was observed in 22 °C-acclimated fish, which may be related to an improved efficiency of adrenaline under these conditions. In conclusion, this method allows a rapid screening of cellular characteristics. It represents a promising tool to identify the cellular determinants of inter-individual variability in fishes' capacity for environmental adaptation.
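Fura-2 ratio measurements are conventionally converted to [Ca2+]i with the Grynkiewicz ratiometric equation; a minimal sketch (the calibration constants here are illustrative, not those of the study):

```python
def fura2_ca(R, Kd=224.0, Rmin=0.3, Rmax=6.0, sf2_sb2=5.0):
    """Grynkiewicz ratiometric equation for Fura-2:
    [Ca2+]i = Kd * (R - Rmin) / (Rmax - R) * (Sf2/Sb2),
    with Kd in nM and R the 340/380 nm fluorescence ratio.
    All calibration constants are placeholder values."""
    return Kd * (R - Rmin) / (Rmax - R) * sf2_sb2

baseline = fura2_ca(0.8)     # ratio before stimulation
after_kcl = fura2_ca(1.6)    # ratio after KCl depolarization
print(after_kcl > baseline)  # True: a rise in ratio reports a rise in [Ca2+]i
```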

Abstract:

Doctoral thesis, Marine, Earth and Environmental Sciences, branch: Marine Sciences, specialization in Marine Ecology, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2016

Abstract:

During recent decades, the health of ocean ecosystems and fish populations has been threatened by overexploitation, pollution, and anthropogenic-driven climate change. Due to a lack of long-term data, we have a poor understanding of when intensive exploitation began and what impact anthropogenic activities have had on the ecology and evolution of fishes. Such information is crucial to recover degraded and depleted marine ecosystems and fish populations, maximise their productivity in line with historical levels, and predict their future dynamics. In this thesis, I evaluate anthropogenic impacts on the iconic Atlantic bluefin tuna (Thunnus thynnus; BFT), one of the longest-exploited and, recently, most intensely exploited marine fishes, with tremendous cultural and economic importance. Using a long time series of archaeological and archived faunal remains (bones) dating back approximately two millennia, I apply morphological, isotopic, and genomic techniques to perform the first studies of long-term BFT size and growth, diet and habitat use, and demography and adaptation, and produce the first genome-wide data on this species. My findings suggest that exploitation had impacted BFT foraging behaviour by the ~16th century, when coastal ecosystem degradation induced a pelagic shift in diet and habitat use. I reveal that BFT biomass began to decline much earlier than hitherto documented, by the 19th century, consistent with intensive tuna-trap catches and increasing catch-at-size during this period. I find that BFT juvenile growth had increased by the early 1900s (and more dramatically by the 21st century), which may reflect an evolutionary response to size-selective harvest, of which I find putative genomic signatures. Further, I observe that BFT foraging behaviours were modified following overexploitation during the 20th century, having previously included an isotopically distinct Black Sea niche.
Finally, I show that despite biomass declines beginning centuries ago, BFT has retained genomic diversity.

Abstract:

The colors of 51 species of Hawaiian reef fish have been measured using a spectrometer and can therefore be described in objective terms that are not influenced by human visual experience. In common with other known reef fish populations, the colors of Hawaiian reef fish occupy spectral positions from 300-800 nm; yellow or orange with blue, yellow with black, and black with white are the most frequently combined colors; and there is no link between possession of ultraviolet (UV) reflectance and UV visual sensitivity or the potential for UV visual sensitivity. In contrast to other reef systems, blue, yellow, and orange appear more frequently in Hawaiian reef fish. Based on the spectral quality of reflections from fish skin, trends in fish colors can be seen that are indicative of both visually driven selective pressures and chemical or physical constraints on the design of colors. UV-reflecting colors can function as semiprivate communication signals. White or yellow with black form highly contrasting patterns that transmit well through clear water. Labroid fishes display uniquely complex colors but lack the ability to see the UV component that is common in their pigments. Step-shaped spectral curves usually belong to long-wavelength colors such as yellow or red, whereas colors with peak-shaped spectral curves are green, blue, violet, and UV.

Abstract:

Loss of connectivity in impounded rivers is among the impacts imposed by dams, and mitigation measures such as fish passages might not accomplish their purpose of reestablishing an efficient bi-directional gene flow in the fish populations affected. As a consequence, fish populations remain fragmented, and a new interpopulational structure may develop, with increased risk of reduced genetic diversity and stochastic extinction. In order to evaluate the effects of the Gavião Peixoto Dam, which was constructed almost a century ago on the Jacaré-Guaçu River in the Upper Paraná River basin, Brazil, a comparative morphometric study was undertaken on the populations of the Neotropical migratory characid fish Salminus hilarii living up- and downstream of this dam. Population dynamics, spatial segregation, and habitat use by different age classes were monitored for 2 years. We found that segregation caused by the dam and long periods with no efficient connection by fish passages have led to fragmentation and interpopulational structuring of S. hilarii, as revealed by canonical variable analysis of morphometric features. The fish populations occupying the up- and downstream sections have succeeded in performing short-distance reproductive migrations in the main river and tributaries, have found suitable habitats for completing their life cycle, and have been able to maintain distinct small-sized populations so far.

Abstract:

Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in coronary arteries are one direct risk factor. These can be assessed by the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used will be shown, and the epidemiology, imaging, risk factors and prognosis of the disease described. The software steps and the values that allow the risk of developing CAD to be predicted will be presented. A 64-row dual-source multidetector CT scanner and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced. Two measurements were obtained in each of the three experimental protocols (64, 128 and 256 mAs). Considerable changes appeared between the CS values relating to the protocol variation. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
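The Agatston method referred to above scores each calcified lesion as its area times a density weight derived from its peak attenuation (1 for 130-199 HU, 2 for 200-299 HU, 3 for 300-399 HU, 4 for 400 HU and above). A minimal sketch (lesion values are illustrative):

```python
def agatston_weight(max_hu):
    """Standard Agatston density weighting for a calcified lesion."""
    if max_hu < 130:
        return 0  # below the 130 HU calcium detection threshold
    if max_hu < 200:
        return 1
    if max_hu < 300:
        return 2
    if max_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Agatston calcium score: sum over lesions of area (mm^2) times
    the density weight of the lesion's peak attenuation."""
    return sum(area * agatston_weight(peak_hu) for area, peak_hu in lesions)

# Hypothetical lesions as (area_mm2, peak_HU) pairs
print(agatston_score([(4.0, 150), (2.5, 320), (1.0, 90)]))  # 4*1 + 2.5*3 + 0 = 11.5
```

Because the weight depends on peak HU, changes in tube current and noise that shift measured attenuation can move a lesion across a weight boundary, which is why CS values vary with the acquisition protocol.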

Abstract:

Estuaries are perhaps the most threatened environments in the coastal fringe; the coincidence of high natural value and attractiveness for human use has led to conflicts between conservation and development. These conflicts occur in the Sado Estuary, since it is located near the industrialised zone of the Setúbal Peninsula while, at the same time, a great part of the estuary is classified as a Natural Reserve due to its high biodiversity. These facts led to the need to implement a model of environmental management and quality assessment, based on methodologies that enable the assessment of Sado Estuary quality and the evaluation of human pressures on the estuary. These methodologies are based on indicators that can best depict the state of the environment, rather than necessarily everything that could be measured or analysed. Sediments have always been considered an important temporary source of some compounds, a sink for other types of material, and an interface where a great diversity of biogeochemical transformations occur. For all these reasons they are of great importance in the formulation of a coastal management system. Many authors have used sediments to monitor aquatic contamination, with great advantages compared to traditional water-column sampling. The main objective of this thesis was to develop an estuary environmental management framework applied to the Sado Estuary using the DPSIR model (EMMSado), including data collection, data processing and data analysis. The supporting infrastructure of EMMSado was a set of spatially contiguous and homogeneous regions of sediment structure (management units). The environmental quality of the estuary was assessed through sediment quality assessment and integrated, at a preliminary stage, with the human pressure for development.
Besides the advantages explained earlier, studying the quality of the estuary mainly on the basis of indicators and indices of the sediment compartment also makes this methodology easier, faster, and less demanding of human and financial resources. These are essential factors for efficient environmental management of coastal areas. Data management, visualization, processing and analysis were achieved through the combined use of indicators and indices, sampling optimization techniques, Geographical Information Systems, remote sensing, statistics for spatial data, Global Positioning Systems and best expert judgment. As a global conclusion, of the nineteen management units delineated and analyzed, three showed no ecological risk (18.5% of the study area). The areas of most concern (5.6% of the study area) are located in the North Channel and are under strong human pressure, mainly due to industrial activities. These areas also have low hydrodynamics and are thus associated with high levels of deposition. In particular, the areas near the Lisnave and Eurominas industries can also accumulate contamination coming from the Águas de Moura Channel, since particles coming from that channel can settle there due to residual flow. In these areas the contaminants of concern, among those analyzed, are heavy metals and metalloids (Cd, Cu, Zn and As exceeded the PEL guidelines) and the pesticides BHC isomers, heptachlor, isodrin, DDT and its metabolites, endosulfan and endrin. In the remaining management units (76% of the study area) there is a moderate potential for the occurrence of adverse ecological effects, and in some of these areas no stress agents could be identified. This emphasizes the need for further research, since unmeasured chemicals may be causing or contributing to these adverse effects. Special attention must be paid to the units with moderate potential for adverse ecological effects that are located inside the natural reserve.
Non-point source pollution from agriculture and aquaculture activities also seems to contribute an important pollution load to the estuary, entering through the Águas de Moura Channel. This pressure is expressed as a moderate potential for ecological risk in the areas near the entrance of this channel. Pressures may also come from the Alcácer Channel, although they were not quantified in this study. The management framework presented here, including all the methodological tools, may be applied and tested in other estuarine ecosystems, which will also allow comparison between estuarine ecosystems in other parts of the globe.
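The PEL-based screening underlying this risk classification can be sketched as a simple guideline comparison (the threshold values below are illustrative placeholders, not those used in the thesis):

```python
# Illustrative screening of sediment contaminant concentrations against
# PEL-type guideline values; the thresholds are placeholder numbers.
PEL = {"Cd": 4.2, "Cu": 108.0, "Zn": 271.0, "As": 41.6}  # mg/kg, illustrative

def exceedances(sample):
    """Return, sorted, the contaminants whose measured concentration
    exceeds the corresponding guideline value."""
    return sorted(m for m, c in sample.items() if m in PEL and c > PEL[m])

# Hypothetical measurements for one management unit in the North Channel
unit_north_channel = {"Cd": 6.1, "Cu": 90.0, "Zn": 300.0, "As": 50.0}
print(exceedances(unit_north_channel))  # ['As', 'Cd', 'Zn']
```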

Abstract:

Data analytic applications are characterized by large data sets that are subject to a series of processing phases. Some of these phases are executed sequentially, but others can be executed concurrently or in parallel on clusters, grids or clouds. The MapReduce programming model has been applied to process large data sets in cluster and cloud environments. To develop an application using MapReduce, there is a need to install/configure/access specific frameworks such as Apache Hadoop or Elastic MapReduce in the Amazon Cloud. It would be desirable to provide more flexibility in adjusting such configurations according to the application characteristics. Furthermore, the composition of the multiple phases of a data analytic application requires the specification of all the phases and their orchestration. The original MapReduce model and environment lack flexible support for such configuration and composition. Recognizing that scientific workflows have been successfully applied to modeling complex applications, this paper describes our experiments on implementing MapReduce as subworkflows in the AWARD framework (Autonomic Workflow Activities Reconfigurable and Dynamic). A text mining data analytic application is modeled as a complex workflow with multiple phases, where individual workflow nodes support MapReduce computations. As in typical MapReduce environments, the end user only needs to define the application algorithms for input data processing and for the map and reduce functions. In the paper we present experimental results from using the AWARD framework to execute MapReduce workflows deployed over multiple Amazon EC2 (Elastic Compute Cloud) instances.
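As noted, the end user supplies only the map and reduce functions. A minimal word-count sketch for a text-mining phase, with a tiny sequential driver standing in for the distributed AWARD/MapReduce runtime (all names are illustrative):

```python
from collections import defaultdict

def map_fn(line):
    """Map: emit (word, 1) for every word in an input line."""
    return [(w.lower(), 1) for w in line.split()]

def reduce_fn(word, counts):
    """Reduce: sum the partial counts for one word."""
    return word, sum(counts)

def run_local(lines):
    """Tiny sequential driver standing in for the distributed runtime:
    map every line, group values by key (the shuffle), then reduce."""
    groups = defaultdict(list)
    for line in lines:
        for k, v in map_fn(line):
            groups[k].append(v)
    return dict(reduce_fn(k, vs) for k, vs in groups.items())

print(run_local(["the map the reduce", "the cloud"]))
# {'the': 3, 'map': 1, 'reduce': 1, 'cloud': 1}
```

In a real deployment the driver is replaced by the framework: the same two user functions run unchanged, while partitioning, shuffling and orchestration across EC2 instances are handled by the workflow engine.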