973 results for Recurrence quantification analysis
Abstract:
In September 1999 two short-term moorings with cylindrical sediment traps were deployed to collect sinking particles in bottom waters off the Ob and Yenisei river mouths. Samples were studied for their bulk composition, pigments, phytoplankton, microzooplankton, fecal material, amino acids, hexosamines, fatty acids and sterols, and compared to suspended matter and surface sediments in order to collect information about the nature and cycling of particulate matter in the water column. Results for all measured components in sinking particles point to ongoing seasonality in the pelagic system, from blooming diatoms in the first phase to a more retention-dominated system in the second half of the trap deployment. Due to a phytoplankton bloom observed north of the Ob estuary, flux rates were generally higher in the trap deployed off the Ob than off the Yenisei. The Ob trap collected fresh surface-derived particulate matter. Particles from the Yenisei trap were more degraded and resembled deep-water suspension. This material may partly have been derived from resuspended sediments.
Abstract:
Heterozoan carbonates are typical of extratropical sedimentary systems. However, under mesotrophic to eutrophic conditions, heterozoan carbonates also form in tropical settings. Nevertheless, such heterozoan tropical sedimentary systems are rare in the modern world and therefore remain poorly understood to date. Here a carbonate depositional system is presented where nutrient-rich upwelling waters push onto a wide shelf. These waters warm up on the shelf, giving rise to the production and deposition of tropical heterozoan facies. The carbonate facies on this shelf are characterized by a mixture of tropical and cosmopolitan biogenic sedimentary grains. Studies of facies and taxonomy are key to identifying and characterizing tropical heterozoan carbonates and to distinguishing them from their cool-water counterparts, in particular in the past, where the oceanography cannot be determined directly.
Abstract:
Sediments in Arctic sea ice are important for erosion and redistribution and consequently a factor in the sediment budget of the Arctic Ocean. The processes leading to the incorporation of sediments into the ice are not yet understood in detail. In the present study, experiments on the incorporation of sediments were therefore conducted in ice tanks of the Hamburg Ship Model Basin (HSVA) in winter 1996/1997. These experiments showed that on average 75 % of the artificial sea-ice sediments were located in the brine-channel system. The sediments were scavenged from the water column by frazil ice. Sediments functioning as nuclei for the formation of frazil ice were less important for the incorporation. Filtration in grease ice during relatively calm hydrodynamic conditions was probably an effective process for enriching sediments in the ice. Wave fields did not play an important role in the incorporation of sediments into the artificial sea ice. During the expedition TRANSDRIFT III (TDIII, October 1995), different types of natural, newly formed sea ice (grease ice, nilas and young ice) were sampled in the inner Laptev Sea at the time of freeze-up. The incorporation of sediments then took place during calm meteorological conditions. The characteristics of the clay mineral assemblages of these sediments served as references for sea-ice sediments sampled from first-year drift ice in the outer Laptev Sea and the adjacent Arctic Ocean during the POLARSTERN expedition ARK-XI/1 (July-September 1995). Based on the clay mineral assemblages, probable incorporation areas for the sediments in first-year drift ice could be statistically reconstructed in the inner Laptev Sea (eastern, central, and western Laptev Sea) as well as in adjacent regions. Comparing the amounts of particulate organic carbon (POC) in sea-ice sediments and in surface sediments from the shelves of potential incorporation areas often reveals higher values in sea-ice sediments (TDIII: 3.6 %DM; ARK-XI/1: 2.3 %DM). This enrichment of POC is probably due to the incorporation process into the sea ice, as could be deduced from maceral analysis and Rock-Eval pyrolysis. Both methods were applied in the present study to particulate organic material (POM) from sea-ice sediments for the first time. It was shown that the POM of the sea-ice sediments from the Laptev Sea and the adjacent Arctic Ocean was dominated by reworked, strongly fragmented, allochthonous (terrigenous) material. This terrigenous component accounted for more than 75 % of all counted macerals. The autochthonous (marine) component was also strongly fragmented, and was higher in the sediments from newly formed sea ice (24 % of all counted macerals) than in first-year drift ice (17 % of all counted macerals). Average hydrogen indices confirmed this pattern and were in the transition zone between kerogen types II and III (TDIII: 275 mg HC/g POC; ARK-XI/1: 200 mg HC/g POC). The sediment loads quantified in natural sea ice (TDIII: 33.6 mg/l; ARK-XI/1: 49.0 mg/l) indicate that sea-ice sediments are an important factor in the sediment budget of the Laptev Sea. In particular, during the incorporation phase in autumn and early winter, about 12 % of the sediment load imported annually by rivers into the Laptev Sea can be incorporated into sea ice and redistributed during calm meteorological conditions. Single entrainment events can incorporate about 35 % of the river input into the sea ice (ca. 9 × 10^6 t) and export it via the Transpolar Drift from the Eurasian shelf to the Fram Strait.
Abstract:
Soil voids manifest the cumulative effect of local pedogenic processes and ultimately influence soil behavior, especially as it pertains to aeration and hydrophysical properties. Because of the relatively weak attenuation of X-rays by air, compared with liquids or solids, non-disruptive CT scanning has become a very attractive tool for generating three-dimensional imagery of soil voids. One of the main steps involved in this analysis is the thresholding required to transform the original (greyscale) images into the type of binary representation (e.g., pores in white, solids in black) needed for fractal analysis or simulation with Lattice-Boltzmann models (Baveye et al., 2010). The objective of the current work is to apply an innovative approach to quantifying soil voids and pore networks in original X-ray CT imagery using Relative Entropy (Bird et al., 2006; Tarquis et al., 2008). This approach will be illustrated using typical imagery representing contrasting soil structures. Particular attention will be given to the need to consider the full 3D context of the CT imagery, as well as scaling issues, in the application and interpretation of this index.
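As a rough, hypothetical illustration of the kind of computation a relative-entropy index involves (not the authors' actual implementation), the sketch below binarizes a synthetic greyscale CT volume at trial thresholds and evaluates the Kullback-Leibler divergence between the resulting pore/solid proportions and an assumed reference distribution; the array shape, threshold values and 40 % reference porosity are invented for the example.

```python
import numpy as np

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def pore_solid_fractions(volume, threshold):
    """Binarize a greyscale CT volume: voxels below the threshold count as pores."""
    f_pore = (volume < threshold).mean()
    return np.array([f_pore, 1.0 - f_pore])

# Hypothetical 3D greyscale volume (stand-in for 8-bit X-ray CT data).
rng = np.random.default_rng(0)
volume = rng.integers(0, 256, size=(64, 64, 64))

# Assumed reference distribution, e.g., a macroscopic porosity of 40 %.
reference = np.array([0.4, 0.6])

# Scan candidate thresholds and report the relative entropy of each binarization.
for t in (64, 96, 128, 160):
    d = relative_entropy(pore_solid_fractions(volume, t), reference)
    print(f"threshold={t:3d}  D(p||q)={d:.4f}")
```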
Abstract:
The research in this thesis is related to static cost and termination analysis. Cost analysis aims at estimating the amount of resources that a given program consumes during execution, and termination analysis aims at proving that the execution of a given program will eventually terminate. These analyses are strongly related; indeed, cost analysis techniques heavily rely on techniques developed for termination analysis. Precision, scalability, and applicability are essential in static analysis in general. Precision is related to the quality of the inferred results, scalability to the size of programs that can be analyzed, and applicability to the class of programs that can be handled by the analysis (independently of precision and scalability issues). This thesis addresses these aspects in the context of cost and termination analysis, from both practical and theoretical perspectives. For cost analysis, we concentrate on the problem of solving cost relations (a form of recurrence relations) into closed-form upper and lower bounds, which is at the heart of most modern cost analyzers, and also where most of the precision and applicability limitations can be found. We develop tools, and their underlying theoretical foundations, for solving cost relations that overcome the limitations of existing approaches, and we demonstrate their superiority in both precision and applicability. A unique feature of our techniques is the ability to smoothly handle both lower and upper bounds, by reversing the corresponding notions in the underlying theory. For termination analysis, we study the hardness of the problem of deciding termination for a specific form of simple loops that arise in the context of cost analysis. This study gives a better understanding of the (theoretical) limits of scalability and applicability for both termination and cost analysis.
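To make the notion of solving a cost relation into a closed-form bound concrete, here is a minimal Python sketch with a toy relation; the specific relation and its closed form are illustrative assumptions and are not taken from the thesis.

```python
# A toy cost relation in the spirit of those discussed above (this specific
# relation and bound are illustrative assumptions):
#   C(0) = 1
#   C(n) = C(n-1) + n      for n > 0
# Its exact closed form is C(n) = 1 + n*(n+1)/2, which here serves as both an
# upper and a lower bound because the relation is deterministic.

def cost_relation(n: int) -> int:
    """Evaluate the cost relation directly (iteratively unrolling the recursion)."""
    c = 1
    for i in range(1, n + 1):
        c += i
    return c

def closed_form(n: int) -> int:
    """Closed-form solution of the same relation."""
    return 1 + n * (n + 1) // 2

# Check that the closed form matches the recurrence on a range of inputs.
assert all(cost_relation(n) == closed_form(n) for n in range(200))
print("closed form matches the cost relation on 0..199")
```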
Abstract:
Background: There are 600,000 new malaria cases daily worldwide. The gold standard for estimating the parasite burden and the corresponding severity of the disease consists of manually counting the number of parasites in blood smears through a microscope, a process that can take more than 20 minutes of an expert microscopist’s time. Objective: This research tests the feasibility of a crowdsourced approach to malaria image analysis. In particular, we investigated whether anonymous volunteers with no prior experience would be able to count malaria parasites in digitized images of thick blood smears by playing a Web-based game. Methods: The experimental system consisted of a Web-based game where online volunteers were tasked with detecting parasites in digitized blood sample images, coupled with a decision algorithm that combined the analyses from several players to produce an improved collective detection outcome. Data were collected through the MalariaSpot website. Random images of thick blood films containing Plasmodium falciparum at medium to low parasitemias, acquired by conventional optical microscopy, were presented to players. In the game, players had to find and tag as many parasites as possible in 1 minute. In the event that players found all the parasites present in the image, they were presented with a new image. In order to combine the choices of different players into a single crowd decision, we implemented an image processing pipeline and a quorum algorithm that judged a parasite as tagged when a group of players agreed on its position. Results: Over 1 month, anonymous players from 95 countries played more than 12,000 games and generated a database of more than 270,000 clicks on the test images. Results revealed that combining 22 games from nonexpert players achieved a parasite counting accuracy higher than 99%. This performance could also be obtained by combining 13 games from players trained for 1 minute. Exhaustive computations measured the parasite counting accuracy for all players as a function of the number of games considered and the experience of the players. In addition, we propose a mathematical equation that accurately models the collective parasite counting performance. Conclusions: This research validates the online gaming approach for crowdsourced counting of malaria parasites in images of thick blood films. The findings support the conclusion that nonexperts are able to rapidly learn how to identify the typical features of malaria parasites in digitized thick blood samples and that combining the analyses of several users provides parasite counting accuracy rates similar to those of expert microscopists. This experiment illustrates the potential of the crowdsourced gaming approach for performing routine malaria parasite quantification, and more generally for solving biomedical image analysis problems, with future potential for telediagnosis related to global health challenges.
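The quorum idea can be illustrated with a short, hedged sketch: clicks from different players are greedily clustered by pixel distance, and a cluster is accepted as a parasite only when enough distinct players agree. The radius and quorum values, and the clustering strategy itself, are assumptions for illustration and may differ from the published pipeline.

```python
import math

def quorum_detections(clicks, radius=15.0, quorum=3):
    """Combine player clicks on one image into collective detections.

    clicks: iterable of (player_id, x, y) tuples.
    A position counts as a detected parasite when at least `quorum` distinct
    players clicked within `radius` pixels of it.  This is a greedy-clustering
    sketch of the idea, not the actual MalariaSpot implementation.
    """
    clusters = []  # each: {"x": float, "y": float, "n": int, "players": set}
    for player, x, y in clicks:
        for c in clusters:
            if math.hypot(c["x"] - x, c["y"] - y) <= radius:
                # Incrementally update the cluster centroid.
                c["x"] = (c["x"] * c["n"] + x) / (c["n"] + 1)
                c["y"] = (c["y"] * c["n"] + y) / (c["n"] + 1)
                c["n"] += 1
                c["players"].add(player)
                break
        else:
            clusters.append({"x": x, "y": y, "n": 1, "players": {player}})
    return [(c["x"], c["y"]) for c in clusters if len(c["players"]) >= quorum]

# Example: three players agree on one parasite; a single stray click is rejected.
clicks = [("p1", 100, 100), ("p2", 104, 98), ("p3", 99, 103), ("p1", 300, 50)]
print(quorum_detections(clicks, quorum=3))  # one detection near (101, 100)
```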
Abstract:
Stochastic model updating must be considered for quantifying the uncertainties inherently existing in real-world engineering structures. By this means the statistical properties, instead of deterministic values, of structural parameters can be sought, indicating the parameter variability. However, the implementation of stochastic model updating is much more complicated than that of deterministic methods, particularly with respect to theoretical complexity and low computational efficiency. This study attempts to propose a simple and cost-efficient method by decomposing a stochastic updating process into a series of deterministic ones with the aid of response surface models and Monte Carlo simulation. The response surface models are used as surrogates for the original FE models in the interest of programming simplification, fast response computation and easy inverse optimization. Monte Carlo simulation is adopted for generating samples from the assumed or measured probability distributions of responses. Each sample corresponds to an individual deterministic inverse process predicting the deterministic values of the parameters. The parameter means and variances can then be statistically estimated from the parameter predictions obtained by running all the samples. Meanwhile, the analysis of variance approach is employed for the evaluation of parameter variability significance. The proposed method has been demonstrated first on a numerical beam and then on a set of nominally identical steel plates tested in the laboratory. It is found that, compared with existing stochastic model updating methods, the proposed method presents similar accuracy while its primary merits consist in its simple implementation and cost efficiency in response computation and inverse optimization.
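A minimal sketch of this decomposition is given below, assuming a one-parameter problem, invented quadratic response surfaces and invented measurement statistics; it only illustrates the loop of Monte Carlo sampling of responses followed by a deterministic inverse per sample, not the method as implemented in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Response-surface surrogates (assumed quadratic fits to FE results) ------
# Two structural responses (e.g., modal frequencies in Hz) expressed as
# polynomials in a single stiffness parameter k.  Coefficients are invented.
def response_surface(k):
    f1 = 10.0 + 2.0 * k - 0.05 * k**2
    f2 = 25.0 + 3.5 * k - 0.10 * k**2
    return np.array([f1, f2])

# --- Monte Carlo sampling of measured responses ------------------------------
# Assumed measured response statistics (mean and standard deviation).
meas_mean = np.array([24.0, 47.0])
meas_std = np.array([0.3, 0.5])
n_samples = 2000
samples = rng.normal(meas_mean, meas_std, size=(n_samples, 2))

# --- Deterministic inverse for each sample (1D grid search on the surrogate) -
k_grid = np.linspace(0.0, 15.0, 3001)
surface = np.stack([response_surface(k) for k in k_grid])   # shape (3001, 2)

k_estimates = np.empty(n_samples)
for i, y in enumerate(samples):
    residuals = np.sum((surface - y) ** 2, axis=1)
    k_estimates[i] = k_grid[np.argmin(residuals)]

# --- Statistical estimation of the parameter ---------------------------------
print(f"estimated mean of k:     {k_estimates.mean():.3f}")
print(f"estimated variance of k: {k_estimates.var(ddof=1):.4f}")
```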
Abstract:
Process mineralogy provides the mineralogical information required by geometallurgists to address the inherent variation of geological data. The successful beneficiation of ores mostly depends on the ability of mineral processing to be efficiently adapted to the ore characteristics, with liberation being one of the most relevant mineralogical parameters. The liberation characteristics of ores are intimately related to mineral texture. Therefore, the characterization of liberation necessarily requires the identification and quantification of those textural features with a major bearing on mineral liberation. From this point of view, grain size, bonding between mineral grains and intergrowth types are considered the most influential textural attributes. While the quantification of grain size is a usual output of current automated technologies, information about grain boundaries and intergrowth types is usually descriptive and difficult to quantify for inclusion in the geometallurgical model. Aiming at the systematic and quantitative analysis of the intergrowth types within mineral particles, a new methodology based on digital image analysis has been developed. In this work, the ability of this methodology to achieve a more complete characterization of liberation is explored by analysing chalcopyrite in the rougher concentrate of the Kansanshi copper-gold mine (Zambia). The results obtained show that the method provides valuable textural information for achieving a better understanding of mineral behaviour during concentration processes. The potential of this method is enhanced by the fact that it provides data unavailable from current technologies. This opens up new perspectives on the quantitative analysis of mineral processing performance based on textural attributes.
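As a hedged illustration of the simplest liberation metric that such image analysis can yield (the methodology above, which addresses intergrowth types, goes further), the sketch below computes the per-particle area fraction of a target mineral from a labelled particle image; the input arrays, particle labels and 90 % liberation cut-off are assumptions for the example.

```python
import numpy as np

def liberation_by_particle(particle_labels, mineral_mask, liberated_at=0.9):
    """Per-particle area fraction of a target mineral in a segmented 2D section.

    particle_labels: integer image, 0 = background, 1..N = particle IDs
                     (e.g., from an automated mineralogy system).
    mineral_mask:    boolean image, True where the target mineral (e.g.,
                     chalcopyrite) was identified.
    Returns {particle_id: area_fraction} and the IDs of particles whose
    fraction exceeds `liberated_at` (an assumed cut-off).
    """
    labels = particle_labels.ravel()
    target = mineral_mask.ravel().astype(np.int64)
    particle_area = np.bincount(labels)
    mineral_area = np.bincount(labels, weights=target)
    fractions = {}
    for pid in range(1, particle_area.size):
        if particle_area[pid] > 0:
            fractions[pid] = float(mineral_area[pid] / particle_area[pid])
    liberated = [pid for pid, f in fractions.items() if f >= liberated_at]
    return fractions, liberated

# Tiny synthetic example: two particles, one mostly target mineral, one a binary intergrowth.
labels = np.array([[1, 1, 0, 2, 2],
                   [1, 1, 0, 2, 2]])
mineral = np.array([[1, 1, 0, 1, 0],
                    [1, 0, 0, 0, 0]], dtype=bool)
fractions, liberated = liberation_by_particle(labels, mineral)
print(fractions)   # {1: 0.75, 2: 0.25}
print(liberated)   # [] (neither particle reaches the 90 % cut-off)
```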
Abstract:
In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the Building Code in our country (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geografico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by an expert committee composed of national experts from public institutions involved in the subject of seismic hazard. The PSHA (Probabilistic Seismic Hazard Assessment) method has been followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, which essentially are: 1) an updated project catalogue homogenized to Mw; 2) proposed zoning models and source characterization; 3) calibration of Ground Motion Prediction Equations (GMPEs) with actual data and development of a local model with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on hazard results was carried out in order to provide criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in figure 1, including nodes for quantifying uncertainties corresponding to: 1) the method for estimating hazard (zoning and zoneless); 2) zoning models; 3) GMPE combinations used; and 4) the regression method for estimating source parameters. In addition, the aleatory uncertainties corresponding to the magnitude of the events, the recurrence parameters and the maximum magnitude for each zone have also been considered through probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results obtained in terms of PGA and other spectral accelerations SA(T) for return periods of 475, 975 and 2475 years. Maps of the coefficient of variation (COV) are also presented to give an idea of the zones where the dispersion among results is highest and the zones where the results are robust.
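A toy sketch of the logic-tree combination step is given below, assuming invented hazard curves and weights for a single site; it only shows how branch results are weight-averaged, how the COV across branches is obtained, and how the PGA for a 475-year return period is interpolated, not the actual models or results of the study.

```python
import numpy as np

# Hypothetical hazard curves for one site from three logic-tree branches:
# annual frequency of exceedance (AFE) as a function of PGA (g).
# The curves and weights below are invented for illustration only.
pga = np.array([0.05, 0.10, 0.20, 0.30, 0.40, 0.50])           # g
afe_branches = np.array([
    [2.0e-2, 8.0e-3, 2.0e-3, 7.0e-4, 3.0e-4, 1.5e-4],           # branch 1
    [2.5e-2, 1.0e-2, 2.6e-3, 9.0e-4, 4.0e-4, 2.0e-4],           # branch 2
    [1.6e-2, 6.5e-3, 1.6e-3, 5.5e-4, 2.4e-4, 1.2e-4],           # branch 3
])
weights = np.array([0.5, 0.3, 0.2])                             # logic-tree weights

# Weighted-mean hazard curve and coefficient of variation across branches.
mean_afe = weights @ afe_branches
std_afe = np.sqrt(weights @ (afe_branches - mean_afe) ** 2)
cov = std_afe / mean_afe

# PGA with a 475-year return period (10 % exceedance probability in 50 years),
# interpolated in log-log space, where hazard curves are close to linear.
target_afe = 1.0 / 475.0
pga_475 = np.exp(np.interp(np.log(target_afe),
                           np.log(mean_afe[::-1]), np.log(pga[::-1])))
print(f"PGA(475 yr) ~ {pga_475:.3f} g")
print("COV across branches at each PGA level:", np.round(cov, 3))
```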