93 results for Data Warehousing Systems
Abstract:
The design of appropriate multifractal analysis algorithms, able to correctly characterize the scaling properties of multifractal systems from experimental, discretized data, is a major challenge in the study of such scale-invariant systems. In recent years, interest in applying the microcanonical formalism has grown, as it allows a precise localization of the fractal components as well as a statistical characterization of the system. In this paper, we deal with the specific problems that arise when systems that are strictly monofractal are analyzed using some standard microcanonical multifractal methods. We discuss the adaptations of these methods needed to give an appropriate treatment of monofractal systems.
Abstract:
We study the effects of the magnetic field on the relaxation of the magnetization of small monodomain noninteracting particles with random orientations and a distribution of anisotropy constants. Starting from a master equation, we build up an expression for the time dependence of the magnetization which takes into account thermal activation only over barriers separating energy minima, which, in our model, can be computed exactly from analytical expressions. Numerical calculations of the relaxation curves for different distribution widths, and under different magnetic fields H and temperatures T, have been performed. We show how a T ln(t/t0) scaling of the curves, at different T and for a given H, can be carried out after proper normalization of the data to the equilibrium magnetization. The resulting master curves are shown to be closely related to what we call effective energy barrier distributions, which, in our model, can be computed exactly from analytical expressions. The concept of effective distribution serves us as a basis for finding a scaling variable to scale relaxation curves at different H and a given T, thus showing that the field dependence of energy barriers can also be extracted from relaxation measurements.
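A minimal sketch of the scaling step described in this abstract, assuming relaxation curves held as NumPy arrays; the attempt time t0, the Boltzmann-constant convention, and the equilibrium magnetization m_eq used for normalization are all inputs the reader must supply:

```python
import numpy as np

def tln_scaling_variable(t, T, t0=1e-9, kB=1.0):
    """Map measurement times t at temperature T onto the
    T*ln(t/t0) scaling variable (in units of energy / kB)."""
    return kB * T * np.log(np.asarray(t) / t0)

def master_curve(curves, t0=1e-9):
    """Collapse relaxation curves measured at several temperatures
    onto one master curve. `curves` is an iterable of (t, m, T, m_eq)
    tuples; each magnetization is normalized to its equilibrium value,
    as the abstract prescribes."""
    out = []
    for t, m, T, m_eq in curves:
        x = tln_scaling_variable(t, T, t0)
        out.append((x, np.asarray(m) / m_eq))
    return out
```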
Abstract:
In the Hamiltonian formulation of predictive relativistic systems, the canonical coordinates cannot be the physical positions. The relation between them is given by the individuality differential equations. However, due to the arbitrariness in the choice of Cauchy data, there is a wide family of solutions to these equations. In general, those solutions do not satisfy the condition that the moduli of the velocities be constant, and therefore the world lines have to be reparametrized to proper time. We derive here a condition on the Cauchy data for the individuality equations which ensures the constancy of the velocity moduli and makes the reparametrization unnecessary.
Abstract:
We critically discuss relaxation experiments in magnetic systems that can be characterized in terms of an energy barrier distribution, showing that proper normalization of the relaxation data is needed whenever curves corresponding to different temperatures are to be compared. We show how these normalization factors can be obtained from experimental data by using the T ln(t/t0) scaling method without making any assumptions about the nature of the energy barrier distribution. The validity of the procedure is tested using a ferrofluid of Fe3O4 particles.
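As an illustration of why the normalization matters, a hedged sketch follows, using the common approximation that the effective energy barrier distribution is proportional to the negative derivative of the normalized magnetization with respect to the T ln(t/t0) variable; this approximation is an assumption of the sketch, not a step spelled out in the abstract:

```python
import numpy as np

def barrier_distribution(x, m):
    """Estimate an effective energy-barrier distribution as the
    negative numerical derivative of the normalized magnetization m
    with respect to the scaling variable x = T*ln(t/t0).
    Returns midpoints of x and the derivative estimate."""
    x, m = np.asarray(x), np.asarray(m)
    return x[:-1] + np.diff(x) / 2, -np.diff(m) / np.diff(x)
```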
Abstract:
The ability to entrap drugs within vehicles and subsequently release them has led to new treatments for a number of diseases. Based on an associative phase separation and interfacial diffusion approach, we developed a way to prepare DNA gel particles without adding any kind of cross-linker or organic solvent. Among the various agents studied, cationic surfactants offered particularly efficient control over the encapsulation and release of DNA from these gel particles. The driving force for this strong association is the electrostatic interaction between the two components, as induced by the entropic increase due to the release of the respective counter-ions. However, little is known about the influence of the respective counter-ions on this surfactant-DNA interaction. Here we examined the effect of different counter-ions on the formation and properties of DNA gel particles by mixing DNA (either single-stranded (ssDNA) or double-stranded (dsDNA)) with the single-chain surfactant dodecyltrimethylammonium (DTA). In particular, we used the hydrogen sulfate and trifluoromethane sulfonate anions and the two halides, chloride and bromide, as counter-ions of this surfactant. Effects on the morphology of the particles obtained, the encapsulation of DNA and its release, as well as the haemocompatibility of these particles, are presented, using the counter-ion structure and the DNA conformation as controlling parameters. Analysis of the data indicates that the degree of counter-ion dissociation from the surfactant micelles and the polar/hydrophobic character of the counter-ion are important parameters for the final properties of the particles. The stronger interaction of ssDNA with amphiphiles, compared with dsDNA, suggests the important role of hydrophobic interactions in DNA.
Abstract:
In October 1998, Hurricane Mitch triggered numerous landslides (mainly debris flows) in Honduras and Nicaragua, resulting in a high death toll and considerable damage to property. The potential application of relatively simple and affordable spatial prediction models for landslide hazard mapping in developing countries was studied. Our attention was focused on a region in NW Nicaragua, one of the most severely hit places during the Mitch event. A landslide map was obtained at 1:10 000 scale in a Geographic Information System (GIS) environment from the interpretation of aerial photographs and detailed field work. In this map the terrain failure zones were distinguished from the areas within the reach of the mobilized materials. A Digital Elevation Model (DEM) with a 20 m × 20 m pixel size was also employed in the study area. A comparative analysis was carried out between the terrain failures caused by Hurricane Mitch and a selection of four terrain factors, extracted from the DEM, which contributed to the terrain instability. Land propensity to failure was determined with the aid of a bivariate analysis and GIS tools in a terrain failure susceptibility map. In order to estimate the areas that could be affected by the path or deposition of the mobilized materials, we considered the fact that under intense rainfall events debris flows tend to travel long distances following the maximum slope and merging with the drainage network. Using the TauDEM extension for ArcGIS software, we automatically generated flow lines following the maximum slope in the DEM, starting from the areas prone to failure in the terrain failure susceptibility map. The areas crossed by the flow lines from each terrain failure susceptibility class correspond to the runout susceptibility classes represented in a runout susceptibility map. The study of terrain failure and runout susceptibility enabled us to obtain a spatial prediction for landslides, which could contribute to landslide risk mitigation.
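A minimal sketch of the downslope flow-line idea described above, assuming a DEM held as a 2-D NumPy array; this illustrates steepest-descent routing from a source cell, not the actual TauDEM implementation used in the paper:

```python
import numpy as np

def flow_line(dem, start, max_steps=10000):
    """Trace a steepest-descent path over a DEM (2-D array of
    elevations) starting from cell `start` (row, col); the path stops
    at a pit (no lower neighbour) or after max_steps moves."""
    path = [start]
    r, c = start
    window = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]
    for _ in range(max_steps):
        # Inspect the 8 neighbours and move to the lowest one.
        candidates = [(r + dr, c + dc) for dr, dc in window
                      if 0 <= r + dr < dem.shape[0] and 0 <= c + dc < dem.shape[1]]
        nr, nc = min(candidates, key=lambda ij: dem[ij])
        if dem[nr, nc] >= dem[r, c]:  # pit reached: stop routing
            break
        r, c = nr, nc
        path.append((r, c))
    return path
```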
Abstract:
This work focuses on the prediction of the two main nitrogenous variables that describe the water quality at the effluent of a Wastewater Treatment Plant. We have developed two kinds of Neural Network architectures, based on considering either only one output or, on the other hand, the usual five effluent variables that define the water quality: suspended solids, biochemical organic matter, chemical organic matter, total nitrogen and total Kjeldahl nitrogen. Two learning techniques, based on a classical adaptive gradient and on a Kalman filter, have been implemented. In order to improve generalization and performance, we have selected variables by means of genetic algorithms and fuzzy systems. The training, testing and validation sets show that the final networks are able to learn the available simulated data well enough, especially for the total nitrogen.
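A minimal sketch, under assumptions not stated in the abstract (layer sizes, synthetic data shapes), of the two output configurations compared above: a network predicting a single effluent variable versus one predicting all five jointly. scikit-learn's gradient-based training stands in for the paper's adaptive-gradient and Kalman-filter schemes:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))    # hypothetical influent measurements
y5 = rng.normal(size=(200, 5))   # the five effluent quality variables
y1 = y5[:, 3]                    # e.g. total nitrogen alone

# One network dedicated to a single target variable...
single = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000).fit(X, y1)
# ...versus one network predicting all five effluent variables at once.
joint = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000).fit(X, y5)
```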
Abstract:
In this correspondence, we propose applying the hidden Markov models (HMM) theory to the problem of blind channel estimation and data detection. The Baum–Welch (BW) algorithm, which is able to estimate all the parameters of the model, is enriched by introducing some linear constraints emerging from a linear FIR hypothesis on the channel. Additionally, a version of the algorithm that is suitable for time-varying channels is also presented. Performance is analyzed in a GSM environment using standard test channels and is found to be close to that obtained with a nonblind receiver.
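A hedged sketch of the unconstrained starting point, using the hmmlearn library's Baum–Welch (EM) training on synthetic received samples; the linear-FIR constraints and the time-varying extension introduced in the paper are not reproduced here:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
# Hypothetical received samples. With an L-tap binary channel, the
# noiseless output takes 2**L distinct levels, which is what the HMM
# states are meant to capture (here L = 2, hence 4 states).
X = rng.normal(size=(1000, 1))

model = GaussianHMM(n_components=4, covariance_type="diag", n_iter=50)
model.fit(X)               # Baum-Welch (EM) parameter re-estimation
states = model.predict(X)  # Viterbi decoding for data detection
```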
Abstract:
This paper presents the first results of a current research project on human-environmental interactions in the Montseny Massif. Our work sets out to integrate two research lines in the studied area:
- Archaeological and archaeo-morphological surveys in a lower part of the mountains, in order to characterize the evolution of the settlements and field systems.
- The geological and geomorphological characterization of the slope and terrace deposits in relation to field systems and archaeological data.
First results point to the intensive occupation of these inland areas during the Iberian and Roman periods. Post-Roman sediments show different processes of erosion.
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
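A minimal from-scratch sketch of the SOM principle applied to unfolded EEMs (each matrix flattened to a vector); this illustrates the clustering idea only, not the specific SOM implementation or grid size used in the paper:

```python
import numpy as np

def train_som(data, grid=(8, 8), n_iter=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Fit a self-organising map to `data` (n_samples x n_features),
    e.g. excitation-emission matrices unfolded into row vectors."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    w = rng.normal(size=(gx, gy, data.shape[1]))          # node weights
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy),
                                  indexing="ij"), axis=-1)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]                 # random sample
        frac = t / n_iter
        lr = lr0 * (1 - frac)                             # decaying rate
        sigma = sigma0 * (1 - frac) + 1e-3                # shrinking radius
        # Best-matching unit: node whose weights are closest to x.
        bmu = np.unravel_index(np.argmin(((w - x) ** 2).sum(-1)), (gx, gy))
        # Gaussian neighbourhood pulls nearby nodes toward the sample.
        d2 = ((coords - np.array(bmu)) ** 2).sum(-1)
        h = np.exp(-d2 / (2 * sigma ** 2))[..., None]
        w += lr * h * (x - w)
    return w
```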
Abstract:
Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements, and we illustrate a data flow proposal.
A priori parameterisation of the CERES soil-crop models and tests against several European data sets
Abstract:
Mechanistic soil-crop models have become indispensable tools to investigate the effect of management practices on the productivity or environmental impacts of arable crops. Ideally, these models may claim to be universally applicable because they simulate the major processes governing the fate of inputs such as fertiliser nitrogen or pesticides. However, because they deal with complex systems and uncertain phenomena, site-specific calibration is usually a prerequisite to ensure their predictions are realistic. This statement implies that some experimental knowledge on the system to be simulated should be available prior to any modelling attempt, and raises a tremendous limitation to practical applications of models. Because the demand for more general simulation results is high, modellers have nevertheless taken the bold step of extrapolating a model tested within a limited sample of real conditions to a much larger domain. While methodological questions are often disregarded in this extrapolation process, they are specifically addressed in this paper, in particular the issue of the models' a priori parameterisation. We thus implemented and tested a standard procedure to parameterise the soil components of a modified version of the CERES models. The procedure converts routinely available soil properties into functional characteristics by means of pedo-transfer functions. The resulting predictions of soil water and nitrogen dynamics, as well as crop biomass, nitrogen content and leaf area index, were compared to observations from trials conducted in five locations across Europe (southern Italy, northern Spain, northern France and northern Germany). In three cases, the model's performance was judged acceptable when compared to experimental errors on the measurements, based on a test of the model's root mean squared error (RMSE). Significant deviations between observations and model outputs were however noted in all sites, and could be ascribed to various model routines. In decreasing order of importance, these were: the water balance, the turnover of soil organic matter, and crop N uptake. A better match to field observations could therefore be achieved by visually adjusting related parameters, such as the field-capacity water content or the size of the soil microbial biomass. As a result, model predictions fell within the measurement errors in all sites for most variables, and the model's RMSE was within the range of published values for similar tests. We conclude that the proposed a priori method yields acceptable simulations with only a 50% probability, a figure which may be greatly increased through a posteriori calibration. Modellers should thus exercise caution when extrapolating their models to a large sample of pedo-climatic conditions for which they have only limited information.
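A minimal sketch of the kind of goodness-of-fit test mentioned above, comparing the model's root mean squared error against the experimental error on the measurements; the acceptance rule shown is an illustrative assumption, not the paper's exact statistical test:

```python
import numpy as np

def rmse(observed, simulated):
    """Root mean squared error between observations and simulations."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return float(np.sqrt(np.mean((observed - simulated) ** 2)))

def acceptable(observed, simulated, measurement_error):
    """Illustrative rule: a simulation is judged acceptable when its
    RMSE does not exceed the experimental error on the measurements."""
    return rmse(observed, simulated) <= measurement_error
```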
Abstract:
Flood simulation studies use spatial-temporal rainfall data as input to distributed hydrological models. A correct description of rainfall in space and in time contributes to improvements in hydrological modelling and design. This work is focused on the analysis of 2-D convective structures (rain cells), whose contribution is especially significant in most flood events. The objective of this paper is to provide statistical descriptors and distribution functions for the characteristics of the convective structures of precipitation systems producing floods in Catalonia (NE Spain). To this end, heavy rainfall events recorded between 1996 and 2000 have been analysed. By means of weather radar, and applying 2-D radar algorithms, a distinction between convective and stratiform precipitation is made. These data are introduced into and analysed with a GIS. In a first step, different groups of connected pixels with convective precipitation are identified. Only convective structures with an area greater than 32 km² are selected. Then, geometric characteristics (area, perimeter, orientation and dimensions of the ellipse) and rainfall statistics (maximum, mean, minimum, range, standard deviation, and sum) of these structures are obtained and stored in a database. Finally, descriptive statistics for selected characteristics are calculated and statistical distributions are fitted to the observed frequency distributions. Statistical analyses reveal that the Generalized Pareto distribution for the area, and the Generalized Extreme Value distribution for the perimeter, dimensions, orientation and mean areal precipitation, are the statistical distributions that best fit the observations. The statistical descriptors and the probability distribution functions obtained are of direct use as input to spatial rainfall generators.
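A hedged sketch of the distribution-fitting step, using synthetic cell characteristics in NumPy arrays; scipy's maximum-likelihood fit and a Kolmogorov-Smirnov check stand in for whatever fitting and testing procedures the paper actually used:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
areas = 32 + rng.pareto(2.0, size=500) * 32    # hypothetical cell areas (km^2)
perimeters = stats.genextreme.rvs(-0.1, loc=40, scale=10,
                                  size=500, random_state=1)

# Fit a Generalized Pareto distribution to the cell areas
# (threshold fixed at the 32 km^2 selection cut-off)...
gp_params = stats.genpareto.fit(areas, floc=32.0)
# ...and a Generalized Extreme Value distribution to the perimeters.
gev_params = stats.genextreme.fit(perimeters)

# Goodness of fit via a Kolmogorov-Smirnov test (illustrative choice).
print(stats.kstest(areas, "genpareto", args=gp_params))
print(stats.kstest(perimeters, "genextreme", args=gev_params))
```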