998 results for decomposition techniques


Relevance: 20.00%

Abstract:

The aim of the thesis was to design and develop spatially adaptive denoising techniques with edge and feature preservation, for images corrupted with additive white Gaussian noise and for SAR images affected by speckle noise. Image denoising is a well-researched topic with multifaceted applications in day-to-day life. Image denoising based on multiresolution analysis using the wavelet transform has received considerable attention in recent years. The directionlet-based denoising schemes presented in this thesis are effective in preserving image-specific features such as edges and contours during denoising. The scope of this research remains open in areas such as further optimization for speed and extension of the techniques to related problems such as colour and video denoising; such studies would further augment the practical use of these techniques.
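
As an illustration of the multiresolution wavelet denoising that the thesis builds on, the following is a minimal sketch of conventional wavelet soft-threshold denoising for additive white Gaussian noise using the PyWavelets library; it does not reproduce the spatially adaptive, directionlet-based schemes developed in the thesis.

```python
# Minimal sketch of multiresolution (wavelet) soft-threshold denoising for
# additive white Gaussian noise. Illustrative only; the thesis's spatially
# adaptive, directionlet-based schemes are not reproduced here.
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db4", level=3):
    # Multi-level 2-D wavelet decomposition.
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Estimate the noise standard deviation from the finest diagonal detail
    # band via the median absolute deviation (a common heuristic).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    # Universal threshold (Donoho-Johnstone): sigma * sqrt(2 ln N).
    thr = sigma * np.sqrt(2.0 * np.log(image.size))
    # Soft-threshold all detail subbands, keep the approximation untouched.
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, thr, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

# Example usage on a hypothetical noisy array with values in [0, 255]:
# noisy = clean + np.random.normal(0, 20, clean.shape)
# restored = wavelet_denoise(noisy)
```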

Relevance: 20.00%

Abstract:

In agro-pastoral systems of the semi-arid West African Sahel, targeted application of ruminant manure to cropland is a widespread practice to maintain soil productivity. However, studies exploring the decomposition and mineralisation processes of manure under farmers' conditions are scarce. The present research in south-west Niger was undertaken to examine the role of micro-organisms and meso-fauna in the in situ release rates of nitrogen (N), phosphorus (P) and potassium (K) from cattle and sheep-goat manure collected from village corrals during the rainy season. The results show that (1) macro-organisms played a dominant role in the initial phase of manure decomposition; (2) manure decomposition was faster on crusted than on sandy soils; (3) throughout the study, N and P release rates closely followed dry matter decomposition; and (4) during the first 6 weeks after application, the K concentration in the manure declined much faster than that of N or P. At the applied dry matter rate of 18.8 Mg ha^-1, the quantities of N, P and K released from the manure during the rainy season were up to 10-fold larger than the annual nutrient uptake of pearl millet (Pennisetum glaucum L.), the dominant crop in the traditional agro-pastoral systems. The results indicate considerable nutrient losses with the scarce but heavy rainfall events, which could be alleviated by smaller rates of manure application. Those, however, would require a more labour-intensive system of corralling or manure distribution.

Relevance: 20.00%

Abstract:

Soil organic matter (SOM) vitally impacts all soil functions and plays a key role in the global carbon (C) cycle. More than 70% of the terrestrial C stocks that participate in the active C cycle are stored in the soil. Therefore, quantitative knowledge of the rates of C incorporation into SOM fractions of different residence time is crucial to understand and predict the sequestration and stabilization of soil organic carbon (SOC). Consequently, there is a need for fractionation procedures capable of isolating functional SOM fractions, i.e. fractions that are defined by their stability. The literature generally refers to three main mechanisms of SOM stabilization: protection of SOM from decomposition by (i) its structural composition, i.e. recalcitrance, (ii) spatial inaccessibility and/or (iii) interaction with soil minerals and metal ions. One of the difficulties in developing fractionation procedures for the isolation of functional SOM fractions is the marked heterogeneity of the soil environment, with several stabilization mechanisms often operating simultaneously in soils and soil horizons of different texture and mineralogy.

The overall objective of the present thesis was to evaluate current fractionation techniques and to gain a better understanding of the factors governing SOM sequestration and stabilization. The first part of this study is devoted to the structural composition of SOM. Using 13C cross-polarization magic-angle spinning (CPMAS) nuclear magnetic resonance (NMR) spectroscopy, (i) the effect of land use on SOM composition was investigated, and (ii) it was examined whether SOM composition contributes to the differing stability of SOM in density and aggregate fractions. The second part of the present work deals with the mineral-associated SOM fraction. The aim was (iii) to evaluate the suitability of chemical fractionation procedures used in the literature for the isolation of stable SOM pools (stepwise hydrolysis, treatments with oxidizing agents such as Na2S2O8, H2O2 and NaOCl, as well as demineralization of the residue obtained by the NaOCl treatment using HF (NaOCl+HF)) on the basis of pool sizes and 13C and 14C data. Further, (iv) the isolated SOM fractions were compared to the inert organic matter (IOM) pool obtained for the investigated soils using the Rothamsted Carbon Model and isotope data, in order to see whether the tested chemical fractionation methods produce SOM fractions capable of representing this pool. Besides chemical fractionation, (v) the suitability of thermal oxidation at different temperatures for obtaining stable SOC pools was evaluated. Finally, (vi) the short-term aggregate dynamics and the factors that impact macroaggregate formation and C stabilization were investigated by means of an incubation study using treatments with and without application of 15N-labeled maize straw of different degradability (leaves and coarse roots); all treatments were conducted with and without the addition of fungicide.

Two study sites with different soil properties and land managements were chosen for these investigations. The first one, located at Rotthalmünster, is a Stagnic Luvisol (silty loam) under different land use regimes; the Ah horizons of a spruce forest and of continuous grassland and the Ap and E horizons of two plots with arable crops (continuous maize and wheat cropping) were examined. The soil of the second study site, located at Halle, is a Haplic Phaeozem (loamy sand), where the Ap horizons of two plots with arable crops (continuous maize and rye cropping) were investigated. Both study sites had a C3-/C4-vegetational change on the maize plot, which allowed the incorporation of the younger, maize-derived C into different SOM fractions to be traced and apparent C turnover times of these fractions to be calculated. The Halle site is located near a train station and industrial areas, which has caused contamination with high amounts of fossil C.

The investigation of aggregate and density fractions by 13C CPMAS NMR spectroscopy revealed that density fractionation isolated SOM fractions of different composition. The consumption of a considerable part (10–20%) of the easily available O-alkyl-C and the selective preservation of the more recalcitrant alkyl-C when passing from litter to the different particulate organic matter (POM) fractions suggest that density fractionation was able to isolate SOM fractions with different degrees of decomposition. The spectra of the aggregate fractions resembled those of the mineral-associated SOM fraction obtained by density fractionation, and no considerable differences were observed between aggregate size classes. Comparison of plant litter, density and aggregate size fractions from soil under different land use showed that the type of land use markedly influenced the composition of SOM. While SOM of the acid forest soil was characterized by a large content (> 50%) of POM, which contained high amounts of spruce-litter-derived alkyl-C, the organic matter in the biologically more active grassland and arable soils was dominated by mineral-associated SOM (> 95%). This SOM fraction comprised greater proportions of aryl- and carbonyl-C and is considered to contain a higher amount of microbially derived organic substances. Land use can thus alter both the structure and the stability of SOM fractions.

All applied chemical treatments induced considerable SOC losses (> 70–95% of mineral-associated SOM) in the investigated soils. The proportion of residual C after chemical fractionation was largest in the arable Ap and E horizons and increased with decreasing C content of the initial SOC after stepwise hydrolysis as well as after the oxidative treatments with H2O2 and Na2S2O8. This is to be expected for a functionally stable pool of SOM, because it is assumed that the more easily available part of SOC is consumed first if C inputs decrease. All chemical treatments led to a preferential loss of the younger, maize-derived SOC, but this was most pronounced after the treatments with Na2S2O8 and H2O2. After all chemical fractionations, the mean 14C ages of SOC were higher than in the mineral-associated SOM fraction for both study sites and increased in the order NaOCl < NaOCl+HF ≤ stepwise hydrolysis << H2O2 ≈ Na2S2O8. The results suggest that all treatments were capable of isolating a more stable SOM fraction, but the treatments with H2O2 and Na2S2O8 were the most efficient ones. However, none of the chemical fractionation methods was able to match the IOM pool calculated using the Rothamsted Carbon Model and isotope data.

In the evaluation of thermal oxidation for obtaining stable C fractions, SOC losses increased with temperature from 24–48% (200°C) to 100% (500°C). In the Halle maize Ap horizon, losses of the young, maize-derived C were considerably higher than losses of the older C3-derived C, leading to an increase in the apparent C turnover time from 220 years in mineral-associated SOC to 1158 years after thermal oxidation at 300°C. Most likely, the preferential loss of maize-derived C in the Halle soil was caused by the presence of the high amounts of fossil C mentioned above, which make up a relatively large, thermally stable C3-C pool in this soil. This agrees with the lower overall SOC losses for the Halle Ap horizon compared with the Rotthalmünster Ap horizon. In the Rotthalmünster soil, only slightly more maize-derived than C3-derived SOC was removed by thermal oxidation. Apparent C turnover times increased slightly from 58 years in mineral-associated SOC to 77 years after thermal oxidation at 300°C in the Rotthalmünster Ap horizon, and from 151 to 247 years in the Rotthalmünster E horizon. This led to the conclusion that thermal oxidation of SOM was not capable of isolating SOM fractions of considerably higher stability.

The incubation experiment showed that macroaggregates develop rapidly after the addition of easily available plant residues. Within the first four weeks of incubation, maximum aggregation was reached in all treatments without addition of fungicide. The formation of water-stable macroaggregates was related to the size of the microbial biomass pool and its activity. Furthermore, fungi were found to be crucial for the development of soil macroaggregates, as the formation of water-stable macroaggregates was significantly delayed in the fungicide-treated soils. The C concentration in the obtained aggregate fractions decreased with decreasing aggregate size class, which is in line with the aggregate hierarchy postulated by several authors for soils with SOM as the major binding agent. Macroaggregation involved the incorporation of large amounts of maize-derived organic matter, but macroaggregates did not play the most important role in the stabilization of maize-derived SOM because of their relatively low share of the soil mass (less than 10%). Furthermore, the maize-derived organic matter was quickly incorporated into all aggregate size classes. The microaggregate fraction stored the largest quantities of maize-derived C and N; up to 70% of the residual maize-C and -N were stored in this fraction.
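
The apparent C turnover times quoted above are derived from the C3-/C4-vegetation change, presumably along the lines of the standard two-end-member 13C natural-abundance mixing approach. A minimal sketch of that calculation follows; all numerical values are illustrative placeholders, not the study's measurements.

```python
# Minimal sketch of the standard two-end-member 13C mixing calculation used
# with C3-/C4-vegetation changes, and the apparent turnover time derived from
# it. The delta values and elapsed time below are hypothetical.
import math

def maize_derived_fraction(delta_sample, delta_c3_reference, delta_maize):
    """Fraction of C derived from the C4 (maize) source."""
    return (delta_sample - delta_c3_reference) / (delta_maize - delta_c3_reference)

def apparent_turnover_time(fraction_new, years_since_change):
    """Apparent turnover time assuming first-order replacement of old C."""
    return -years_since_change / math.log(1.0 - fraction_new)

f = maize_derived_fraction(delta_sample=-24.0,        # hypothetical bulk SOC
                           delta_c3_reference=-27.0,  # hypothetical C3 reference soil
                           delta_maize=-12.0)         # typical C4 plant value
print(round(f, 2))                                                # ~0.2
print(round(apparent_turnover_time(f, years_since_change=25)))    # ~112 years
```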

Relevance: 20.00%

Abstract:

We present a new algorithm called TITANIC for computing concept lattices. It is based on data mining techniques for computing frequent itemsets. The algorithm is experimentally evaluated and compared with B. Ganter's Next-Closure algorithm.
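
For readers unfamiliar with concept lattices, the sketch below illustrates the derivation (closure) operator on attribute sets that algorithms such as TITANIC and Next-Closure are built around, using a hypothetical toy formal context; it is not the TITANIC algorithm itself.

```python
# Minimal sketch of the derivation (closure) operator on attribute sets that
# concept-lattice algorithms such as TITANIC and Next-Closure rely on. The
# toy formal context below is hypothetical.
context = {                      # object -> set of attributes it has
    "g1": {"a", "b"},
    "g2": {"a", "b", "c"},
    "g3": {"b", "c"},
}
all_attributes = set.union(*context.values())

def extent(attributes):
    """A': all objects that have every attribute in the set."""
    return {g for g, attrs in context.items() if attributes <= attrs}

def intent(objects):
    """B': all attributes shared by every object in the set."""
    attrs = set(all_attributes)
    for g in objects:
        attrs &= context[g]
    return attrs

def closure(attributes):
    """A'': the smallest concept intent containing the attribute set."""
    return intent(extent(attributes))

print(sorted(closure({"a"})))    # ['a', 'b']: every object with 'a' also has 'b'
```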

Relevance: 20.00%

Abstract:

The increasing interconnection of information and communication systems leads to further growth in complexity and thus also to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to provide adequate protection against intrusion attempts on IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyse information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognise new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volumes of network data and in the development of an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework that consists of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on the Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of normal network behaviour (NNB) and an update model. Within OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events; these are further analysed and converted into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is investigated in depth and substantially extended. Several approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growth topology is increased by novel approaches for initialising the weight vectors and by strengthening the winner neurons, and a self-adaptive procedure is introduced so that the model can be updated continuously. In addition, the main task of the NNB model is to further examine the unknown connections flagged by the EGHSOM and to verify whether they are normal. However, due to the concept-drift phenomenon the network traffic changes constantly, which leads to non-stationary network data in real time; this phenomenon is handled by the update model. The EGHSOM model can detect new anomalies effectively, and the NNB model adapts optimally to changes in the network data. In the experimental evaluations the framework showed promising results. In the first experiment the framework was evaluated in offline mode: OptiFilter was assessed with offline, synthetic and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy.
In the second experiment the framework was deployed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely. The comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all of them. This can be attributed to the following key points: the processing of the collected network data, the best overall performance (e.g. overall accuracy), the detection of unknown connections, and a real-time intrusion detection model.
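
The EGHSOM classifier builds on the self-organizing map. The following is a minimal sketch of a single SOM training step (best-matching-unit selection and neighbourhood update) on a hypothetical connection vector; the growing hierarchy, confidence-margin threshold and update model described in the dissertation are not reproduced.

```python
# Minimal sketch of a single self-organizing-map training step (best-matching
# unit selection and neighbourhood update). The EGHSOM extensions described in
# the dissertation are not reproduced; this only illustrates the underlying
# SOM mechanics on a hypothetical feature vector.
import numpy as np

rng = np.random.default_rng(0)
grid_h, grid_w, dim = 4, 4, 8          # hypothetical map size / feature length
weights = rng.normal(size=(grid_h, grid_w, dim))

def train_step(x, weights, learning_rate=0.1, radius=1.0):
    # 1. Winner (best-matching unit): neuron with minimal Euclidean distance.
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # 2. Neighbourhood update: pull the winner and its neighbours towards x,
    #    weighted by a Gaussian of the grid distance to the winner.
    rows, cols = np.indices((weights.shape[0], weights.shape[1]))
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2.0 * radius ** 2))[..., None]
    return weights + learning_rate * h * (x - weights), bmu

# One training step with a hypothetical 8-dimensional connection vector.
connection_vector = rng.normal(size=dim)
weights, winner = train_step(connection_vector, weights)
print(winner)
```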

Relevance: 20.00%

Abstract:

An English teaching unit built around a cross-curricular theme: racism. The theme creates a positive climate of respect and collaboration that facilitates teamwork, and it highlights the role of the foreign language as an instrument of communication and cooperation between different countries and peoples. The unit covers the four communicative skills (listening, speaking, reading and writing) for the upper levels of secondary education, using authentic audiovisual materials not created by the teachers for the occasion (the audiovisual publication Speak-up, TIME magazine, etc.).

Relevance: 20.00%

Abstract:

Signalling off-chip requires significant current. As a result, a chip's power-supply current changes drastically during certain output-bus transitions. These current fluctuations cause a voltage drop between the chip and the circuit board due to the parasitic inductance of the power-supply package leads. Digital designers often go to great lengths to reduce this "transmitted" noise. Cray, for instance, carefully balances output signals using a technique called differential signalling to guarantee that a chip has constant output current. Transmitted-noise reduction costs Cray a factor of two in output pins and wires. Coding achieves similar results at lower cost.
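
The specific codes studied in the paper are not given in the abstract; as one well-known example of how coding can bound simultaneous output switching at the cost of a single extra wire, the sketch below implements bus-invert coding for a hypothetical 8-bit bus.

```python
# Minimal sketch of bus-invert coding, one well-known way to bound the number
# of simultaneously switching outputs (and hence supply-current spikes) at the
# cost of a single extra "invert" line. This illustrates the general coding
# idea, not the specific codes studied in the paper.
def bus_invert_encode(prev_bus, data, width=8):
    """Return (bus_value, invert_flag) minimising transitions vs. prev_bus."""
    transitions = bin((prev_bus ^ data) & ((1 << width) - 1)).count("1")
    if transitions > width // 2:
        # Sending the complement flips fewer wires; assert the invert line.
        return (~data) & ((1 << width) - 1), 1
    return data, 0

def bus_invert_decode(bus_value, invert_flag, width=8):
    return (~bus_value) & ((1 << width) - 1) if invert_flag else bus_value

prev = 0b0000_0000
word = 0b1111_1110                       # would flip 7 of 8 wires if sent raw
sent, inv = bus_invert_encode(prev, word)
print(bin(sent), inv)                    # 0b1 1 -> only 1 data wire flips
assert bus_invert_decode(sent, inv) == word
```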

Relevance: 20.00%

Abstract:

This paper presents an image-based rendering system using algebraic relations between different views of an object. The system uses pictures of an object taken from known positions. Given three such images, it can generate "virtual" ones as the object would look from any position near the ones that the input images were taken from. The extrapolation from the example images can be up to about 60 degrees of rotation. The system is based on the trilinear constraints that bind any three views of an object. As a side result, we propose two new methods for camera calibration; we developed and used one of them. We implemented the system and tested it on real images of objects and faces. We also show experimentally that even when only two images taken from unknown positions are given, the system can be used to render the object from other viewpoints, as long as we have a good estimate of the internal parameters of the camera used and we are able to find good correspondences between the example images. In addition, we present the relation between these algebraic constraints and a factorization method for shape and motion estimation. As a result, we propose a method for motion estimation in the special case of orthographic projection.
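
To illustrate the factorization method mentioned at the end of the abstract, the sketch below performs a rank-3 factorization of a measurement matrix of tracked points under orthographic projection, in the spirit of Tomasi-Kanade; the synthetic data and the omitted metric-upgrade step are assumptions for illustration, not the paper's actual procedure.

```python
# Minimal sketch of rank-3 factorization of a measurement matrix under
# orthographic projection (the kind of shape-and-motion factorization the
# abstract relates to its trilinear constraints). Synthetic placeholder data.
import numpy as np

rng = np.random.default_rng(1)
F, P = 5, 20                                   # frames and tracked points
S_true = rng.normal(size=(3, P))               # placeholder 3-D shape
M_true = rng.normal(size=(2 * F, 3))           # per-frame affine projection rows
t_true = rng.normal(size=(2 * F, 1))           # per-frame image translation
W = M_true @ S_true + t_true                   # 2F x P measurement matrix

# 1. Centre each row to remove the per-frame translation.
W_centred = W - W.mean(axis=1, keepdims=True)

# 2. Rank-3 factorization via SVD: W_centred ~ M @ S.
U, s, Vt = np.linalg.svd(W_centred, full_matrices=False)
M = U[:, :3] * np.sqrt(s[:3])                  # motion (camera) matrix, 2F x 3
S = np.sqrt(s[:3])[:, None] * Vt[:3]           # shape, 3 x P, up to an affine
                                               # ambiguity (metric upgrade omitted)

print(np.allclose(W_centred, M @ S))           # True: exact rank-3 recovery
```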

Relevance: 20.00%

Abstract:

The main instrument used in psychological measurement is the self-report questionnaire. One of its major drawbacks, however, is its susceptibility to response biases. A known strategy to control these biases has been the use of so-called ipsative items. Ipsative items require the respondent to make between-scale comparisons within each item; the selected option determines to which scale the weight of the answer is attributed. Consequently, in questionnaires consisting only of ipsative items, every respondent is allotted the same total score, which each can distribute differently over the scales. This type of response format therefore yields data that can be considered compositional from its inception. Methodologically oriented psychologists have heavily criticized this item format, since the resulting data are marked by the associated unfavourable statistical properties. Nevertheless, clinicians have kept using these questionnaires to their satisfaction. This investigation therefore aims to evaluate both positions and addresses the similarities and differences between the two data collection methods. The ultimate objective is to formulate a guideline on when to use which type of item format. The comparison is based on data obtained with both an ipsative and a normative version of three psychological questionnaires, which were administered to 502 first-year psychology students according to a balanced within-subjects design. Previous research only compared the direct ipsative scale scores with the derived ipsative scale scores. The use of compositional data analysis techniques also enables one to compare derived normative score ratios with direct normative score ratios. The addition of this second comparison not only offers the advantage of a better-balanced research strategy; in principle it also allows for parametric testing in the evaluation.
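
A minimal sketch of the centred log-ratio (clr) transform, a standard tool in compositional data analysis for moving constant-sum (ipsative) scores into an unconstrained space, is shown below with hypothetical scale scores; the questionnaires and analyses of the study itself are not reproduced.

```python
# Minimal sketch of the centred log-ratio (clr) transform used in
# compositional data analysis to move constant-sum (ipsative) scores into an
# unconstrained space where standard multivariate statistics apply. The score
# vector below is hypothetical.
import numpy as np

def closure(x):
    """Rescale a positive vector so that its parts sum to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centred log-ratio transform: log of parts over their geometric mean."""
    x = closure(x)
    return np.log(x) - np.log(x).mean()

ipsative_scores = [12, 30, 18]        # hypothetical scores on three scales,
                                      # constant total enforced by the format
print(np.round(clr(ipsative_scores), 3))
print(np.round(clr(ipsative_scores).sum(), 10))   # clr components sum to 0
```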

Relevance: 20.00%

Abstract:

In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under this unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002–03. The aim of these surveys is to understand human behaviour and people's lifestyles. Time allocation data are compositional in nature, that is, they are subject to non-negativity and constant-sum constraints, so standard multivariate techniques cannot be applied to them directly. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where objects are allocated to only a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Specifically, the probabilistic fuzzy c-means algorithm is adapted to the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn. Key words: time use data; fuzzy clustering; FCM; simplex space; Aitchison distance
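
The sketch below shows the core probabilistic fuzzy c-means updates (centroids and memberships) on toy Euclidean data; the study's adaptation to compositional time-use data with the Aitchison distance on the simplex is not reproduced.

```python
# Minimal sketch of the probabilistic fuzzy c-means updates (centroids and
# memberships). Toy 2-D data and plain Euclidean distance are used here; the
# study's adaptation to compositional time-use data (simplex geometry,
# Aitchison distance) is not reproduced.
import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(2, 0.3, (20, 2))])
c, m = 2, 2.0                                    # number of clusters, fuzzifier
U = rng.dirichlet(np.ones(c), size=len(X))       # initial fuzzy memberships

for _ in range(50):
    # Centroid update: membership-weighted means of the data points.
    W = U ** m
    centroids = (W.T @ X) / W.sum(axis=0)[:, None]
    # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
    ratio = d[:, :, None] / d[:, None, :]
    U = 1.0 / (ratio ** (2.0 / (m - 1))).sum(axis=2)

print(np.round(U[:3], 3))                        # soft memberships, first 3 points
```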

Relevance: 20.00%

Abstract:

In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA; Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk petrography and heavy-mineral analyses of the Pleistocene sands and of 250 major and minor modern rivers draining the southern flank of the Alps from west to east (Garzanti et al., 2004, 2006). Prior to the onset of the major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin by a trunk river flowing longitudinally, parallel to the South-Alpine belt (Vezzoli and Garzanti, 2008). This scenario changed rapidly during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al., 2003). PCA and similarity analysis of the core samples show that the longitudinal trunk river was at this time shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, and glacial sediments carried by Alpine valley glaciers invaded the alluvial plain. Key words: detrital modes; modern sands; provenance; principal component analysis; similarity; Canberra distance; palaeodrainage
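
As a small illustration of the similarity measure named in the keywords, the sketch below computes the Canberra distance between two hypothetical petrographic modes; the actual data set and PCA workflow of the study are not reproduced.

```python
# Minimal sketch of the Canberra distance named in the keywords, applied to
# two hypothetical petrographic modes (percentages of quartz, feldspar and
# lithic grains). Illustrative only; the study's data are not reproduced.
import numpy as np

def canberra(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    num, den = np.abs(x - y), np.abs(x) + np.abs(y)
    # Terms with a zero denominator (both parts absent) contribute nothing.
    return float(np.sum(np.where(den > 0, num / den, 0.0)))

core_sample = [55.0, 25.0, 20.0]     # hypothetical Q/F/L mode of a core sand
modern_river = [60.0, 20.0, 20.0]    # hypothetical mode of a modern river sand
print(round(canberra(core_sample, modern_river), 3))
```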

Relevance: 20.00%

Abstract:

In practice, the performance of analytical redundancy for fault detection and diagnosis is often degraded by uncertainties prevailing not only in the system model but also in the measurements. In this paper, the problem of fault detection is stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem can be solved using modal interval analysis and consistency techniques. Consistency techniques are then shown to be particularly efficient for checking the consistency of the analytical redundancy relations (ARRs) in the presence of uncertain measurements and parameters. The work presented in this paper shows that consistency techniques can be used to increase the performance of a robust fault detection tool based on interval arithmetic. The proposed method is illustrated using a nonlinear dynamic model of a hydraulic system.
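
A minimal sketch of the idea of testing an analytical redundancy relation for consistency under interval-valued measurements is given below, using plain interval arithmetic and a hypothetical single-tank ARR; the paper's modal interval analysis and consistency techniques are not reproduced.

```python
# Minimal sketch of an interval consistency test on an analytical redundancy
# relation (ARR): a fault is flagged when zero falls outside the residual
# interval computed from uncertain measurements and parameters. Plain interval
# arithmetic only; the single-tank ARR and all numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, o):  return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):  return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))
    def contains(self, v): return self.lo <= v <= self.hi

# ARR for a single tank: r = q_in - q_out - A * dh/dt  (should contain 0).
q_in  = Interval(1.90, 2.10)      # inflow measurement with uncertainty
q_out = Interval(0.95, 1.05)      # outflow measurement with uncertainty
A     = Interval(0.98, 1.02)      # uncertain tank cross-section
dh_dt = Interval(0.90, 1.00)      # estimated level derivative

residual = q_in - q_out - A * dh_dt
print(residual)                                # interval of the ARR residual
print("consistent (no fault):", residual.contains(0.0))
```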