875 results for ENVIRONMENT DATA
Abstract:
In the present study we use multivariate analysis techniques to discriminate signal from background in the fully hadronic decay channel of ttbar events. We give a brief introduction to the role of the top quark in the Standard Model and a general description of the CMS experiment at the LHC. We used the CMS computing and software infrastructure to generate and prepare the data samples used in this analysis. We tested the performance of three different classifiers applied to our data samples and used the selection obtained with the Multi-Layer Perceptron classifier to estimate the statistical and systematic uncertainty on the cross-section measurement.
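As a hedged illustration of the classification step only (not the actual CMS analysis, which uses dedicated physics software and real event variables), the sketch below trains a multi-layer perceptron to separate two synthetic event classes described by two made-up discriminating variables:

```python
# Toy sketch: an MLP separating "signal" from "background" events.
# The two features are synthetic stand-ins for event-shape variables;
# none of this data comes from the thesis.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
signal = rng.normal(loc=[0.6, 0.3], scale=0.1, size=(n, 2))
background = rng.normal(loc=[0.4, 0.2], scale=0.1, size=(n, 2))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = signal, 0 = background

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```

The classifier's per-event output probability can then be cut on to define a selection, as done with the MLP in the text.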
Abstract:
This study analyses rural landscape changes, focusing in particular on the driving forces acting on the rural built environment through a statistical spatial model implemented with GIS techniques. The study of landscape changes is essential for informed decision making in land planning. A literature review reveals a general lack of studies dealing with the modelling of the rural built environment, so a theoretical modelling approach for this purpose is needed. Advances in technology and modernity in building construction and agriculture have gradually changed the rural built environment. In addition, urbanization has driven the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two main transformation dynamics affecting the rural built environment can be observed: the conversion of rural buildings and the increase in building numbers. The specific aim of this study is to propose a methodology for developing a spatial model that identifies the driving forces behind building allocation. Indeed, one of the most concerning dynamics today is the irrational expansion of building sprawl across the landscape. The proposed methodology comprises several conceptual steps covering the different aspects of developing a spatial model: the selection of a response variable that best describes the phenomenon under study, the identification of possible driving forces, the sampling methodology for data collection, the choice of the most suitable algorithm in relation to the statistical theory and method used, and the calibration and evaluation of the model.
A different combination of factors in the various parts of the territory generated more or less favourable conditions for building allocation, and the existence of buildings is the evidence of such an optimum. Conversely, the absence of buildings expresses a combination of agents unsuitable for building allocation. Presence or absence of buildings can therefore be adopted as indicators of these driving conditions, since they express the action of the driving forces in the land suitability sorting process. The existence of a correlation between site selection and hypothetical driving forces, evaluated by means of modelling techniques, provides evidence of which driving forces are involved in the allocation dynamic and insight into their level of influence on the process. GIS spatial analysis tools make it possible to associate presence and absence with point features, generating a point process. Presence or absence of buildings at given site locations expresses the interaction of these driving factors. Presence points represent the locations of real existing buildings, whereas absence points represent locations where buildings do not exist and are therefore generated by a stochastic mechanism. Possible driving forces are selected, and the existence of a causal relationship with building allocation is assessed through a spatial model. Empirical statistical models provide a mechanism for explanatory variable analysis and for identifying the key driving variables behind the site selection process for new building allocation. The model developed following this methodology is applied to a case study to test its validity. The study area is the New District of Imola, characterized by a prevailing agricultural production vocation and by intensive transformation dynamics.
The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The model is calibrated on spatial data covering the periurban and rural parts of the study area over the 1975-2005 period by means of a generalised linear model. The resulting output of the model fit is a continuous grid surface whose cells assume values ranging from 0 to 1, expressing the probability of building occurrence across the rural and periurban parts of the study area. The response variable thus assesses the changes in the rural built environment that occurred in this time interval and is correlated with the selected explanatory variables by means of a generalized linear model using logistic regression. Comparing the probability map obtained from the model with the actual rural building distribution in 2005, the interpretative capability of the model can be evaluated. The proposed model can also be applied to interpret trends in other study areas and over different time intervals, depending on data availability. The use of suitable data in terms of time, information, and spatial resolution, and the costs of data acquisition, pre-processing, and survey, are among the most critical aspects of model implementation. Future in-depth studies could use the proposed model to predict short- to medium-range future scenarios for the distribution of the rural built environment in the study area. Predicting future scenarios requires assuming that the driving forces do not change and that their levels of influence within the model remain close to those assessed for the calibration time interval.
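The calibration step described above can be sketched as follows. This is a minimal illustration only: the covariate names (terrain slope, distance to roads) and all numbers are hypothetical placeholders, not the thesis's actual predictors or data, and the presence/absence points are simulated.

```python
# Logistic-regression GLM linking building presence/absence (response)
# to two hypothetical driving-force covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
slope = rng.uniform(0, 30, n)        # terrain slope (degrees), illustrative
road_dist = rng.uniform(0, 2000, n)  # distance to nearest road (m), illustrative

# Synthetic "truth": buildings favour flat land close to roads.
logit = 1.5 - 0.08 * slope - 0.002 * road_dist
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([slope, road_dist])
model = LogisticRegression(max_iter=1000).fit(X, presence)

# The fitted model yields a probability in [0, 1] for every location,
# analogous to the continuous probability grid described in the text.
prob = model.predict_proba(X)[:, 1]
print(prob.min(), prob.max())
```

Applied over a raster of cells instead of sample points, the same `predict_proba` call produces the 0-1 probability surface that is compared with the observed 2005 building distribution.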
Abstract:
Smart Environments are currently considered a key factor in connecting the physical world with the information world. A Smart Environment can be defined as the combination of a physical environment, an infrastructure for data management (called a Smart Space), a collection of embedded systems gathering heterogeneous data from the environment, and a connectivity solution to convey these data to the Smart Space. With this vision, any application that takes advantage of the environment can be devised without needing to access it directly, since all the information is stored in the Smart Space in an interoperable format. Moreover, according to this vision, for each entity populating the physical environment (users, objects, devices, environments) the following questions arise: “Who?”, i.e. which entities should be identified? “Where?”, i.e. where are these entities located in physical space? And “What?”, i.e. which attributes and properties of the entities should be stored in the Smart Space in a machine-understandable format, in the sense that their meaning is explicitly defined and all the data are linked together so that they can be automatically retrieved by interoperable applications? On this basis, location detection is a necessary step in the creation of Smart Environments. If the addressed entity is a user and the environment is a generic one, a meaningful way to assign the position is through a Pedestrian Tracking System. In this work two solutions for this type of system are proposed and compared; one of them was studied and developed in all its aspects during the doctoral period. The work also investigates the problem of creating and managing the Smart Environment. The proposed solution is to create, by means of natural interactions, links between objects and between objects and their environment, through the use of specific devices, i.e. Smart Objects.
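The "Who?/Where?/What?" idea can be made concrete with a toy triple store. This is an illustration only: a real Smart Space would use an interoperable standard such as RDF, and all the entity names below are invented.

```python
# Toy Smart Space: each fact about an entity is a (subject, predicate, object)
# triple, so any application can query the shared store without accessing
# the physical environment directly.
smart_space = set()

def describe(entity, attribute, value):
    """Store one triple in the Smart Space."""
    smart_space.add((entity, attribute, value))

describe("user:alice", "rdf:type", "Person")         # Who?
describe("user:alice", "locatedIn", "room:kitchen")  # Where?
describe("user:alice", "hasRole", "resident")        # What?

# Interoperable retrieval: any application can ask where an entity is.
where = {o for s, p, o in smart_space
         if s == "user:alice" and p == "locatedIn"}
print(where)  # {'room:kitchen'}
```

The "Where?" triple is exactly the piece of information that the pedestrian tracking systems discussed in the work are meant to supply.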
Abstract:
This PhD thesis concerns geochemical constraints on recycling and partial melting of Archean continental crust. A natural example of such processes was found in the Iisalmi area of Central Finland. The rocks from this area are Middle to Late Archean in age and experienced metamorphism and partial melting between 2.7 and 2.63 Ga. The work is based on extensive field work and is furthermore founded on bulk rock geochemical data as well as in-situ analyses of minerals. All geochemical data were obtained at the Institute of Geosciences, University of Mainz, using X-ray fluorescence, solution ICP-MS and laser ablation ICP-MS for bulk rock geochemical analyses. Mineral analyses were accomplished by electron microprobe and laser ablation ICP-MS. Fluid inclusions were studied by microscope on a heating-freezing stage at the Geoscience Center, University of Göttingen. Part I focuses on the development of a new analytical method for bulk rock trace element determination by laser ablation ICP-MS, using homogeneous glasses fused from rock powder on an iridium strip heater. This method is applicable to mafic rock samples, whose melts have low viscosities and homogenize quickly at temperatures of ~1200°C. The highly viscous melts of felsic samples prevent melting and homogenization at comparable temperatures; fusion of felsic samples can be enabled by adding MgO to the rock powder and adjusting the melting temperature and duration to the rock composition. Advantages of the fusion method are low detection limits compared to XRF analyses, the avoidance of wet-chemical processing and of strong acids as used in solution ICP-MS, and smaller sample volumes compared to the other methods. Part II of the thesis uses bulk rock geochemical data and results from fluid inclusion studies to discriminate the melting processes observed in different rock types.
Fluid inclusion studies demonstrate a major change in fluid composition from CO2-dominated fluids in granulites to aqueous fluids in TTG gneisses and amphibolites. Partial melts were generated in the dry, CO2-rich environment by dehydration melting reactions of amphibole, which in addition to tonalitic melts produced the anhydrous mineral assemblages of granulites (grt + cpx + pl ± amph or opx + cpx + pl + amph). Trace element modeling showed that mafic granulites are residues of 10-30 % melt extraction from amphibolitic precursor rocks. The maximum degree of melting in intermediate granulites was ~10 %, as inferred from the modal abundances of amphibole, clinopyroxene and orthopyroxene. Carbonic inclusions are absent in upper-amphibolite facies migmatites, whereas aqueous inclusions with up to 20 wt% NaCl are abundant. This suggests that melting within TTG gneisses and amphibolites took place in the presence of an aqueous fluid phase that enabled melting at the wet solidus at temperatures of 700-750°C. The strong disruption of pre-metamorphic structures in some outcrops suggests that the maximum amount of melt in TTG gneisses was ~25 vol%. The presence of leucosomes in all rock types is taken as the principal evidence for melt formation. However, the mineralogical appearance as well as the major and trace element composition of many leucosomes imply that leucosomes seldom represent frozen in-situ melts; they are better considered as remnants of the melt channel network, i.e. pathways along which melts escaped from the system. Part III of the thesis describes how analyses of minerals from a specific rock type (granulite) can be used to determine partition coefficients between different minerals, and between minerals and melt, suitable for lower crustal conditions. The trace element analyses by laser ablation ICP-MS show coherent distributions among the principal mineral phases, independent of rock composition.
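The style of trace element modelling used to identify granulites as melt-depleted residues can be sketched with the standard batch melting equation. The numbers below (initial concentration and bulk partition coefficient) are illustrative placeholders, not the thesis's data:

```python
# Batch melting: liquid concentration C_l = C0 / (D + F*(1 - D)),
# residue concentration C_s = D * C_l, for melt fraction F and
# bulk solid/melt partition coefficient D.
def batch_melting_residue(c0, D, F):
    """Concentration left in the solid residue after extracting melt fraction F."""
    c_liquid = c0 / (D + F * (1 - D))
    return D * c_liquid

c0 = 10.0  # ppm of an incompatible trace element in the precursor amphibolite
D = 0.2    # illustrative bulk partition coefficient (incompatible element)
for F in (0.1, 0.2, 0.3):  # the 10-30 % melt extraction range inferred in the text
    print(f"F = {F:.1f}: residue = {batch_melting_residue(c0, D, F):.2f} ppm")
    # residue falls from 7.14 ppm (F=0.1) to 4.55 ppm (F=0.3)
```

Comparing such modelled residue concentrations with measured granulite compositions is what constrains the degree of melt extraction.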
REE contents in amphibole are about 3 times higher than those in clinopyroxene from the same sample. This systematic relationship has to be taken into consideration in models of lower crustal melting in which amphibole is replaced by clinopyroxene in the course of melting. A lack of equilibrium is observed between matrix clinopyroxene/amphibole and garnet porphyroblasts, which suggests late-stage growth of garnet and slow diffusion and equilibration of the REE during metamorphism. The data provide a first set of distribution coefficients for the transition metals (Sc, V, Cr, Ni) in the lower crust. In addition, analyses of ilmenite and apatite demonstrate the strong influence of accessory phases on trace element distribution: apatite contains high amounts of REE and Sr, while ilmenite incorporates about 20-30 times more Nb and Ta than amphibole. Furthermore, the trace element mineral analyses provide evidence for magmatic processes such as melt depletion, melt segregation, accumulation and fractionation, as well as metasomatism, having operated in this high-grade anatectic area.
Abstract:
Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed, and this trend is expected to continue. The development of information and protocol management systems that can sustain this challenge is therefore becoming an essential enabling factor for all actors in the field. Custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise for developing the laboratory infrastructure are not fully satisfactory, because they create undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables domain experts to express laboratory protocols using proper domain knowledge, free from the mediation of software implementation artefacts. In the system that we propose, this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution on a multi-agent execution platform specialized for biomedical laboratories.
Abstract:
In this study the population structure and connectivity of Mediterranean and Atlantic Raja clavata (L., 1758) were investigated by analyzing the genetic variation of six population samples (N = 144) at seven nuclear microsatellite loci. The genetic dataset was generated by selecting population samples available in the tissue databases of the GenoDREAM laboratory (University of Bologna) and of the Department of Life Sciences and Environment (University of Cagliari), all collected during past scientific surveys (MEDITS, GRUND) at different geographical locations in the Mediterranean basin and the North-East Atlantic: the North Sea, the Sardinian coasts, the Tuscan coasts and Cyprus. This thesis aims to estimate the genetic diversity and differentiation among the six geographical samples and, in particular, to assess the presence of any barrier (geographic, hydrogeological or biological) to gene flow, evaluating both genetic diversity (nucleotide diversity, observed and expected heterozygosity, Hardy-Weinberg equilibrium analysis) and population differentiation (Fst estimates, population structure analysis). In addition to the molecular analysis, quantitative representation and statistical analysis of individual morphological shape are performed using geometric morphometrics and statistical tests. Geometric coordinates, called landmarks, were fixed on 158 individuals belonging to two population samples of Raja clavata and on population samples of closely related species, Raja straeleni (a cryptic sibling species) and Raja asterias, to assess significant morphological differences at multiple taxonomic levels. The results obtained from the analysis of the microsatellite dataset suggest a geographic and genetic separation between populations from the Central-Western and Eastern Mediterranean basins.
Furthermore, the analysis showed no separation between the geographic samples from the North Atlantic Ocean and the Central-Western Mediterranean, grouping them into a panmictic population. The landmark-based geometric morphometrics results showed significant differences in body shape able to discriminate taxa at the tested levels (from species to populations).
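Two of the statistics named above can be illustrated with a toy calculation. The allele frequencies below are made up for the example and are not the actual microsatellite data; the formulas are the standard expected heterozygosity He = 1 - Σ p_i² and Wright's Fst = (Ht - Hs) / Ht.

```python
# Expected heterozygosity within each population (Hs), in the pooled total
# population (Ht), and the resulting Fst for one locus in two populations.
def expected_heterozygosity(freqs):
    return 1.0 - sum(p * p for p in freqs)

# Hypothetical allele frequencies at one microsatellite locus.
pop_a = [0.7, 0.2, 0.1]
pop_b = [0.2, 0.5, 0.3]

hs = (expected_heterozygosity(pop_a) + expected_heterozygosity(pop_b)) / 2
total = [(a + b) / 2 for a, b in zip(pop_a, pop_b)]
ht = expected_heterozygosity(total)
fst = (ht - hs) / ht
print(f"Hs = {hs:.3f}, Ht = {ht:.3f}, Fst = {fst:.3f}")
```

An Fst near zero indicates panmixia (as found between the Atlantic and Central-Western Mediterranean samples), while larger values indicate differentiation (as between the Western and Eastern basins).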
Abstract:
Coral reefs are the most biodiverse ecosystems of the ocean and provide notable ecosystem services. Nowadays they face a number of local anthropogenic threats, and environmental change is threatening their survival on a global scale. Large-scale monitoring is necessary to understand environmental changes and to implement useful conservation measures. Governmental agencies are often underfunded and unable to sustain the necessary large-scale spatial and temporal monitoring. To overcome these economic constraints, scientists can in some cases engage volunteers in environmental monitoring. Citizen Science enables the collection and analysis of scientific data at larger spatial and temporal scales than otherwise possible, addressing issues that would otherwise be logistically or financially unfeasible. “STE: Scuba Tourism for the Environment” was a volunteer-based Red Sea coral reef biodiversity monitoring program in which SCUBA divers and snorkelers collected data on 72 taxa by completing survey questionnaires after their dives. In my thesis, I evaluated the reliability of the data collected by volunteers, comparing their questionnaires with those completed by professional scientists. Validation trials showed a sufficient level of reliability, indicating that non-specialists performed similarly to conservation volunteer divers carrying out accurate transects. Using the data collected by volunteers, I developed a biodiversity index that revealed spatial trends across the surveyed areas. The project results provided important feedback to the local authorities on the current health status of Red Sea coral reefs and on the effectiveness of environmental management. I also analysed the spatial and temporal distribution of each surveyed taxon, identifying abundance trends related to anthropogenic impacts.
Finally, I evaluated the effectiveness of the project in increasing the environmental education of volunteers and showed that participation in the STE project significantly increased both knowledge of coral reef biology and ecology and awareness of the impacts of human behaviour on the environment.
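The abstract does not specify how the biodiversity index is constructed, so as a hedged stand-in the sketch below computes one standard option, the Shannon diversity index H', from invented per-taxon sighting counts of the kind a questionnaire survey might aggregate:

```python
# Shannon diversity index H' = -sum(p_i * ln p_i) over taxon proportions p_i.
# The taxa and counts are made up for illustration.
import math

def shannon_index(counts):
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

sightings = {"butterflyfish": 120, "grouper": 30, "parrotfish": 80, "turtle": 5}
h = shannon_index(sightings.values())
print(f"H' = {h:.2f}")
```

Computed per surveyed site, such an index yields exactly the kind of spatial trend map described in the text.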
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from areas such as scientific visualization and data segmentation. Three practical tools are presented: two software systems (Insight and IWAL) for different types of processing and interactive visualization of data, and an efficient algorithm for data segmentation implemented as part of Insight. Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining data from different sources at runtime, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, of which two examples are presented: the use of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations.
The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction in the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and results of the segmentation of upper-tropospheric jet streams and of cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
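The core idea behind feature segmentation can be shown in miniature: threshold a field and label its connected components as "features". This toy works on a tiny 2D grid, whereas the thesis's algorithm handles large 3D data and additionally tracks features and their genesis/lysis/merge/split events over time; the field values below are invented.

```python
# Threshold-based segmentation: flood-fill connected components of cells
# whose value meets the threshold, assigning each component a feature label.
from collections import deque

def label_features(field, threshold):
    rows, cols = len(field), len(field[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for i in range(rows):
        for j in range(cols):
            if field[i][j] >= threshold and labels[i][j] == 0:
                current += 1                      # start a new feature
                queue = deque([(i, j)])
                labels[i][j] = current
                while queue:                      # breadth-first flood fill
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and field[ny][nx] >= threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

# A toy wind-speed field (m/s) with two separate strong regions ("jets").
field = [
    [10, 42, 41,  9,  9],
    [ 9, 40,  8,  9, 50],
    [ 9,  9,  9, 48, 51],
]
labels, n_features = label_features(field, threshold=40)
print(n_features)  # 2
```

Matching labelled features between consecutive time steps is then what turns segmentation into tracking.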
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and function of a product on a virtual prototype. One use case is the verification of safety clearances between individual components, the so-called clearance analysis. For selected components, engineers determine whether, both at rest and during a motion, they maintain a given safety clearance to the surrounding components. If components fall below the safety clearance, their shape or position must be changed. For this, it is important to know exactly which regions of the components violate the safety clearance.

In this work we present a solution for the real-time computation of all regions of two geometric objects that fall below the safety clearance. Each object is given as a set of primitives (e.g. triangles). For every point in time at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety clearance and call it the set of tolerance-violating primitives. We present a holistic solution that can be divided into the following three major topics.

In the first part of this work we investigate algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that specialized tolerance tests are considerably more performant than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for the computation of all tolerance-violating primitives, our dual-space approach always proves to be the most performant.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to account for the required safety clearance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. Furthermore, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. Our benchmarks show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects consisting of many hundreds of thousands of primitives each.

In the third part we present a novel, memory-optimized data structure, which we call Shrubs, for managing the cell contents of the uniform grids used before. Previous approaches to the memory optimization of uniform grids mainly rely on hashing methods, which, however, do not reduce the memory consumption of the cell contents. In our use case, neighbouring cells often have similar contents. Our approach exploits these redundant cell contents to losslessly compress the cell contents of a uniform grid to one fifth of their original size, and to decompress them at runtime.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Besides pure clearance analysis, we show applications to various path-planning problems.
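One ingredient of such pipelines — classifying a pair of primitives against a safety clearance without always running an exact distance test — can be sketched with cheap bounding-sphere bounds. This is a generic illustration of the idea, not the dual-space tolerance test developed in the work; all geometry below is invented.

```python
# Bounding-sphere pre-test for a safety clearance d between two triangles:
# if even the farthest points are closer than d, the pair is certainly
# violating; if even the closest points are farther than d, it is certainly
# safe; only the remaining cases need an exact triangle-triangle test.
import math

def bounding_sphere(tri):
    cx = sum(p[0] for p in tri) / 3
    cy = sum(p[1] for p in tri) / 3
    cz = sum(p[2] for p in tri) / 3
    r = max(math.dist((cx, cy, cz), p) for p in tri)
    return (cx, cy, cz), r

def clearance_pretest(tri_a, tri_b, d):
    """Returns 'violating', 'safe', or 'exact test needed'."""
    (ca, ra), (cb, rb) = bounding_sphere(tri_a), bounding_sphere(tri_b)
    centre_dist = math.dist(ca, cb)
    if centre_dist + ra + rb < d:
        return "violating"
    if centre_dist - ra - rb > d:
        return "safe"
    return "exact test needed"

a = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
b = ((10, 0, 0), (11, 0, 0), (10, 1, 0))
print(clearance_pretest(a, b, d=2.0))  # prints: safe
```

Pruning most pairs with such conservative bounds is what makes real-time clearance analysis over hundreds of thousands of primitives feasible.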
Abstract:
The recent increase in the amount of nanoparticles incorporated into commercial products is accompanied by a rising concern about the fate of these nanoparticles. Once released into the environment, the nanoparticles will inevitably come into contact with the soil, exposing them to various routes of environmental contamination. One route explored in this research was the interaction between nanoparticles and clay minerals. To better define the interactions between clay minerals and positively charged nanoparticles, in situ atomic force microscopy (AFM) was utilized. In situ AFM experiments allowed interactions between clay minerals and positively charged nanoparticles to be observed in real time. The preliminary results demonstrated that in situ AFM is a reliable technique for studying these interactions and showed that the nanoparticles affected the swelling (height) of the clay quasi-crystals upon exposure. The preliminary AFM data were complemented by batch study experiments, which measured the absorbance of the nanoparticle filtrate after introduction to clay minerals, in an effort to better determine the mobility of the positively charged nanoparticles in an environment with a significant clay contribution. The results of the batch study indicated that the interactions between clay minerals and positively charged nanoparticles are size dependent and that the interactions of different-sized nanoparticles with the clay may occur to different degrees. The degree to which the different-sized nanoparticles interact with the clay was further probed using FTIR (Fourier transform infrared) spectroscopy. These experiments confirmed that the interactions between clay minerals and positively charged nanoparticles are size dependent, as indicated by a change in the FTIR spectra of the nanoparticles upon introduction to clay.
Abstract:
The project aimed to use results on the contamination of city vegetation with heavy metals and sulphur compounds as the basis for analysing the integral response of trees and shrubs to contamination, through a complex method of phytoindication. The results were used to draw up recommendations on pollution reduction in the city and to develop phytoindication as a means of monitoring environmental pollution in St. Petersburg and other large cities. Field investigations were carried out in August 1996, and 66 descriptions of green areas were made in order to estimate the functional state of plants in the Vasileostrovsky district. Investigations of the spectral reflecting properties of plants showed considerable variation in leaf albedo values under the influence of various internal and external factors. The results indicated that lime trees most closely reflect the condition of the environment. Practically all the green areas studied were in poor condition, the only exceptions being areas of ash trees, which are more resistant to environmental pollution, and one lime-tree alley in a comparatively unpolluted street. The study identified the types of trees that are more or less resistant to complex environmental pollution, and Ms. Terekhina recommends that the species in the present green areas be changed to include more of the resistant species. The turbidimetric analysis of tree bark for sulphates gave an indication of the level and spatial distribution of each pollutant, and the results also confirmed other findings that electric conductivity is a significant feature in determining the extent of sulphate pollution. In testing for various metals, the lime tree showed the highest contents of all elements except magnesium, copper, zinc, cadmium and strontium, again confirming the species' vulnerability to pollution.
Mean concentration rates in the city and environs showed that city plants concentrate 3 times as many different elements, and 10 times more chromium, copper and lead, than those in the suburbs. The second stage of the study was based on the concept of phytoindication, which presupposes that changes in the relation of chemical elements in regional biological circulation under the influence of technogenesis provide a criterion for predicting shifts in people's health. There are certain basic factors in this concept. The first is that all living beings are related ecologically as well as by their evolutionary origin, and that the lower an organism is on the evolutionary scale, the less adaptational reserve it has. The second is that smaller concentrations of chemical elements are needed for a toxicological influence on plants than on people, so the former's reactions to geochemical factors are easier to characterise. Visual indicational features of urban plants are well defined and can form the basis of a complex "environment - public health" analysis. Specific plant reactions reflecting atmospheric pollution and other components of urbogeosystems make it possible to determine indication criteria for predicting possible disturbances in the general state of health of the population. Thirdly, the results of phytoindication investigations must be taken together with information about public health in the area. It only proved possible to analyse general indexes of public health based on statistical data from the late 1980s and early 1990s, as the data of later years were greatly influenced by social factors. These data show that the rates of illness in St. Petersburg (especially for children) are higher than in Russia as a whole for most classes of diseases, indicating that the population there is more sensitive to the ecological state of the urban environment.
The Vasileostrovsky district had the second highest sickness rate for adults, while its rate of infant mortality in the first year of life was the highest. Ms. Terekhina recommends further studies to assess more precisely the effectiveness of the methods she tested, but has drawn up a proposed map of environmental hazard for the population, taking prevailing wind directions into account.