752 results for Engineering -- Data processing -- Study and teaching
Abstract:
Human activities represent a significant burden on the global water cycle, with large and increasing demands placed on limited water resources by manufacturing, energy production and domestic water use. In addition to changing the quantity of available water resources, human activities change water quality by introducing a large and often poorly characterized array of chemical pollutants, which may negatively impact biodiversity in aquatic ecosystems and impair valuable ecosystem functions and services. Domestic and industrial wastewaters are a significant source of pollution to the aquatic environment due to inadequate or incomplete removal of chemicals introduced into waters by human activities. Currently, incomplete chemical characterization of treated wastewaters limits comprehensive risk assessment of this ubiquitous impact on water resources. In particular, a significant fraction of the organic chemical composition of treated industrial and domestic wastewaters remains uncharacterized at the molecular level. Efforts aimed at reducing the impacts of water pollution on aquatic ecosystems critically require knowledge of the composition of wastewaters in order to develop interventions capable of protecting our precious natural water resources.
The goal of this dissertation was to develop a robust, extensible and high-throughput framework for the comprehensive characterization of organic micropollutants in wastewaters by high-resolution, accurate-mass mass spectrometry. High-resolution mass spectrometry is the most powerful analytical technique available for assessing the occurrence and fate of organic pollutants in the water cycle. However, significant limitations in data processing, analysis and interpretation have prevented this technique from achieving comprehensive characterization of the organic pollutants occurring in natural and built environments. My work aimed to address these challenges by developing automated workflows for the structural characterization of organic pollutants in wastewater and wastewater-impacted environments by high-resolution mass spectrometry, and by applying these methods in combination with novel data handling routines to conduct detailed fate studies of wastewater-derived organic micropollutants in the aquatic environment.
In Chapter 2, chemoinformatic tools were implemented along with novel non-targeted mass spectrometric analytical methods to characterize, map, and explore an environmentally relevant “chemical space” in municipal wastewater. This was accomplished by characterizing the molecular composition of known wastewater-derived organic pollutants and of substances prioritized as potential wastewater contaminants, and by using these databases to evaluate the pollutant-likeness of structures postulated for unknown organic compounds that I detected in wastewater extracts using high-resolution mass spectrometry. Results showed that applying multiple computational mass spectrometric tools to the structural elucidation of unknown organic pollutants in wastewaters improved the efficiency and veracity of screening approaches based on high-resolution mass spectrometry. Furthermore, structural similarity searching was essential for prioritizing substances sharing structural features with known organic pollutants or with industrial and consumer chemicals that could enter the environment through use or disposal.
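To make the similarity-based prioritization step concrete, the following is a minimal sketch of how pollutant-likeness could be scored against a reference set of known wastewater contaminants. RDKit, Morgan fingerprints, Tanimoto similarity and the example compounds and SMILES strings are assumptions for illustration; this is not the dissertation's actual screening pipeline.

```python
# Minimal sketch: ranking candidate structures by similarity to known
# wastewater pollutants. RDKit, the reference compounds and the SMILES
# strings are illustrative assumptions, not the dissertation's pipeline.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Hypothetical reference set of known wastewater-derived pollutants (SMILES)
known_pollutants = {
    "carbamazepine": "NC(=O)N1c2ccccc2C=Cc2ccccc21",
    "diclofenac": "OC(=O)Cc1ccccc1Nc1c(Cl)cccc1Cl",
}

def morgan_fp(smiles, radius=2, nbits=2048):
    """Morgan fingerprint for a SMILES string, or None if it cannot be parsed."""
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=nbits) if mol else None

reference_fps = [morgan_fp(smi) for smi in known_pollutants.values()]

def pollutant_likeness(candidate_smiles):
    """Maximum Tanimoto similarity of a candidate structure to the reference set."""
    fp = morgan_fp(candidate_smiles)
    if fp is None:
        return 0.0
    return max(DataStructs.TanimotoSimilarity(fp, ref) for ref in reference_fps)

# Rank structures postulated for an unknown feature by pollutant-likeness
candidates = ["OC(=O)Cc1ccccc1Nc1ccccc1", "CCCCCCCCCCCC"]
print(sorted(candidates, key=pollutant_likeness, reverse=True))
```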
I then applied this comprehensive methodological and computational non-targeted analysis workflow to micropollutant fate analysis in domestic wastewaters (Chapter 3), surface waters impacted by water reuse activities (Chapter 4) and effluents of wastewater treatment facilities receiving wastewater from oil and gas extraction activities (Chapter 5). In Chapter 3, I showed that chemometric tools aided the prioritization of non-targeted compounds arising at various stages of conventional wastewater treatment by partitioning high-dimensional data into rational chemical categories based on knowledge of organic chemical fate processes, resulting in the classification of organic micropollutants according to their occurrence and/or removal during treatment. Similarly, in Chapter 4, high-resolution sampling and broad-spectrum targeted and non-targeted chemical analysis were applied to assess the occurrence and fate of organic micropollutants in a water reuse application in which reclaimed wastewater was used to irrigate turf grass. Results showed that the organic micropollutant composition of surface waters receiving runoff from wastewater-irrigated areas was minimally impacted by wastewater-derived organic micropollutants. Finally, Chapter 5 presents a comprehensive characterization of the organic chemical composition of oil and gas wastewaters treated for surface water discharge. Concurrent analysis of effluent samples by complementary, broad-spectrum analytical techniques revealed low levels of hydrophobic organic contaminants but elevated concentrations of polymeric surfactants, which may affect the fate and analysis of contaminants of concern in oil and gas wastewaters.
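As an illustration of the occurrence/removal classification described for Chapter 3, here is a minimal sketch that assigns non-targeted features to simple fate categories from influent and effluent peak intensities. The intensity values, thresholds and pandas usage are assumptions for illustration rather than the chemometric workflow used in the dissertation.

```python
# Minimal sketch: classifying non-targeted features by their behaviour across
# treatment stages, using illustrative peak-intensity data (pandas assumed).
import pandas as pd

# Hypothetical feature intensities in influent and final effluent
features = pd.DataFrame(
    {"influent": [1.2e6, 8.0e5, 0.0, 3.0e5],
     "effluent": [1.0e4, 7.5e5, 4.0e5, 0.0]},
    index=["feature_1", "feature_2", "feature_3", "feature_4"],
)

def classify(row, detection_limit=1e4, removal_threshold=0.8):
    """Assign a simple fate category from influent and effluent intensities."""
    present_in = row["influent"] > detection_limit
    present_out = row["effluent"] > detection_limit
    if present_in and not present_out:
        return "removed"
    if not present_in and present_out:
        return "formed during treatment"
    if present_in and present_out:
        removal = 1.0 - row["effluent"] / row["influent"]
        return "substantially removed" if removal >= removal_threshold else "persistent"
    return "not detected"

features["category"] = features.apply(classify, axis=1)
print(features)
```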
Taken together, my work represents significant progress in the characterization of polar organic chemical pollutants associated with wastewater-impacted environments by high-resolution mass spectrometry. Application of these comprehensive methods to examine micropollutant fate processes in wastewater treatment systems, water reuse environments, and water applications in oil/gas exploration yielded new insights into the factors that influence transport, transformation, and persistence of organic micropollutants in these systems across an unprecedented breadth of chemical space.
Abstract:
Online Social Network (OSN) services provided by Internet companies bring people together to chat and to share and consume information. Meanwhile, these services (which can be regarded as social media) generate huge amounts of data every day, hour, minute, and second. Currently, researchers are interested in analyzing OSN data, extracting interesting patterns from it, and applying those patterns to real-world applications. However, the sheer scale of OSN data makes it difficult to analyze effectively. This dissertation focuses on applying data mining and information retrieval techniques to mine two key components of social media data: users and user-generated content. Specifically, it aims at addressing three problems related to social media users and content: (1) how does one organize the users and the content? (2) how does one summarize the textual content so that users do not have to read every post to capture the general idea? (3) how does one identify influential users in social media to benefit other applications, e.g., marketing campaigns? The contributions of this dissertation are briefly summarized as follows. (1) It provides a comprehensive and versatile data mining framework to analyze users and user-generated content from social media. (2) It designs a hierarchical co-clustering algorithm to organize the users and the content. (3) It proposes multi-document summarization methods to extract core information from social network content. (4) It introduces three important dimensions of social influence, and a dynamic influence model for identifying influential users.
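A minimal sketch of the co-clustering idea follows, using scikit-learn's SpectralCoclustering as a stand-in for the hierarchical co-clustering algorithm proposed in the dissertation; the user-by-term matrix is synthetic and purely illustrative.

```python
# Minimal sketch: co-clustering users and content terms from a user-by-term
# matrix. SpectralCoclustering is a stand-in for the dissertation's
# hierarchical co-clustering algorithm; the data are synthetic.
import numpy as np
from sklearn.cluster import SpectralCoclustering

rng = np.random.default_rng(0)
# Hypothetical user-by-term count matrix (rows: users, columns: terms)
user_term = rng.poisson(lam=1.0, size=(20, 30)).astype(float)
user_term[:10, :15] += 5   # planted block: first user/topic community
user_term[10:, 15:] += 5   # planted block: second community

model = SpectralCoclustering(n_clusters=2, random_state=0).fit(user_term)
print("user groups:", model.row_labels_)
print("term groups:", model.column_labels_)
```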
Abstract:
Global warming is expected to be most pronounced in the Arctic, where permafrost thaw and release of old carbon may provide an important feedback mechanism to the climate system. To better understand and predict climate effects and feedbacks on the cycling of elements within and between ecosystems in northern-latitude landscapes, a thorough understanding of the processes governing the transport and cycling of elements is required. A fundamental requirement for reaching better process understanding is access to high-quality empirical data on chemical concentrations and biotic properties for a wide range of ecosystem domains and functional units (abiotic and biotic pools). The aim of this study is therefore to make readily available one of the most extensive field data sets from a periglacial catchment, which can be used both to describe present-day periglacial processes and to improve predictions of the future. Here we present the sampling and analytical methods, the field and laboratory equipment, and the resulting biogeochemical data from a state-of-the-art whole-ecosystem investigation of the terrestrial and aquatic parts of a lake catchment in the Kangerlussuaq region, West Greenland. This data set allows for the calculation of whole-ecosystem mass balance budgets for a long list of elements, including carbon, nutrients and major and trace metals.
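For illustration, a whole-catchment element budget of the kind this data set supports reduces to balancing inputs, outputs and storage changes. The sketch below uses hypothetical annual fluxes and pool changes, not values from the data set.

```python
# Minimal sketch: a whole-catchment element mass balance. All flux and
# storage values are illustrative assumptions (kg of an element per year).
inputs_kg = {"wet_deposition": 12.0, "dry_deposition": 3.0, "weathering": 6.0}
outputs_kg = {"stream_export": 9.0, "lake_burial": 4.0}
storage_change_kg = {"soil_pool": 5.0, "vegetation_pool": 2.0}

budget_residual = (sum(inputs_kg.values())
                   - sum(outputs_kg.values())
                   - sum(storage_change_kg.values()))
# A residual near zero indicates the measured pools and fluxes close the budget.
print(f"budget residual: {budget_residual:.1f} kg per year")
```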
Abstract:
The presented thesis was written within the framework of a project called 'seepage water prognosis', funded by the Federal Ministry of Education and Research (BMBF). 41 German institutions, among them university research institutes, public authorities and engineering companies, were each financed for three years. The aim was to work out the scientific basis needed to carry out a seepage water prognosis (Oberacker and Eberle, 2002). According to the Federal German Soil Protection Act (Federal Bulletin, 1998), a seepage water prognosis is required in order to avoid future soil impacts from the application of recycling products. The participants focused either on the development of methods to determine the source strength of the materials investigated, defined as the total mass flow caused by natural leaching, or on models to predict contaminant transport through the underlying soil. Annual meetings of all participants as well as separate meetings of the two subprojects were held. The Department of Geosciences in Bremen participated with two subprojects. The aim of the subproject that resulted in this thesis was the development of easily applicable, valid, and generally accepted laboratory methods for the determination of the source strength. Within the scope of the second subproject, my colleague Veith Becker developed a computer model for the transport prognosis, with the source strength as the main input parameter.
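As a minimal, illustrative sketch of what a laboratory-derived source-strength estimate can look like, the following sums contaminant release over successive liquid-to-solid (L/S) fractions of a column percolation test. The concentrations, L/S steps and the simple summation are assumptions, not the project's prescribed procedure.

```python
# Minimal sketch: cumulative contaminant release from column percolation
# test fractions, as a stand-in for a source-strength estimate.
# Eluate fractions collected over successive liquid-to-solid (L/S) steps.
ls_steps = [0.3, 0.7, 1.0, 2.0, 6.0]        # L/S increment of each fraction, L/kg
concentrations = [4.0, 2.5, 1.2, 0.6, 0.2]  # concentration in each fraction, mg/L

# Cumulative release per kg of material = sum of concentration * L/S increment
cumulative_release_mg_per_kg = sum(
    c * dls for c, dls in zip(concentrations, ls_steps)
)
print(f"cumulative release: {cumulative_release_mg_per_kg:.2f} mg/kg "
      f"up to L/S = {sum(ls_steps):.1f} L/kg")
```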
Field data, numerical simulations and probability analyses to assess lava flow hazards at Mount Etna
Abstract:
Improving lava flow hazard assessment is one of the most important and challenging fields of volcanology, and has an immediate and practical impact on society. Here, we present a methodology for the quantitative assessment of lava flow hazards based on a combination of field data, numerical simulations and probability analyses. With the extensive data available on historic eruptions of Mt. Etna, going back over 2000 years, it has been possible to construct two hazard maps, one for flank and one for summit eruptions, allowing a quantitative analysis of the most likely future courses of lava flows. The effective use of hazard maps of Etna may help minimize the damage from volcanic eruptions through correct land use in densely urbanized areas with a population of almost one million people. Although this study was conducted on Mt. Etna, the approach used is designed to be applicable to other volcanic areas.
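A minimal sketch of how simulated flows and scenario probabilities can be combined into a per-cell hazard probability is given below; the grid, scenario weights and random "flows" are purely illustrative stand-ins for the field-constrained simulations described above.

```python
# Minimal sketch: combining many simulated lava flows into a per-cell hazard
# probability, weighted by the probability of each eruptive scenario.
# The grid, scenarios and weights are illustrative, not the Etna data set.
import numpy as np

ny, nx = 100, 100
hazard = np.zeros((ny, nx))

rng = np.random.default_rng(1)
# Hypothetical scenarios: (scenario probability, list of simulated flows).
# Each simulated flow is a boolean grid marking the cells it inundates.
scenarios = []
for weight in (0.6, 0.3, 0.1):
    flows = [rng.random((ny, nx)) < 0.02 for _ in range(50)]
    scenarios.append((weight, flows))

for weight, flows in scenarios:
    # Probability a cell is reached, given this scenario occurs
    coverage = np.mean(np.stack(flows), axis=0)
    hazard += weight * coverage

print("maximum per-cell inundation probability:", hazard.max())
```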
Abstract:
Requirements Engineering (RE) has received much attention in research and practice due to its importance to software project success. Its interdisciplinary nature, its dependency on the customer, and its inherent uncertainty still render the discipline difficult to investigate. This results in a lack of empirical data, which are necessary, however, to demonstrate which practically relevant RE problems exist and to what extent they matter. Motivated by this situation, we initiated the Naming the Pain in Requirements Engineering (NaPiRE) initiative, which constitutes a globally distributed, bi-yearly replicated family of surveys on the status quo and problems in practical RE.
In this article, we report on the analysis of data obtained from 228 companies in 10 countries. We apply Grounded Theory to the data obtained from NaPiRE and reveal which contemporary problems practitioners encounter. To this end, we analyse 21 problems derived from the literature with respect to their relevance and criticality in relation to their context, and we complement this picture with a cause-effect analysis showing the causes and effects surrounding the most critical problems.
Our results give us a better understanding of which problems exist and how they manifest themselves in practical environments. Thus, we provide a first step towards grounding contributions to RE in empirical observations, which, until now, have been dominated by conventional wisdom alone.
Abstract:
Electrospun nanofibers are a promising material for ligamentous tissue engineering; however, the weak mechanical properties of fibers produced to date have limited their clinical use. The goal of this work was to modify electrospun nanofibers to create a robust structure that mimics the complex hierarchy of native tendons and ligaments. The scaffolds fabricated in this study consisted of either random or aligned nanofibers in flat sheets or rolled nanofiber bundles that mimic the size scale of fascicle units in primarily tensile-load-bearing soft musculoskeletal tissues. Altering nanofiber orientation and geometry significantly affected mechanical properties; most notably, aligned nanofiber sheets had the greatest modulus, 125% higher than that of random nanofiber sheets and 45% higher than that of aligned nanofiber bundles. Modifying aligned nanofiber sheets to form aligned nanofiber bundles also resulted in approximately 107% higher yield stresses and 140% higher yield strains. The mechanical properties of aligned nanofiber bundles were in the range of those of the native ACL: modulus = 158 ± 32 MPa, yield stress = 57 ± 23 MPa and yield strain = 0.38 ± 0.08. Adipose-derived stem cells cultured on all surfaces remained viable and proliferated extensively over a 7-day culture period, and cells elongated on nanofiber bundles. The results of the study suggest that aligned nanofiber bundles may be useful for ligament and tendon tissue engineering based on their mechanical properties and ability to support cell adhesion, proliferation, and elongation.
Abstract:
This work evaluates the mechanical behavior of cementitious materials at different length scales. First, the mechanical properties of concrete produced with a bioplasticizer based on effective microorganisms (EM) are studied by statistical nanoindentation and compared with those of concrete produced with an ordinary superplasticizer (SP). The addition of the EM-based bioplasticizer is found to improve the strength of the C–S–H by increasing the cohesion and friction of the solid nanograins. Statistical analysis of the indentation results suggests that the EM-based bioplasticizer inhibits the precipitation of C–S–H with a larger solid volume fraction. Second, a multiscale micromechanics-based model is derived for the poroelastic behavior of cement paste at early age. The proposed approach yields the poroelastic properties required for modeling the partially saturated mechanical behavior of aging cement pastes. The model is shown to predict the percolation threshold and the undrained Young's modulus in agreement with experimental data. A stochastic metamodel based on polynomial chaos is constructed to propagate the uncertainty of the model parameters across several length scales. A sensitivity analysis is carried out by post-processing the metamodel for cement pastes with water-to-cement ratios between 0.35 and 0.70. The underlying uncertainty of the effective poroelastic properties is found to stem mainly from the activation energy of the calcium aluminates at early age and, later, from the elastic modulus of the low-density calcium silicate hydrates.
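To illustrate the uncertainty propagation and sensitivity analysis described above, here is a minimal sketch that fits a polynomial regression surrogate to a toy stand-in model and computes first-order variance-based (Sobol) sensitivity indices for two uncertain inputs. The toy model, the input distributions and the plain (non-orthogonal) polynomial basis are assumptions for brevity; this is not the thesis's multiscale poroelastic model or its polynomial chaos implementation.

```python
# Minimal sketch: polynomial regression surrogate plus first-order Sobol
# sensitivity indices. The model and input distributions are illustrative
# stand-ins, not the thesis's multiscale poroelastic homogenization model.
import numpy as np

rng = np.random.default_rng(0)

def model(wc, ea):
    """Toy stand-in: a 'stiffness-like' output from w/c ratio and activation energy."""
    return 30.0 * np.exp(-2.0 * (wc - 0.35)) - 0.0002 * ea

# Sample the uncertain inputs
n = 2000
wc = rng.uniform(0.35, 0.70, n)          # water-to-cement ratio
ea = rng.normal(40_000.0, 4_000.0, n)    # activation energy, J/mol
y = model(wc, ea)

# Degree-2 polynomial regression surrogate (non-intrusive, least squares)
def basis(wc_s, ea_s):
    return np.column_stack([np.ones_like(wc_s), wc_s, ea_s,
                            wc_s**2, wc_s * ea_s, ea_s**2])

coeffs, *_ = np.linalg.lstsq(basis(wc, ea), y, rcond=None)
surrogate = lambda wc_s, ea_s: basis(wc_s, ea_s) @ coeffs
total_var = np.var(surrogate(wc, ea))

# First-order Sobol index = variance of the conditional mean / total variance
def s1_wc(n_outer=200, n_inner=500):
    means = [surrogate(np.full(n_inner, x),
                       rng.normal(40_000.0, 4_000.0, n_inner)).mean()
             for x in rng.uniform(0.35, 0.70, n_outer)]
    return np.var(means) / total_var

def s1_ea(n_outer=200, n_inner=500):
    means = [surrogate(rng.uniform(0.35, 0.70, n_inner),
                       np.full(n_inner, x)).mean()
             for x in rng.normal(40_000.0, 4_000.0, n_outer)]
    return np.var(means) / total_var

print(f"S1(w/c) = {s1_wc():.2f}, S1(Ea) = {s1_ea():.2f}")
```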
Abstract:
Computer-based simulation games (CSG) are a form of innovation in learning and teaching. CSG are used more and more pervasively in various ways, such as class activities (formative exercises) and as part of summative assessments (Leemkuil and De Jong, 2012; Zantow et al., 2005). This study investigates the current and potential use of CSG in Worcester Business School's (WBS) Business Management undergraduate programmes. The initial survey of off-the-shelf simulations reveals that there are various categories of simulation, each offering varying levels of complexity and learning opportunities depending on the field of study. The findings suggest that whilst there is marginal adoption of CSG in learning and teaching, there is significant opportunity to increase the use of CSG to enhance learning and learner achievement, especially in Level 5 modules. The use of CSG is situational and its adoption should be undertaken on a case-by-case basis. WBS can play a major role by creating an environment that encourages and supports the use of CSG as well as other forms of innovative learning and teaching methods. Thus, the key recommendation involves providing module teams with further support in embedding and integrating CSG into their modules.
Abstract:
The Data Processing Department of ISHC has developed coding forms to be used for the data to be entered into the program. The Highway Planning and Programming and the Design Departments are responsible for coding and submitting the necessary data forms to Data Processing for the noise prediction on the highway sections.
Abstract:
Developing a theoretical framework for pervasive information environments is an enormous goal. This paper aims to provide a small step towards such a goal. The following pages report on our initial investigations to devise a framework that will continue to support locative, experiential and evaluative data from 'user feedback' in an increasingly pervasive information environment. We loosely attempt to outline this framework by developing a methodology capable of moving from rapid deployment of software and hardware technologies towards a goal of realistic immersive experience of pervasive information. We propose various technical solutions and address a range of problems, such as information capture through a novel model of sensing, processing, visualization and cognition.
Abstract:
A catastrophic red tide event occurred in the Strait of Hormuz, the Persian Gulf and the Gulf of Oman from late summer 2008 to spring 2009. With its devastating effects, the phenomenon shocked all the countries located on the margins of the Persian Gulf and the Gulf of Oman and caused considerable losses to the fishery industries, tourism and trade economy of the region. During a maritime cruise carried out by the Persian Gulf and Gulf of Oman Ecological Research Institute, field data including temperature, salinity, chlorophyll-a, dissolved oxygen and algal density were obtained for this research. Satellite information was received from the MODIS, MERIS and SeaWiFS sensors. Temperature and surface chlorophyll images were obtained and compared with the field data and with data from the PROBE model. The results of the present research indicated that with the occurrence of harmful algal blooms (HAB), chlorophyll-a and dissolved oxygen contents increased in the surface water. Maximum algal density was observed along the northern coasts of the Strait of Hormuz, with lower algal densities detected in deep water and in offshore surface water. Our results show that the algal bloom resulted from a drop in seawater temperature, water circulation and the adverse environmental pollution caused by industrial and urban sewage entering the coastal waters in this region of the Persian Gulf. The red tide started in the Strait of Hormuz and eventually covered about 140,000 km² of the Persian Gulf and the entire Strait of Hormuz, and it persisted for 10 months, a record among algal blooms observed worldwide. The temperature and chlorophyll satellite images were consistent with the values measured in the field, indicating that satellite measurements have acceptable precision and can be used in sea monitoring and modeling.
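A minimal sketch of the satellite-versus-field comparison is shown below, computing bias, RMSE and correlation for matched chlorophyll-a values; the numbers are illustrative, not the cruise measurements.

```python
# Minimal sketch: comparing satellite-retrieved and field-measured
# chlorophyll-a at matched stations. Values are illustrative assumptions.
import numpy as np

field_chl = np.array([1.8, 3.2, 6.5, 12.0, 25.0])      # mg/m^3, in situ
satellite_chl = np.array([2.1, 2.9, 7.2, 10.5, 27.5])  # mg/m^3, matched pixels

bias = np.mean(satellite_chl - field_chl)
rmse = np.sqrt(np.mean((satellite_chl - field_chl) ** 2))
r = np.corrcoef(field_chl, satellite_chl)[0, 1]
print(f"bias = {bias:.2f} mg/m^3, RMSE = {rmse:.2f} mg/m^3, r = {r:.2f}")
```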
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis and visualization, and the resource management of such services, are becoming increasingly important for delivering the Quality of Service (QoS) that users expect. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user’s data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running upon the TerraFly map that can efficiently support many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share the analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques that predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly can predict workload demands 18.91% more accurately and can allocate resources efficiently to meet the QoS target, improving QoS by 26.19% and saving resource usage by 20.83% compared to traditional peak-load-based resource allocation.
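For intuition, a minimal sketch of a top-k spatial Boolean query follows: filter objects by required keywords, then rank the survivors by distance to the query location. The data, names and brute-force scan are illustrative; this is not the sksOpen index or its API.

```python
# Minimal sketch of a top-k spatial Boolean query: keyword filter followed by
# distance ranking. A brute-force stand-in, not the sksOpen indexing engine.
import heapq
import math

# Hypothetical points of interest: (name, lon, lat, keywords)
objects = [
    ("cafe_a", -80.19, 25.77, {"coffee", "wifi"}),
    ("cafe_b", -80.21, 25.79, {"coffee"}),
    ("museum", -80.18, 25.78, {"art", "wifi"}),
]

def topk_spatial_boolean(query_lon, query_lat, required_keywords, k=2):
    """Return the k nearest objects containing all required keywords."""
    def distance(obj):
        _, lon, lat, _ = obj
        return math.hypot(lon - query_lon, lat - query_lat)
    matches = [o for o in objects if required_keywords <= o[3]]
    return heapq.nsmallest(k, matches, key=distance)

print(topk_spatial_boolean(-80.20, 25.78, {"coffee"}, k=2))
```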
Abstract:
The purpose of this paper is to present the results of two online forums carried out with the participation of 42 students of the Licenciaturas in Preschool Education, Primary Education and Secondary Education of the University of Costa Rica. The main purpose of the forums was to determine the insights of the participating students about the competencies they have achieved in the field of education research, and which tools have been essential for them to systematize their own teaching practices. The discussion forums were part of the course FD5091 Métodos de Investigación Educativa [Education Research Methods] of the School of Teacher Education, delivered from March to April 2010. Of the sample, 60 percent were students of the Preschool teaching program, 35 percent were from the Primary Education teaching program and 5 percent were from the Secondary Education teaching program in the fields of Science, Mathematics and Social Studies. According to the insights and beliefs expressed by the participants, both future teachers and practicing professionals, there are no opportunities in their current work situation for research or for the systematization of their own teaching mediation. (1) Translator’s Note: In Costa Rica, the “Licenciatura” is a one-year post-Bachelor study program, usually including a thesis. “Primary Education” refers to students from the 1st to 6th grades, and “Secondary Education” refers to students from the 7th to 11th grades.