942 results for Hydroinformatics and Data Innovative Aspects on Teaching
Abstract:
One hundred and thirty-four subjects participated in this survey. Quantitative data were obtained, and correlational analyses were used to test a model of the relationships among the achievement of work values, organizational commitment, and job satisfaction, and to identify the moderating effects of meaningfulness of work and responsibility for work on these relationships. Part-time faculty in the Faculty of Continuing Education of a community college were mailed a questionnaire covering all the variables of the model. Several reliable, valid instruments were used to measure the variables. Data analysis through Pearson correlation and stepwise multiple regression analyses revealed that the achievement of the work values of recognition and satisfaction with promotions did predict organizational commitment and job satisfaction, although the moderating effects of meaningfulness of work and responsibility for work were not supported in this study. This study suggests that the revised model may be used for determining the relationships between the achievement of work values and organizational commitment and job satisfaction in a community college setting.
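The analysis described here pairs zero-order Pearson correlations with a regression-based test of moderation. As a hedged illustration only (the data and variable names below are hypothetical, not the study's), a moderating effect can be tested by adding an interaction term to a regression model:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import pearsonr

# Hypothetical survey data; the names mirror constructs in the abstract.
rng = np.random.default_rng(0)
n = 134
recognition = rng.normal(size=n)
meaningfulness = rng.normal(size=n)
commitment = 0.5 * recognition + rng.normal(size=n)
df = pd.DataFrame({"recognition": recognition,
                   "meaningfulness": meaningfulness,
                   "commitment": commitment})

print(pearsonr(df["recognition"], df["commitment"]))  # zero-order correlation

# Moderation test: a significant recognition:meaningfulness interaction would
# indicate that meaningfulness of work moderates the relationship.
model = smf.ols("commitment ~ recognition * meaningfulness", data=df).fit()
print(model.params)
```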
Abstract:
The effect that plants (Typha latifolia) as well as root-bed medium physical and chemical characteristics have on the treatment of primary treated domestic wastewater within a vertical flow constructed wetland system was investigated. Five sets of cells, with two cells in each set, were used. Each cell was made of concrete, measured 1.0 m × 1.0 m, and was 1.3 m deep. Four different root-bed media were tested: Queenston Shale, Fonthill Sand, Niagara Shale and a Michigan Sand. Four of the sets contained plants and a single type of root-bed medium. The influence of plants was tested by operating a Queenston Shale set without plants. Due to budget constraints, no replicates were constructed. All of the sets were operated independently and identically for twenty-eight months. Twelve months of data are presented here, collected after 16 months of continuous operation. Root-bed medium type did not influence BOD5 removal. All of the sets consistently met Ontario Ministry of Environment (MOE) requirements (< 25 mg/L) for BOD5 throughout the year. The 12-month average BOD5 concentration from all sets with plants was below 2.36 mg/L. All of the sets were within MOE discharge requirements (< 25 mg/L) for suspended solids, with set effluent concentrations ranging from 1.53 to 14.80 mg/L. The Queenston Shale and Fonthill Sand media removed the most suspended solids, while the Niagara Shale set produced suspended solids. The set containing Fonthill Sand was the only one to meet MOE discharge requirements (< 1 mg/L) for total phosphorus year-round, with a twelve-month mean effluent concentration of 0.23 mg/L. Year-round, all of the root-bed media were well below MOE discharge requirements (< 20 mg/L in winter and < 10 mg/L in summer) for ammonium. The Queenston Shale and Fonthill Sand sets removed the most total nitrogen. Plants had no effect on total nitrogen removal, but did influence how nitrogen was cycled within the system. Plants increased the removal of suspended solids by 14%, BOD5 by 10% and total phosphorus by 22%. Plants also increased the amount of dissolved oxygen that entered the system. During the plant growing season, removal of total phosphorus was better in all sets with plants regardless of media type. The sets containing Queenston Shale and Fonthill Sand media achieved the best results, and plants in the Queenston Shale set increased treatment efficiency for every parameter except nitrogen. Vertical flow wetland sewage treatment systems can be designed and built to consistently meet MOE discharge requirements year-round for BOD5, suspended solids, total phosphorus and ammonium. This system is generally superior to free water systems and sub-surface horizontal flow systems in cold climate situations.
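Treatment performance in studies like this is reported as percent removal relative to the influent concentration, checked against the MOE limits quoted above. A minimal sketch of that arithmetic (the influent and effluent values below are hypothetical; only the limits come from the abstract):

```python
# MOE discharge limits quoted in the abstract (mg/L); ammonium is seasonal.
MOE_LIMITS = {"BOD5": 25.0, "TSS": 25.0, "TP": 1.0,
              "NH4_winter": 20.0, "NH4_summer": 10.0}

def percent_removal(influent_mg_l: float, effluent_mg_l: float) -> float:
    """Removal efficiency as a percentage of the influent concentration."""
    return 100.0 * (influent_mg_l - effluent_mg_l) / influent_mg_l

# Hypothetical set receiving 120 mg/L BOD5 and discharging 2.0 mg/L.
effluent = 2.0
print(percent_removal(120.0, effluent))   # ~98.3 % removal
print(effluent < MOE_LIMITS["BOD5"])      # True: meets the BOD5 limit
```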
Abstract:
Complex social-cognitive deficits are common in individuals diagnosed with high functioning autism and Asperger syndrome. Research on effective and evidence-based social interventions is needed for this population. This study focused specifically on the challenges these individuals face with respect to flexible thinking and related flexible behaviour in social situations. Madrigal and Winner's (2008) Superflex curriculum targets social flexibility; however, at the time of this study, no published research had been conducted to determine the effectiveness of this approach. This pilot study sought to examine the impact of the Superflex curriculum, delivered as a 10-week training program, in teaching one individual with high functioning autism how to think and behave flexibly in social situations. Multiple measurement tools were utilized, and analyses within and across the measures revealed inconsistencies, especially with respect to generalization. Although preliminary, this study provided valuable information for subsequent research.
Abstract:
The Global Positioning System (GPS), with its high integrity, continuous availability and reliability, revolutionized navigation based on radio ranging. With four or more GPS satellites in view, a GPS receiver can find its location anywhere over the globe with an accuracy of a few meters. Higher accuracy, within centimeters or even millimeters, is achievable by correcting the GPS signal with an external augmentation system. The use of satellites for critical applications such as navigation has become a reality through the development of these augmentation systems (WAAS, SDCM, EGNOS, etc.), whose primary objective is to provide the essential integrity information needed for navigation service in their respective regions. Apart from these, many countries have initiated development of space-based regional augmentation systems, such as GAGAN and IRNSS of India, MSAS and QZSS of Japan, and COMPASS of China. In future, these regional systems will operate simultaneously and emerge as a Global Navigation Satellite System, or GNSS, supporting a broad range of activities in the global navigation sector. Among the different error sources in GPS precise positioning, the propagation delay due to atmospheric refraction is a limiting factor on the achievable accuracy. Although WADGPS, aimed at accurate positioning over a large area, broadcasts corrections for the different errors involved in GPS ranging, including ionospheric and tropospheric errors, the large temporal and spatial variations in atmospheric parameters, especially in the lower atmosphere (troposphere), mean that the broadcast tropospheric corrections are not sufficiently accurate. This necessitated the estimation of the tropospheric error based on realistic values of tropospheric refractivity. Presently available methodologies for estimating the tropospheric delay are mostly based on atmospheric data and GPS measurements from mid-latitude regions, where the atmospheric conditions differ significantly from those over the tropics; no such attempts had been made over the tropics. In practice, when measured atmospheric parameters are not available, analytical models developed using mid-latitude data are the only recourse. The major drawback of these existing models is that they neglect the seasonal variation of atmospheric parameters at stations near the equator, and in the tropics they underestimate the delay on quite a few occasions. In this context, the present study is a first and major step towards the development of models for tropospheric delay over the Indian region, a prime requisite for future space-based navigation programs (GAGAN and IRNSS). Apart from the models based on measured surface parameters, a region-specific model that requires no measured atmospheric parameters as input, depending only on latitude and day of the year, was developed for the tropical region with emphasis on the Indian sector. The large variability of atmospheric water vapor content on short spatial and/or temporal scales makes its measurement rather involved and expensive. A local network of GPS receivers is an effective tool for water vapor remote sensing over land, and this recently developed technique proves effective for measuring precipitable water (PW). The potential of using GPS to estimate atmospheric water vapor under all weather conditions and with high temporal resolution is explored here, which will be useful for retrieving columnar water vapor from ground-based GPS data.
A good network of GPS receivers could be a major source of water vapor information for Numerical Weather Prediction models and could act as a surrogate for the data gap in microwave remote sensing of water vapor over land.
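For context, the classical surface-parameter models that the abstract critiques take roughly this form. A minimal sketch of the Saastamoinen zenith hydrostatic delay, paired with a hypothetical day-of-year surrogate for surface pressure in the spirit of the region-specific model described (the surrogate's constants are illustrative assumptions, not the study's coefficients):

```python
import numpy as np

def saastamoinen_zhd(pressure_hpa, lat_deg, height_m):
    """Zenith hydrostatic delay (m) from surface pressure (Saastamoinen model)."""
    f = 1.0 - 0.00266 * np.cos(2.0 * np.radians(lat_deg)) - 2.8e-7 * height_m
    return 0.0022768 * pressure_hpa / f

def surrogate_pressure(doy, p_mean=1010.0, p_amp=4.0, doy_peak=28):
    """Hypothetical seasonal surrogate for surface pressure, illustrating a
    model driven only by day of year (constants are illustrative)."""
    return p_mean + p_amp * np.cos(2.0 * np.pi * (doy - doy_peak) / 365.25)

# Delay at an 8 deg N coastal station on day 150, with no measured pressure.
print(saastamoinen_zhd(surrogate_pressure(150), lat_deg=8.0, height_m=10.0))
```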
Abstract:
The growth potential of the service sector, especially the aviation sector, in the Indian economy is splendid. It is therefore crucial for airline service providers to understand their customers, design offers and deliver the desired value to them. This study reveals the effect of airline passenger satisfaction, particularly on re-buy intentions, derived from the attribute-level performance dimensions of both the service aspects and the loyalty programme of an airline. The mediating effect of satisfaction and other selected antecedents on the re-buy intention of a passenger is hypothesized in this study. Critical areas affecting buying intentions, such as core service quality and loyalty attribute-level performances, the effect of the frequent flyer programme (FFP) and service quality satisfaction, passenger trust in the airline, brand image, and the moderating effects of perceived value, FFP status and travel frequency of airline passengers, are linked in a structural model to assess the strength of each facet in affecting re-buy intentions. Implications for airlines were drawn from the finding that re-buy intentions cannot be attributed solely to the impact of the frequent flyer programme; rather, they are affected through the mediating effect of airline service quality satisfaction, which is especially valid for frequent travelers in the higher FFP status categories. The moderating effects of perceived value, FFP status and flying experience were also found to be significant in shaping re-buy intentions.
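The mediation hypothesis here (loyalty-programme performance -> satisfaction -> re-buy intention) is commonly tested, outside a full structural equation model, by bootstrapping the indirect effect. A minimal sketch under that simplification; the data and variable roles are hypothetical, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
# Hypothetical respondents: x = FFP attribute performance,
# m = service quality satisfaction, y = re-buy intention.
x = rng.normal(size=n)
m = 0.6 * x + rng.normal(size=n)
y = 0.5 * m + 0.1 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b indirect effect: a from x->m, b from m->y controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones(len(x)), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

boots = []
for _ in range(2000):
    i = rng.integers(0, n, n)  # resample respondents with replacement
    boots.append(indirect_effect(x[i], m[i], y[i]))
ci = np.percentile(boots, [2.5, 97.5])
print(ci)  # mediation supported if the interval excludes zero
```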
Abstract:
In this paper, we discuss Conceptual Knowledge Discovery in Databases (CKDD) in its connection with Data Analysis. Our approach is based on Formal Concept Analysis, a mathematical theory which has been developed and proven useful during the last 20 years. Formal Concept Analysis has led to a theory of conceptual information systems which has been applied, using the management system TOSCANA, in a wide range of domains. In this paper, we use such an application in database marketing to demonstrate how methods and procedures of CKDD can be applied in Data Analysis. In particular, we show the interplay and integration of data mining and data analysis techniques based on Formal Concept Analysis. The main concern of this paper is to explain how the transition from data to knowledge can be supported by a TOSCANA system. To clarify the transition steps, we discuss their correspondence to the five levels of knowledge representation established by R. Brachman and to the steps of empirically grounded theory building proposed by A. Strauss and J. Corbin.
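Formal Concept Analysis derives concepts from a binary object-attribute context: each concept pairs an extent (a set of objects) with an intent (the attributes they share), and the intents are exactly the intersection-closure of the object rows. A minimal sketch of this derivation on a toy context (the data are hypothetical; TOSCANA itself is not reproduced here):

```python
def formal_concepts(context):
    """All (extent, intent) pairs of a binary context {object: set(attributes)}."""
    objects = list(context)
    all_attrs = frozenset().union(*context.values())
    # Intents are closed under intersection; seed with rows and the top intent.
    intents = {all_attrs} | {frozenset(a) for a in context.values()}
    while True:
        new = {a & b for a in intents for b in intents} - intents
        if not new:
            break
        intents |= new
    return [({g for g in objects if i <= context[g]}, i) for i in intents]

toy = {"sparrow": {"flies", "feathers"},
       "penguin": {"feathers", "swims"},
       "trout":   {"swims"}}
for extent, intent in formal_concepts(toy):
    print(sorted(extent), sorted(intent))
```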
Abstract:
Formal Concept Analysis is an unsupervised learning technique for conceptual clustering. We introduce the notion of iceberg concept lattices and show their use in Knowledge Discovery in Databases (KDD). Iceberg lattices are designed for analyzing very large databases. In particular, they serve as a condensed representation of frequent patterns as known from association rule mining. In order to show the interplay between Formal Concept Analysis and association rule mining, we discuss the algorithm TITANIC. We show that iceberg concept lattices are a starting point for computing condensed sets of association rules without loss of information, and provide a visualization method for the resulting rules.
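An iceberg concept lattice keeps only the concepts whose extent meets a minimum support threshold, which is what makes it a condensed representation of the frequent closed itemsets. A minimal sketch of that filtering step on a toy context (this is the naive definition; TITANIC's level-wise optimizations are not reproduced):

```python
def iceberg_intents(context, minsupp):
    """Frequent closed itemsets of a binary context: intents whose extent
    covers at least a fraction minsupp of all objects."""
    rows = [frozenset(a) for a in context.values()]
    intents = set(rows)
    while True:  # intents are the intersection-closure of the rows
        new = {a & b for a in intents for b in intents} - intents
        if not new:
            break
        intents |= new
    n = len(rows)
    result = {}
    for i in intents:
        support = sum(i <= r for r in rows) / n
        if support >= minsupp:
            result[i] = support
    return result

toy = {1: {"a", "b", "c"}, 2: {"a", "b"}, 3: {"a", "c"}, 4: {"a"}}
print(iceberg_intents(toy, minsupp=0.5))  # keeps only the frequent intents
```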
Abstract:
Lasers play an important role in medical, sensing and data storage devices. This thesis focuses on the design, technology development, fabrication and characterization of hybrid ultraviolet Vertical-Cavity Surface-Emitting Lasers (UV VCSELs) with an organic laser-active material and inorganic distributed Bragg reflectors (DBRs). Multilayer structures with different layer thicknesses, refractive indices and absorption coefficients of the inorganic materials were studied using theoretical model calculations. During the simulations, structure parameters such as materials and thicknesses were varied, and this procedure was repeated several times during the design optimization process, incorporating feedback from technology and characterization. Two types of VCSEL devices were investigated. The first is an index-coupled structure consisting of bottom and top DBR dielectric mirrors. Between them lies the cavity, which includes the active region and defines the spectral gain profile. In this configuration the maximum electric field is concentrated in the cavity and can destroy the chemical structure of the active material. The second type of laser is a so-called complex-coupled VCSEL, in which the active material is placed not only in the cavity but also in parts of the DBR structure. The simulations show that such a distribution of the active material reduces the pumping power required to reach the lasing threshold. High efficiency is achieved by substituting the active material for the high-refractive-index dielectric in the periods closer to the cavity. The inorganic materials for the DBR mirrors were deposited by Plasma-Enhanced Chemical Vapor Deposition (PECVD) and Dual Ion Beam Sputtering (DIBS) machines, and the technological processes were extensively optimized. All processes are carried out in Class 1 and Class 10000 clean rooms. The optical properties and thicknesses of the layers are measured in-situ by spectroscopic ellipsometry and spectroscopic reflectometry; surface roughness is analyzed by atomic force microscopy (AFM), and images of the devices are taken with a scanning electron microscope (SEM). The silicon dioxide (SiO2) and silicon nitride (Si3N4) layers deposited by the PECVD machine show defects of the material structure and higher absorption in the ultraviolet range compared to ion beam deposition (IBD), resulting in low reflectivity of the DBR mirrors and degraded optical properties of the VCSEL devices. However, PECVD has the advantage that the stress in the layers can be tuned and compensated, in contrast to IBD at present. A Roth & Rau Ionsys 1000 sputtering machine is used for the deposition of silicon dioxide (SiO2), silicon nitride (Si3N4), aluminum oxide (Al2O3) and zirconium dioxide (ZrO2); its chamber is equipped with main (sputter) and assist ion sources. The dielectric materials were optimized by introducing additional oxygen and nitrogen into the chamber. DBR mirrors with different material combinations were deposited, and the measured optical properties of the fabricated multilayer structures show excellent agreement with the theoretical model calculations. The layers deposited by sputtering show high compressive stress. As the active region, a novel organic material with spiro-linked molecules is used.
Two different materials were evaporated using a dye evaporation machine in the clean room of the department Makromolekulare Chemie und Molekulare Materialien (mmCmm). The Spiro-Octopus-1 organic material has its emission maximum at λ = 395 nm, and the Spiro-Phenal at λ = 418 nm. Both have a high refractive index and can be combined with low-refractive-index materials like silicon dioxide (SiO2). The sputtering method yields excellent optical quality of the deposited materials and high reflection of the multilayer structures. The bottom DBR mirrors of all VCSEL devices were deposited by the DIBS machine, whereas the top DBR mirrors were deposited either by PECVD or by a combination of PECVD and DIBS. The fabricated VCSEL structures were optically pumped by a nitrogen laser at λ = 337 nm, and the emission was measured with a spectrometer. Emission from the VCSEL structures at wavelengths of 392 nm and 420 nm was observed.
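The theoretical model calculations for such DBR stacks are typically done with the transfer-matrix method. A minimal normal-incidence sketch, assuming a quarter-wave SiO2/ZrO2 stack tuned to the 395 nm emission; the refractive indices are rough illustrative values, not the thesis' measured data:

```python
import numpy as np

def dbr_reflectance(wavelength_nm, layers, n_in=1.0, n_sub=1.46):
    """Normal-incidence reflectance of a dielectric stack (transfer-matrix method).

    layers: (refractive_index, thickness_nm) pairs, listed from the incidence side.
    """
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2.0 * np.pi * n * d / wavelength_nm   # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Ten-period quarter-wave stack tuned to the 395 nm emission line;
# n(SiO2) ~ 1.48 and n(ZrO2) ~ 2.10 are rough illustrative values.
lam0, n_lo, n_hi = 395.0, 1.48, 2.10
stack = [(n, lam0 / (4.0 * n)) for _ in range(10) for n in (n_hi, n_lo)]
print(dbr_reflectance(lam0, stack))  # close to 1 at the design wavelength
```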
Abstract:
The furious pace of Moore's Law is driving computer architecture into a realm where the speed of light is the dominant factor in system latencies. The number of clock cycles to span a chip is increasing, while the number of bits that can be accessed within a clock cycle is decreasing. Hence, it is becoming more difficult to hide latency. One alternative is to reduce latency by migrating threads and data, but the overhead of existing implementations has made migration an unserviceable solution so far. I present an architecture, implementation, and mechanisms that reduce the overhead of migration to the point where migration is a viable supplement to other latency hiding mechanisms, such as multithreading. The architecture is abstract, and presents programmers with a simple, uniform, fine-grained multithreaded parallel programming model with implicit memory management. In other words, the spatial nature and implementation details (such as the number of processors) of a parallel machine are entirely hidden from the programmer. Compiler writers are encouraged to devise programming languages for the machine that guide a programmer to express their ideas in terms of objects, since objects exhibit an inherent physical locality of data and code. The machine implementation can then leverage this locality to automatically distribute data and threads across the physical machine by using a set of high performance migration mechanisms. An implementation of this architecture could migrate a null thread in 66 cycles -- over a factor of 1000 improvement over previous work. Performance also scales well; the time required to move a typical thread is only 4 to 5 times that of a null thread. Data migration performance is similar, and scales linearly with data block size. Since the performance of the migration mechanism is on par with that of an L2 cache, the implementation simulated in my work has no data caches and relies instead on multithreading and the migration mechanism to hide and reduce access latencies.
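The reported numbers imply a simple first-order cost model for migration. A hedged sketch (the 66-cycle base and the 4-5x thread factor come from the abstract; only linearity in block size is stated there, so the per-word slope below is a purely hypothetical placeholder):

```python
NULL_THREAD_CYCLES = 66      # measured null-thread migration cost (abstract)
TYPICAL_THREAD_FACTOR = 4.5  # typical thread is ~4-5x a null thread (abstract)
CYCLES_PER_DATA_WORD = 2.0   # hypothetical slope; only linearity is stated

def thread_migration_cycles(typical: bool = True) -> float:
    """Estimated cycles to migrate a thread between nodes."""
    return NULL_THREAD_CYCLES * (TYPICAL_THREAD_FACTOR if typical else 1.0)

def data_migration_cycles(block_words: int) -> float:
    """Linear in block size, with a startup cost comparable to a null thread."""
    return NULL_THREAD_CYCLES + CYCLES_PER_DATA_WORD * block_words

print(thread_migration_cycles(), data_migration_cycles(128))
```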
Abstract:
Speaker: Dr Kieron O'Hara. Time: 04/02/2015, 11:00-11:45. Location: B32/3077. In order to reap the potential societal benefits of big and broad data, it is essential to share and link personal data. However, privacy and data protection considerations mean that, to be shared, personal data must be anonymised, so that the data subject cannot be identified from the data. Anonymisation is therefore a vital tool for data sharing, but deanonymisation, or reidentification, is always possible given sufficient auxiliary information (and as the amount of data grows, both in terms of creation and in terms of availability in the public domain, the probability of finding such auxiliary information grows). This creates issues for the management of anonymisation, which are exacerbated not only by uncertainties about the future, but also by misunderstandings about the process(es) of anonymisation. This talk discusses these issues in relation to privacy, risk management and security, reports on recent theoretical tools created by the UKAN network of statistics professionals (on which the author is one of the leads), and asks how long anonymisation can remain a useful tool, and what might replace it.
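One standard formal handle on reidentification risk, in the spirit of the issues discussed, is k-anonymity: the size of the smallest group of records sharing the same quasi-identifier values bounds how precisely an attacker with auxiliary information can single someone out. A minimal sketch on toy records (a generic illustration, not one of the UKAN tools):

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns.

    A dataset is k-anonymous if every combination of quasi-identifier
    values is shared by at least k records.
    """
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

rows = [{"age_band": "30-39", "postcode": "SO17", "diagnosis": "flu"},
        {"age_band": "30-39", "postcode": "SO17", "diagnosis": "asthma"},
        {"age_band": "40-49", "postcode": "SO16", "diagnosis": "flu"}]
print(k_anonymity(rows, ["age_band", "postcode"]))  # 1: the third row is unique
```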
Abstract:
The object of analysis in the present text is the issue of operational control and data retention in Poland. The analysis of this issue follows from a critical stance taken by NGOs and state institutions on the scope of operational control wielded by the Polish police and special services; it concerns, in particular, the employment of "itemized phone bills" and so-called "phone tapping." Besides the quantitative analysis of operational control and the scope of data retention, the text features the conclusions the Human Rights Defender referred to the Constitutional Tribunal in 2011. It must be noted that the main problems with the employment of operational control and data retention are caused by: (1) a lack of specification of the technical means which can be used by individual services; (2) a lack of specification of what kind of information and evidence is in question; (3) an open catalogue of information and evidence which can be clandestinely acquired in an operational mode. Furthermore, with regard to the access to teleinformation data granted by the Telecommunications Act, attention should be drawn to the wide array of data submitted to particular services. The text also draws on so-called open interviews, conducted mainly with former police officers, with a view to pointing to some non-formal reasons for "phone tapping" in Poland, and closes with a summary.
Abstract:
We describe a new methodology for comparing satellite radiation budget data with a numerical weather prediction (NWP) model. This is applied to data from the Geostationary Earth Radiation Budget (GERB) instrument on Meteosat-8. The methodology brings together, in near-real time, GERB broadband shortwave and longwave fluxes with simulations based on analyses produced by the Met Office global NWP model. Results for the period May 2003 to February 2005 illustrate the progressive improvements in the data products as various initial problems were resolved. In most areas the comparisons reveal systematic errors in the model's representation of surface properties and clouds, which are discussed elsewhere. However, for clear-sky regions over the oceans the model simulations are believed to be sufficiently accurate to allow the quality of the GERB fluxes themselves to be assessed and any changes in time of the performance of the instrument to be identified. Using model and radiosonde profiles of temperature and humidity as input to a single-column version of the model's radiation code, we conduct sensitivity experiments which provide estimates of the expected model errors over the ocean of about ±5–10 W m⁻² in clear-sky outgoing longwave radiation (OLR) and ±0.01 in clear-sky albedo. For the more recent data the differences between the observed and modeled OLR and albedo are well within these error estimates. The close agreement between the observed and modeled values, particularly for the most recent period, illustrates the value of the methodology. It also contributes to the validation of the GERB products and increases confidence in the quality of the data, prior to their release.
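The core of such a near-real-time comparison is computing difference statistics between observed and simulated fluxes over matched clear-sky ocean scenes. A minimal sketch of that step (the array contents are hypothetical; the ±5–10 W m⁻² figure above is the yardstick the bias would be judged against):

```python
import numpy as np

def flux_comparison(observed, modeled):
    """Mean bias and RMS difference between matched flux fields (W m^-2)."""
    d = np.asarray(observed, float) - np.asarray(modeled, float)
    return {"mean_bias": float(d.mean()),
            "rms_diff": float(np.sqrt((d ** 2).mean()))}

# Hypothetical matched clear-sky OLR samples over ocean (W m^-2).
gerb_olr = np.array([287.1, 291.4, 285.0, 289.7])
model_olr = np.array([285.6, 290.2, 286.1, 288.9])
print(flux_comparison(gerb_olr, model_olr))  # compare bias with the error budget
```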
Abstract:
To examine the basis of emotional changes to the voice, physiological and electroglottal measures were combined with acoustic speech analysis of 30 men performing a computer task in which they lost or gained points under two levels of difficulty. Predictions of the main effects of difficulty and reward on the voice were not borne out by the data. Instead, vocal changes depended largely on interactions between gain versus loss and difficulty. The rate at which the vocal folds open and close (fundamental frequency; f0) was higher for loss than for gain when difficulty was high, but not when difficulty was low. Electroglottal measures revealed that f0 changes corresponded to shorter glottal open times for the loss conditions. Longer closed and shorter open phases were consistent with raised laryngeal tension in difficult loss conditions. Similarly, skin conductance indicated higher sympathetic arousal in loss than gain conditions, particularly when difficulty was high. The results provide evidence of the physiological basis of affective vocal responses, confirming the utility of measuring physiology and voice in the study of emotion.
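The acoustic backbone of such analyses is estimating f0 frame-by-frame from the speech waveform. A minimal autocorrelation-based sketch (a generic method, not necessarily the study's analysis pipeline; the search band is a typical male-voice range):

```python
import numpy as np

def estimate_f0(frame, fs, fmin=75.0, fmax=300.0):
    """Fundamental frequency (Hz) of one voiced frame via autocorrelation."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(fs / fmax), int(fs / fmin)   # lag range for the f0 band
    lag = lo + int(np.argmax(ac[lo:hi]))
    return fs / lag

# Hypothetical 40 ms frame of a 120 Hz voiced sound sampled at 16 kHz.
fs, f0_true = 16000, 120.0
t = np.arange(int(0.04 * fs)) / fs
frame = np.sin(2 * np.pi * f0_true * t) + 0.3 * np.sin(2 * np.pi * 2 * f0_true * t)
print(estimate_f0(frame, fs))  # ~120 Hz
```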
Abstract:
Two models for predicting Septoria tritici on winter wheat (cv. Riband) were developed using a program based on an iterative search for correlations between disease severity and weather. Data from four consecutive cropping seasons (1993/94 to 1996/97) at nine sites throughout England were used. A qualitative model predicted the presence or absence of Septoria tritici (at a 5% severity threshold within the top three leaf layers) using winter temperature (January/February) and wind speed up to about the first node detectable growth stage. For sites above the disease threshold, a quantitative model predicted the severity of Septoria tritici using rainfall during stem elongation. A test statistic was derived to test the validity of the iterative search used to obtain both models. This statistic was used in combination with bootstrap analyses, in which the search program was rerun using weather data from previous years, and therefore uncorrelated with the disease data, to investigate how likely correlations such as the ones found in our models would have been in the absence of genuine relationships.
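The validity check described amounts to building a null distribution for the search: rerun the same correlation search on weather records that cannot be related to the disease data, and see how often equally strong correlations arise by chance. A minimal sketch of that logic (fully synthetic data; the real search spans many weather variables and time windows):

```python
import numpy as np

rng = np.random.default_rng(1)

def best_abs_correlation(disease, weather):
    """Strongest |r| the search can find across candidate weather predictors."""
    return max(abs(np.corrcoef(disease, col)[0, 1]) for col in weather.T)

n_sites, n_predictors = 36, 50
disease = rng.normal(size=n_sites)
weather = rng.normal(size=(n_sites, n_predictors))
observed = best_abs_correlation(disease, weather)

# Null distribution: repeat the search with weather that is unrelated by
# construction (here freshly simulated; the study used earlier years' records).
null = [best_abs_correlation(disease, rng.normal(size=(n_sites, n_predictors)))
        for _ in range(500)]
p_value = float(np.mean([x >= observed for x in null]))
print(observed, p_value)  # large p: such correlations arise easily by chance
```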