990 results for "Addition techniques"
Relevance: 30.00%

Publisher:

Abstract:

This dissertation research project addressed the question of how hydrologic restoration of the Everglades is affecting the nutrient dynamics of marsh ecosystems in the southern Everglades. These effects were analyzed by quantifying nitrogen (N) cycle dynamics in the region. I utilized stable isotope tracer techniques to investigate nitrogen uptake and cycling among the major ecosystem components of the freshwater marsh system. I recorded the natural isotopic signatures (δ15N and δ13C) of major ecosystem components from the three major watersheds of the Everglades: Shark River Slough, Taylor Slough, and the C-111 basin. Analysis of δ15N and δ13C natural abundance data was used to demonstrate the spatial extent to which nitrogen from anthropogenic or naturally enriched sources is entering the marshes of the Everglades. In addition, I measured the fluxes of N between various ecosystem components at both near-canal and estuarine ecotone locations. Lastly, I investigated the effect of three phosphorus load treatments (0.00 mg P m-2, 6.66 mg P m-2, and 66.6 mg P m-2) on the rate and magnitude of ecosystem N uptake and N cycling. The δ15N and δ13C natural abundance data supported the hypothesis that ecosystem components from near-canal sites have heavier, more enriched δ15N isotopic signatures than those from downstream sites. The natural abundance data also showed that the marshes of the southern Everglades are acting as a sink for isotopically heavier, canal-borne dissolved inorganic nitrogen (DIN) and as a source of "new" marsh-derived dissolved organic nitrogen (DON). In addition, the 15N mesocosm data showed rapid assimilation of the 15N tracer by the periphyton component and delayed N uptake by the soil and macrophyte components in the southern Everglades.
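The natural-abundance approach described above rests on end-member mixing: if canal-borne DIN is isotopically heavier than marsh-derived N, the fraction of canal N in a sample follows from a two-end-member mixing equation. A minimal sketch in Python, with purely hypothetical δ15N values (the dissertation's measured signatures are not reproduced here):

```python
def canal_n_fraction(d15n_sample, d15n_canal, d15n_marsh):
    """Two-end-member isotope mixing: fraction of N derived from the
    isotopically enriched canal source, given the sample's signature."""
    return (d15n_sample - d15n_marsh) / (d15n_canal - d15n_marsh)

# Hypothetical signatures (per mil): enriched canal DIN vs. background marsh N
frac = canal_n_fraction(d15n_sample=6.0, d15n_canal=9.0, d15n_marsh=3.0)
print(f"canal-derived N fraction: {frac:.2f}")  # 0.50
```

A sample signature midway between the two end members implies roughly half of its N came from the canal source, under the usual assumption that only two sources mix.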

Relevance: 30.00%

Publisher:

Abstract:

The nation's freeway systems are becoming increasingly congested, and traffic incidents are a major contributor. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity; they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic away from incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. The duration of an incident is also affected by many contributing factors; determining and understanding these factors can help in identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, with limited success.

This dissertation research attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid traffic diversion decisions during an ongoing incident. Multiple data mining techniques were applied and evaluated: multiple linear regression and a decision-tree-based method for the offline models, and a rule-based method and the M5P tree algorithm for the online models.

The results show that the models can, in general, achieve high prediction accuracy within acceptable intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software was developed to implement the models in the existing software system of District 4 FDOT for operational use.
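The abstract does not specify the online rules themselves; purely as an illustration of the rule-based idea, the sketch below predicts a clearance time from a few hypothetical incident attributes. The factor names and minute thresholds are made up for illustration, not FDOT's actual rules:

```python
def predict_duration_min(incident):
    """Rule-based sketch of an online incident-duration predictor.
    Each rule adds time for a factor that plausibly slows clearance."""
    minutes = 30                          # illustrative base duration
    if incident.get("severity") == "major":
        minutes += 45                     # severe crashes take longer to clear
    if incident.get("lanes_blocked", 0) >= 2:
        minutes += 20                     # multi-lane blockages need more response
    if incident.get("heavy_vehicle"):
        minutes += 30                     # towing a truck extends clearance
    if incident.get("night"):
        minutes += 10                     # reduced responder availability
    return minutes

print(predict_duration_min({"severity": "major", "lanes_blocked": 2}))  # 95
```

A real rule-based model would induce such rules (and their thresholds) from the incident database rather than hand-code them.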

Relevance: 30.00%

Publisher:

Abstract:

Recently, wireless network technology has grown at such a pace that scientific research becomes practical reality in a very short time span. Mobile wireless communications have witnessed the adoption of several generations, each complementing and improving the former. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are evaluating candidate solutions for the 4G system. 4G is a collection of technologies and standards that will allow a range of ubiquitous computing and wireless communication architectures. The researcher considers one of the most important characteristics of future 4G mobile systems to be the ability to guarantee reliable communications from 100 Mbps for high-mobility links up to 1 Gbps for low-mobility users, in addition to high efficiency in spectrum usage. In mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable, and so must rely upon satellite coverage. Good modulation and access techniques are also required to transmit high data rates over satellite links to mobile users; these techniques must adapt to the characteristics of the satellite channel and use the allocated bandwidth efficiently. Satellite links are fading channels when used by mobile users. Measures designed to cope with these fading environments include: (1) spatial diversity (a two-receive-antenna configuration); (2) time diversity (channel interleaving/spreading techniques); and (3) upper-layer FEC.

The author proposes the use of OFDM (Orthogonal Frequency Division Multiplexing) for the satellite link, increasing the time diversity. This technique allows an increase of the data rate, as primarily required by multimedia applications, while making optimal use of the available bandwidth. In addition, this dissertation addresses the use of cooperative satellite communications for hybrid satellite/terrestrial networks. With this technique, satellite coverage can be extended to areas where there is no direct link to the satellite. For this purpose, a good channel model is necessary.
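The OFDM transmitter proposed above can be sketched in a few lines: complex symbols are mapped onto orthogonal subcarriers with an inverse FFT, and a cyclic prefix is prepended to absorb the delay spread of the fading satellite channel. A minimal NumPy sketch; the parameters (64 subcarriers, 16-sample prefix, QPSK mapping) are illustrative choices, not taken from the dissertation:

```python
import numpy as np

def ofdm_symbol(qam_symbols, cp_len=16):
    """Map one block of complex symbols onto orthogonal subcarriers via an
    inverse FFT, then prepend a cyclic prefix to absorb channel delay spread."""
    body = np.fft.ifft(qam_symbols)
    return np.concatenate([body[-cp_len:], body])   # CP + useful symbol

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=(64, 2))             # hypothetical QPSK payload
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
sym = ofdm_symbol(qpsk)
print(len(sym))                                     # 80: 16-sample CP + 64-sample body

# The receiver drops the CP and applies an FFT to recover the subcarrier symbols:
recovered = np.fft.fft(sym[16:])
```

Because the subcarriers are orthogonal, the FFT at the receiver recovers the transmitted symbols exactly on an ideal channel; on a fading channel, a per-subcarrier equalizer would follow.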

Relevance: 30.00%

Publisher:

Abstract:

The purpose of the current study was to model various cognitive and social processes that are believed to lead to false confessions. More specifically, this study manipulated the variables of experimenter expectancy, guilt or innocence of the suspect, and interrogation techniques using the Russano et al. (2005) paradigm. The primary measure of interest was the likelihood of the participant signing the confession statement. By manipulating experimenter expectancy, the current study sought to further explore the social interactions that may occur in the interrogation room. In addition, whereas in past experiments the interrogator has typically been restricted to the use of one or two interrogation techniques, in the present study interrogators were permitted to select from 15 different interrogation techniques when attempting to solicit a confession from participants.

Consistent with Russano et al. (2005), guilty participants (94%) were more likely to confess to the act of cheating than innocent participants (31%). The variable of experimenter expectancy did not affect confession rates, the length of interrogation, or the type of interrogation techniques used. Path analysis revealed that participants' feelings of pressure and their weighing of consequences were associated with the signing of the confession statement. The findings suggest that the guilt or innocence of the participant, the participant's perceptions of the interrogation situation, and the length of interrogation play a pivotal role in the signing of the confession statement. Further examination of these variables may provide researchers with a better understanding of the relationship between interrogations and confessions.

Relevance: 30.00%

Publisher:

Abstract:

The future power grid will effectively utilize renewable energy resources and distributed generation to respond to energy demand while incorporating information technology and communication infrastructure for optimum operation. This dissertation contributes to the development of real-time techniques for wide-area monitoring and for secure real-time control and operation of hybrid power systems.

To handle the increased level of real-time data exchange, this dissertation develops a supervisory control and data acquisition (SCADA) system equipped with a state estimation scheme driven by the real-time data. The system was verified on a specially developed laboratory-based test bed, a hardware and software platform that emulates the scenarios of a real hybrid power system with a high degree of similarity and capability relative to practical utility systems. It includes phasor measurements at hundreds of measurement points, obtained from a specially developed laboratory-based Phasor Measurement Unit (PMU) used on the interconnected system alongside existing commercial PMUs. The studies tested included a new technique for detecting partially islanded microgrids, as well as several real-time techniques for synchronization and parameter identification of hybrid systems.

Moreover, because of the widespread integration of renewable energy resources through DC microgrids, this dissertation examines several practical cases for improving the interoperability of such systems. The increasing number of small, dispersed generating stations, and their need to connect quickly and properly to AC grids, also led this work to explore the challenges that arise in synchronizing generators to the grid and to introduce a Dynamic Brake system that improves the process of connecting distributed generators to the power grid.

Real-time operation and control require secure data communication. A research effort in this dissertation therefore developed a Trusted Sensing Base (TSB) process for data communication security. The innovative TSB approach improves the security of the power grid as a cyber-physical system; it is based on available GPS synchronization technology and provides protection against confidentiality attacks in critical power system infrastructures.
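The core computation inside a PMU like the laboratory-developed unit described above is a one-cycle DFT phasor estimate: correlating the sampled waveform with a cosine and sine at the fundamental frequency yields the magnitude and phase of the synchrophasor. A minimal sketch; the sampling parameters and test waveform are illustrative, not from the dissertation's hardware:

```python
import math

def phasor_estimate(samples, n):
    """One-cycle DFT phasor estimate over n samples of a waveform:
    returns (rms_magnitude, phase_rad) of the fundamental component."""
    re = (2 / n) * sum(s * math.cos(2 * math.pi * k / n) for k, s in enumerate(samples))
    im = -(2 / n) * sum(s * math.sin(2 * math.pi * k / n) for k, s in enumerate(samples))
    peak = math.hypot(re, im)
    return peak / math.sqrt(2), math.atan2(im, re)

n = 64  # hypothetical samples per 60 Hz cycle
wave = [10 * math.cos(2 * math.pi * k / n + 0.5) for k in range(n)]
mag, ph = phasor_estimate(wave, n)
print(round(mag, 3), round(ph, 3))  # RMS magnitude ~7.071, phase ~0.5 rad
```

In a real PMU, the sampling clock is disciplined by GPS so that phases measured at distant substations share a common time reference, which is what makes wide-area comparisons (and the GPS-based TSB security scheme above) possible.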

Relevance: 30.00%

Publisher:

Abstract:

Recently, wireless network technology has grown at such a pace that scientific research becomes practical reality in a very short time span. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are evaluating candidate solutions for the 4G system. The researcher considers one of the most important characteristics of future 4G mobile systems to be the ability to guarantee reliable communications at high data rates, in addition to high efficiency in spectrum usage. In mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable. The results show that good modulation and access techniques are also required to transmit high data rates over satellite links to mobile users. The dissertation proposes the use of OFDM (Orthogonal Frequency Division Multiplexing) for the satellite link, increasing the time diversity. This technique allows an increase of the data rate, as primarily required by multimedia applications, while making optimal use of the available bandwidth. In addition, this dissertation addresses the use of cooperative satellite communications for hybrid satellite/terrestrial networks. With this technique, satellite coverage can be extended to areas where there is no direct link to the satellite. The issue of cooperative satellite communications is solved through a new algorithm that forwards the received data from the fixed node to the mobile node. This algorithm is efficient because it avoids unnecessary transmissions and is based on signal-to-noise ratio (SNR) measurements.
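The forwarding algorithm itself is not detailed in the abstract; the sketch below illustrates the general SNR-gated relaying idea it describes: the fixed node forwards to the mobile node only when the mobile's direct satellite link is too weak and the relay link is good enough, so that redundant transmissions are avoided. The threshold value is purely hypothetical:

```python
def should_forward(direct_snr_db, relay_snr_db, threshold_db=10.0):
    """SNR-gated relaying sketch: forward from the fixed node only when the
    mobile's direct satellite link fails the SNR threshold AND the relay
    (fixed-to-mobile) link passes it, avoiding unnecessary transmissions."""
    direct_ok = direct_snr_db >= threshold_db
    relay_ok = relay_snr_db >= threshold_db
    return (not direct_ok) and relay_ok

print(should_forward(direct_snr_db=4.0, relay_snr_db=15.0))   # True: relaying helps
print(should_forward(direct_snr_db=12.0, relay_snr_db=15.0))  # False: direct link suffices
```

A deployed algorithm would measure both SNRs continuously and could also adapt the threshold to the required data rate.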

Relevance: 30.00%

Publisher:

Abstract:

Unequal improvements in processor and I/O speeds have made many applications, such as databases and operating systems, increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors have been proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we also study traditional mirroring to provide a common basis for comparison.

Relevance: 30.00%

Publisher:

Abstract:

Atomisation of an aqueous solution for tablet film coating is a complex process with multiple factors determining droplet formation and properties. The importance of droplet size for an efficient process and a high-quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogeneous coatings whilst simultaneously avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film coat characteristics was investigated using X-ray microcomputed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality-by-design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for production of droplets at a small (20 μm) and a large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction, with the volume median diameter taken as the response. DOE yielded information on the relationship that three critical process parameters (pump rate, atomisation pressure, and coating-polymer concentration) had with droplet size. The model generated was robust, scoring highly for model fit (R2 = 0.977), predictability (Q2 = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised bed coating of tablet cores was performed with either small or large droplets, followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence that smaller droplets formed thinner, more uniform and less porous film coats.
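The kind of response-surface model the DOE produced, with linear, quadratic, and interaction terms relating process parameters to droplet size, can be sketched as an ordinary least-squares fit. The data, parameter ranges, and coefficients below are synthetic stand-ins, not the study's:

```python
import numpy as np

# Synthetic response surface mirroring the reported structure:
# d = b0 + b1*pump + b2*pressure + b3*pressure^2 + b4*(pump * pressure)
rng = np.random.default_rng(1)
pump = rng.uniform(2, 10, 40)        # hypothetical pump rates
pressure = rng.uniform(0.5, 3, 40)   # hypothetical atomisation pressures
true = 60 + 3 * pump - 25 * pressure + 4 * pressure**2 - 0.8 * pump * pressure
droplet = true + rng.normal(0, 1, 40)          # "measured" median diameters

# Design matrix with intercept, linear, quadratic, and interaction columns
X = np.column_stack([np.ones_like(pump), pump, pressure,
                     pressure**2, pump * pressure])
coef, *_ = np.linalg.lstsq(X, droplet, rcond=None)

pred = X @ coef
ss_res = np.sum((droplet - pred) ** 2)
ss_tot = np.sum((droplet - droplet.mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.3f}")
```

Software such as MODDE or Design-Expert automates exactly this kind of fit and adds the predictability (Q2) and validity diagnostics quoted in the abstract.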

Relevance: 30.00%

Publisher:

Abstract:

The control of radioactive backgrounds will be key in the search for neutrinoless double beta decay at the SNO+ experiment. Several aspects of the SNO+ backgrounds have been studied. The SNO+ tellurium purification process may require ultra-low-background ethanol as a reagent. A low-background assay technique for ethanol was developed and used to identify a source of ethanol with measured 238U and 232Th concentrations below 2.8 × 10^-13 g/g and 10^-14 g/g, respectively. It was also determined that at least 99.997% of the ethanol can be removed from the purified tellurium using forced air flow in order to reduce 14C contamination. In addition, a quality-control technique using an oxygen sensor was studied to monitor 222Rn contamination due to air leaking into the SNO+ scintillator during transport. The expected sensitivity of the technique is 0.1 mBq/L or better, depending on the oxygen sensor used. Finally, the dependence of SNO+ neutrinoless double beta decay sensitivity on internal background levels was studied using Monte Carlo simulation. The half-life limit for neutrinoless double beta decay of 130Te after 3 years of operation was found to be 4.8 × 10^25 years under default conditions.

Relevance: 30.00%

Publisher:

Abstract:

The section of CN railway between Vancouver and Kamloops runs along the base of many hazardous slopes, including the White Canyon, located just outside the town of Lytton, BC. The slope has a history of frequent rockfall activity, which presents a hazard to the railway below. Rockfall inventories can be used to understand the frequency-magnitude relationship of events on hazardous slopes; however, it can be difficult to consistently and accurately identify rockfall source zones and volumes on large slopes with frequent activity, leaving many inventories incomplete. We have studied this slope as part of the Canadian Railway Ground Hazard Research Program and have collected remote sensing data, including terrestrial laser scanning (TLS), photographs, and photogrammetry data, since 2012, using change detection to identify rockfalls on the slope. The objective of this thesis is to use a subset of these data to understand how rockfalls identified from TLS data can inform the frequency-magnitude relationship of rockfalls on the slope. This includes incorporating both new and existing methods into a semi-automated workflow that extracts rockfall events from the TLS data. We show that these methods can identify events as small as 0.01 m3 and that the duration between scans can affect the frequency-magnitude relationship of the rockfalls. We also show that by incorporating photogrammetry data into the analysis, we can create a 3D geological model of the slope and use it to classify rockfalls by lithology, to further understand rockfall failure patterns. When relating rockfall activity to triggering factors, we found that the amount of precipitation occurring over the winter affects the overall rockfall frequency for the remainder of the year.

These results can provide the railways with a more complete inventory of events compared to records created through track inspection or through rockfall monitoring systems installed on the slope. In addition, the database can be used to understand the spatial and temporal distribution of events, and the results can serve as an input to rockfall modelling programs.
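A frequency-magnitude relationship of the kind discussed above is commonly modelled as a power law, N(>=V) = a * V^-b, fitted in log-log space. A sketch on synthetic volumes down to the 0.01 m3 detection limit mentioned above; the fitted parameters are illustrative, not the White Canyon values:

```python
import numpy as np

def freq_magnitude_fit(volumes_m3):
    """Fit the cumulative frequency-magnitude relation N(>=V) = a * V**(-b)
    by least squares in log-log space; returns (b, a)."""
    v = np.sort(np.asarray(volumes_m3))
    n_exceed = np.arange(len(v), 0, -1)      # number of events >= each volume
    slope, intercept = np.polyfit(np.log10(v), np.log10(n_exceed), 1)
    return -slope, 10 ** intercept

rng = np.random.default_rng(2)
# Synthetic heavy-tailed volumes above a 0.01 m^3 detection limit
vols = 0.01 * (1 + rng.pareto(0.7, size=500))
b, a = freq_magnitude_fit(vols)
print(f"b-value ~ {b:.2f}")
```

The b-value (slope) summarises how quickly large events become rare, which is why scan interval matters: coalescing several small rockfalls between scans into one apparent large event flattens the fitted slope.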

Relevance: 30.00%

Publisher:

Abstract:

The latest techniques for the fabrication of high-power laser targets, using processes developed for the manufacture of Micro-Electro-Mechanical System (MEMS) devices, are discussed. These laser targets are designed to meet the needs of the increased shot numbers available in the latest laser facilities. Traditionally, laser targets have been fabricated using conventional machining or coarse etching processes and have been produced in quantities of tens to low hundreds. Such targets can be used for high-complexity experiments such as Inertial Fusion Energy (IFE) studies and can have many complex components that need assembly and characterisation with high precision. By using techniques common to MEMS devices and integrating them with an existing target fabrication capability, we are able to manufacture and deliver targets to these systems; this also enables us to manufacture novel targets that have not been possible using other techniques. In addition, developments are required in the positioning systems that deliver these targets to the laser focus, and a system that delivers the target to the focus of an F2 beam at 0.1 Hz is discussed.

Relevance: 30.00%

Publisher:

Abstract:

Conventional taught learning practices often have difficulty keeping students motivated and engaged. Video games, however, are very successful at sustaining high levels of motivation and engagement through a set of tasks for hours without apparent loss of focus. In addition, gamers solve complex problems within a gaming environment without feeling the fatigue or frustration they would typically experience with a comparable learning task. Based on this notion, the academic community is keen to explore methods that can deliver deep learner engagement and has shown increased interest in adopting gamification (the integration of gaming elements, mechanics, and frameworks into non-game situations and scenarios) as a means to increase student engagement and improve information retention. Its effectiveness when applied to education has been debated, though, as attempts have generally been restricted to one-dimensional approaches such as transposing a trivial reward system onto existing teaching materials and/or assessments. Nevertheless, a gamified, multi-dimensional, problem-based learning approach can yield improved results even when applied to a very complex and traditionally dry task like the teaching of computer programming, as shown in this paper. The presented quasi-experimental study used a combination of instructor feedback, a real-time sequence of scored quizzes, and live coding to deliver a fully interactive learning experience. More specifically, the "Kahoot!" Classroom Response System (CRS), the classroom version of the TV game show "Who Wants To Be A Millionaire?", and Codecademy's interactive platform formed the basis for a learning model that was applied to an entry-level Python programming course. Students were thus able to experience multiple interlocking methods similar to those commonly found in a top-quality game experience.

To assess gamification's impact on learning, empirical data from the gamified group were compared with those from a control group taught through a traditional learning approach similar to the one used with previous cohorts. Despite this being a relatively small-scale study, the results and findings for a number of key metrics, including attendance, downloading of course material, and final grades, were encouraging and indicated that the gamified approach was motivating and enriching for both students and instructors.

Relevance: 30.00%

Publisher:

Abstract:

There appears to be a limited but growing body of research on the sequential analysis and treatment of multiple types of evidence. The development of an integrated forensic approach is necessary to maximise evidence recovery and to ensure that a particular treatment is not detrimental to other types of evidence. This study aims to assess the effect of latent and blood mark enhancement techniques (e.g. fluorescence, ninhydrin, acid violet 17, black iron-oxide powder suspension) on the subsequent detection of saliva. Saliva detection was performed by means of a presumptive test (Phadebas®), analysis with a rapid stain identification (RSID) kit, and confirmatory DNA testing. Additional variables included a saliva depletion series, a number of different substrates with varying porosities, and different ageing periods. Examination and photography under white light and fluorescence were carried out before and after chemical enhancement. All enhancement techniques employed in this study (except Bluestar® Forensic Magnum luminol) resulted in improved visualisation of the saliva stains, although the inherent fluorescence of saliva was sometimes blocked after chemical treatment. The use of protein stains was, in general, detrimental to the detection of saliva. Positive results were less pronounced after the use of black iron-oxide powder suspension, cyanoacrylate fuming followed by BY40, and ninhydrin, when compared with the respective positive controls. The application of Bluestar® Forensic Magnum luminol and black magnetic powder proved to be the least detrimental, with no significant difference between the test results and the positive controls. Non-destructive fluorescence examination provided good visualisation; however, only the first few marks in the depletion series were observed. Of the samples selected for DNA analysis, only depletion 1 samples contained sufficient DNA for further processing using standard methodology.

The 28-day delay between sample deposition and collection resulted in a 5-fold reduction in the amount of usable DNA. When sufficient DNA was recovered, the enhancement techniques did not have a detrimental effect on the ability to generate DNA profiles. This study contributes to a strategy for maximising evidence recovery and efficiency in the detection of latent marks and saliva. The results demonstrate that most of the enhancement techniques employed in this study were not detrimental to the subsequent detection of saliva by means of presumptive, confirmatory and DNA tests.

Relevance: 30.00%

Publisher:

Abstract:

Software engineering best practices can significantly improve software development. However, the implementation of best practices requires skilled professionals, financial investment, and technical support to facilitate adoption and achieve the corresponding improvement. In this paper we propose a protocol for designing techniques to implement software engineering best practices. The protocol includes the identification and selection of the process to improve, the study of relevant standards and models, and the identification of the best practices associated with the process and of possible implementation techniques. In addition, technique design activities are defined in order to create or adapt techniques for implementing best practices in software development.

Relevance: 30.00%

Publisher:

Abstract:

The thesis provides a comprehensive analysis of the characterisation of two of the major figures in the Aeneid, Aeneas and Turnus. Particular attention is paid to their direct speeches, all of which are examined and, where relevant, compared to Homeric models and parallels. For this purpose considerable use is made of the indices in Knauer's Die Aeneis und Homer. A more general comparison is made between the dramatic (direct-speech) role of Aeneas and those of Homer's Achilles (Iliad) and Odysseus (Odyssey). The relationship between the direct and indirect speeches in the Aeneid is appraised from the viewpoint of the depiction of character, and reasons are given to suggest that it is not mere chance, or variety for its own sake, that certain speeches of Aeneas and Turnus are expressed in oratio obliqua. In addition, the narrative portrayal of Aeneas and Turnus is considered alongside that of the speeches, and a distinction is drawn between Vergil's direct method of characterisation (direct speeches) and his indirect methods (narrative/oratio obliqua). Inevitably, the analysis involves major consideration of the Roman values which pervade the work. All speeches, thoughts and actions of Aeneas and Turnus are assessed in terms of pietas, impietas, furor, virtus, ratio, clementia, humanitas, etc. It is shown that individual concepts (such as pietas and impietas) are reflected in Vergil's direct and indirect methods of characterisation. The workings of fate and their relevance to the concept of pietas are discussed throughout.