979 results for Ground analysis
Abstract:
Research councils, agencies, and researchers recognize the benefits of team-based health research. However, researchers involved in large-scale team-based research projects face multiple challenges as they seek to identify epistemological and ontological common ground. Typically, these challenges arise between quantitative and qualitative researchers, but they can also arise among qualitative researchers, particularly when the project involves multiple disciplinary perspectives. The authors use the convergent interviewing technique in their multidisciplinary research project to overcome these challenges. This technique assists them in developing common epistemological and ontological ground while enabling swift and detailed data collection and analysis. Although convergent interviewing is a relatively new method described primarily in marketing research, it compares and contrasts well with grounded theory and other techniques. The authors argue that this process provides a rigorous method to structure and refine research projects and requires researchers to identify, and be accountable for developing, a common epistemological and ontological position.
Abstract:
The development of scramjet propulsion for alternative launch and payload delivery capabilities has been composed largely of ground experiments for the last 40 years. With the goal of validating the use of short duration ground test facilities, a ballistic reentry vehicle experiment called HyShot was devised to achieve supersonic combustion in flight above Mach 7.5. It consisted of a double wedge intake and two back-to-back constant area combustors; one supplied with hydrogen fuel at an equivalence ratio of 0.34 and the other unfueled. Of the two flights conducted, HyShot 1 failed to reach the desired altitude due to booster failure, whereas HyShot 2 successfully accomplished both the desired trajectory and satisfactory scramjet operation. Postflight data analysis of HyShot 2 confirmed the presence of supersonic combustion during the approximately 3 s test window at altitudes between 35 and 29 km. Reasonable correlation between flight and some preflight shock tunnel tests was observed.
Abstract:
This chapter serves three very important functions within this collection. First, it aims to make the existence of FPDA better known both to gender and language researchers and to the wider community of discourse analysts, by outlining FPDA’s own theoretical and methodological approaches. This involves locating and positioning FPDA in relation to, yet in contradistinction to, the fields of discourse analysis with which it is most often compared: Critical Discourse Analysis (CDA) and, to a lesser extent, Conversation Analysis (CA). Secondly, the chapter serves a vital symbolic function: it aims to contest the authority of the more established theoretical and methodological approaches represented in this collection, which currently dominate the field of discourse analysis. FPDA considers that an established field like gender and language study will only thrive and develop if it is receptive to new ways of thinking, divergent methods of study, and approaches that question and contest received wisdoms or established methods. Thirdly, the chapter aims to introduce some new, experimental and ground-breaking FPDA work, including that by Harold Castañeda-Peña and Laurel Kamada (this volume). I indicate the different ways in which a number of young scholars are imaginatively developing the possibilities of an FPDA approach in their specific gender and language projects.
Abstract:
A dry matrix application for matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI MSI) was used to profile the distribution of 4-bromophenyl-1,4-diazabicyclo(3.2.2)nonane-4-carboxylate, monohydrochloride (BDNC, SSR180711) in rat brain tissue sections. Matrix application involved applying layers of finely ground dry alpha-cyano-4-hydroxycinnamic acid (CHCA) to the surface of tissue sections thaw-mounted onto MALDI targets. It was not possible to detect the drug when applying matrix in a standard aqueous-organic solvent solution. The drug was detected at higher concentrations in specific regions of the brain, particularly the white matter of the cerebellum. Pseudo-multiple reaction monitoring imaging was used to confirm that the observed distribution corresponded to the target compound. The semiquantitative data obtained from imaging signal intensities were confirmed by laser microdissection of specific brain regions directed by the imaging, followed by hydrophilic interaction chromatography combined with a quantitative high-resolution mass spectrometry method. This study illustrates that a dry matrix coating is a valuable and complementary matrix application method for the analysis of small polar drugs and metabolites, and that it can be used for semiquantitative analysis.
Abstract:
The thesis examines Kuhn's (1962, 1970) concept of paradigm, assesses how it is employed for mapping intellectual terrain in the social sciences, and evaluates its use in research based on multiple theory positions. In so doing it rejects both the thesis of total paradigm 'incommensurability' (Kuhn, 1962) and that of liberal 'translation' (Popper, 1970), in favour of a middle ground through the 'language-game of everyday life' (Wittgenstein, 1953). The thesis ultimately argues for the possibility of being 'trained into' new paradigms, given the premise that 'unorganised experience cannot order perception' (Phillips, 1977). In conducting multiple-paradigm research the analysis uses the Burrell and Morgan (1979) model to examine the work organisation of a large provincial fire service. This analysis provides, firstly, a 'functionalist' assessment of work design, demonstrating inter alia the decrease in reported motivation with length of service; secondly, an 'interpretive' portrayal of the daily accomplishment of task routines, highlighting the discretionary and negotiated nature of the day's events; thirdly, a 'radical humanist' analysis of workplace ideology, demonstrating the hegemonic role of officer training practices; and finally, a 'radical structuralist' description of the labour process, focusing on the establishment of a 'normal working day'. Although the argument is made for the possibility of conducting multiple-paradigm research, the conclusion stresses the many institutional pressures serving to offset such development.
Abstract:
A key feature of ‘TESOL Quarterly’, a leading journal in the world of TESOL/applied linguistics, is its ‘Forum’ section, which invites ‘responses and rebuttals’ from readers to any of its articles. These ‘responses or rebuttals’ form the focus of this research. In the interchanges between readers reacting to earlier research articles in TESOL Quarterly and authors responding to those reactions, I examine the texts for evidence of genre-driven structure, whether shared between both ‘reaction’ and ‘response’ sections or peculiar to each, and attempt to determine the precise nature of the intended communicative purpose in particular and the implications for academic debate in general. The intended contribution of this thesis is to provide an analysis of how authors of research articles and their critics pursue their efforts beyond the research article that precipitated these exchanges in order to be recognized by their discourse community as, in the terminology of Swales (1981:51), ‘Primary Knowers’. Awareness of any principled generic process identified in this thesis may be of significance to practitioners in the applied linguistics community in their quest to establish academic reputation and in their pursuit of professional development. These findings may also be of use in triggering productive community discussion as a result of the questions they raise concerning the present nature of academic debate. Looking beyond the construction and status of the texts themselves, I inquire into the kind of ideational and social organization such exchanges keep in place and examine an alternative view of interaction. This study breaks new ground in two major ways. To the best of my knowledge, it is the first exploration of a bipartite, intertextual structure laying claim to genre status.
Secondly, in its recourse to the comments of the writers themselves, rather than relying exclusively on the evidence of their texts as most studies of genre do, this thesis offers an expanded opportunity to discuss perhaps the most interesting aspect of genre analysis: the light it throws on social ends and on the role of genre in determining the nature of current academic debate as it emerges here.
Abstract:
The field of free radical biology and medicine continues to move at a tremendous pace, with a constant flow of ground-breaking discoveries. The following collection of papers in this issue of Biochemical Society Transactions highlights several key areas of topical interest, including the crucial role of validated measurements of radicals and reactive oxygen species in underpinning nearly all research in the field, the important advances being made as a result of the overlap of free radical research with the reinvigorated field of lipidomics (driven in part by innovations in MS-based analysis), the acceleration of new insights into the role of oxidative protein modifications (particularly to cysteine residues) in modulating cell signalling, and the effects of free radicals on the functions of mitochondria, extracellular matrix and the immune system. In the present article, we provide a brief overview of these research areas, but, throughout this discussion, it must be remembered that it is the availability of reliable analytical methodologies that will be a key factor in facilitating continuing developments in this exciting research area.
Abstract:
The application of high-power voltage-source converters (VSCs) to multiterminal dc networks is attracting research interest. The development of VSC-based dc networks is constrained by the lack of operational experience, the immaturity of appropriate protective devices, and the lack of appropriate fault analysis techniques. VSCs are vulnerable to dc-cable short-circuit and ground faults due to the high discharge current from the dc-link capacitance. However, faults occurring along the interconnecting dc cables are most likely to threaten system operation. In this paper, cable faults in VSC-based dc networks are analyzed in detail with the identification and definition of the most serious stages of the fault that need to be avoided. A fault location method is proposed because this is a prerequisite for an effective design of a fault protection scheme. It is demonstrated that it is relatively easy to evaluate the distance to a short-circuit fault using voltage reference comparison. For the more difficult challenge of locating ground faults, a method of estimating both the ground resistance and the distance to the fault is proposed by analyzing the initial stage of the fault transient. Analysis of the proposed method is provided and is based on simulation results, with a range of fault resistances, distances, and operational conditions considered.
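As a rough illustration of the distance-estimation idea described above (not the paper's actual algorithm), the faulted loop during the initial capacitor-discharge stage can be approximated as a series R-L circuit whose resistance and inductance scale with distance to the fault, so a one-parameter least-squares fit of terminal voltage against current recovers that distance. All names and parameter values below are hypothetical placeholders.

```python
# Sketch: estimate distance to a dc-cable short-circuit fault from terminal
# measurements, assuming v(t) = d*r*i(t) + d*l*di/dt(t) (fault resistance and
# distributed capacitance neglected). Per-km cable parameters are invented.
import numpy as np

def estimate_fault_distance(v, i, didt, r_per_km, l_per_km):
    """Least-squares fit of v = d*(r*i + l*di/dt) over a sample window."""
    phi = r_per_km * np.asarray(i) + l_per_km * np.asarray(didt)  # regressor
    v = np.asarray(v)
    return float(phi @ v / (phi @ phi))  # closed-form 1-parameter solution

# Synthetic check: fabricate data for a fault 12 km down the cable.
r, l, d_true = 0.015, 0.2e-3, 12.0           # ohm/km, H/km, km (hypothetical)
t = np.linspace(0, 2e-3, 200)
i = 1e3 * np.sin(2 * np.pi * 400 * t)        # fabricated discharge current
didt = np.gradient(i, t)
v = d_true * (r * i + l * didt)
print(round(estimate_fault_distance(v, i, didt, r, l), 2))  # → 12.0
```

On clean synthetic data the fit returns the true distance exactly; with real transients, measurement noise and the neglected fault resistance would bias the estimate, which is why the paper analyzes the fault transient in more detail.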
Abstract:
The experience accumulated over many years in the polyparametric cognitive modeling of different physiological processes (electrocardiogram, electroencephalogram, electroreovasogram and others), and the diagnostic methods developed on this basis, provide grounds for formulating a new methodology of system analysis in biology. The gist of the methodology consists of the parametrization of fractals of electrophysiological processes, a matrix description of the functional state of an object with a unified set of parameters, and the construction of a polyparametric cognitive geometric model with artificial intelligence algorithms. The geometric model displays parameter relationships in a form adequate to the requirements of the systems approach. The objective character of the model elements and the high degree of formalization, which facilitates the use of mathematical methods, are advantages of these models. At the same time, the geometric images are easily interpreted in physiological and clinical terms. Polyparametric modeling is an object-oriented tool possessing advanced functional facilities and several distinctive features.
Abstract:
Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for nonhazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires applying a performance-based methodology to Florida landfills. PCC should be based on whether the landfill is a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of the appropriate PCC period for Florida’s landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, Davie Landfill was identified as the case study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data for leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater against the maximum contaminant level (MCL). In addition, projected gas quantities were estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment.
These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered and possible solutions proposed. Based on the results of PCC performance integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above the MCL and surveying of cap integrity should continue. The parameters that drive longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
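The kind of leachate trend projection described above can be sketched as a first-order (exponential) decay fit whose crossing of the MCL gives an estimated end of monitoring. This is a hypothetical illustration, not the Geosyntec decision system; the contaminant record and MCL value are invented.

```python
# Sketch: fit ln(C) = a + b*t to a monitoring record by least squares and
# solve for the year the fitted concentration falls to the MCL.
import math

def years_until_below_mcl(years, conc, mcl):
    """Return t where the fitted exponential decay crosses the MCL."""
    n = len(years)
    ly = [math.log(c) for c in conc]
    tbar = sum(years) / n
    ybar = sum(ly) / n
    b = sum((t - tbar) * (y - ybar) for t, y in zip(years, ly)) / \
        sum((t - tbar) ** 2 for t in years)
    a = ybar - b * tbar
    return (math.log(mcl) - a) / b

# Invented monitoring record: contaminant with an assumed MCL of 5 ug/L.
years = [0, 2, 4, 6, 8]
conc = [40.0, 29.5, 21.8, 16.1, 11.9]   # ug/L, roughly exponential decay
print(round(years_until_below_mcl(years, conc, 5.0), 1))
```

A real assessment would of course test the decay assumption, handle non-detects, and propagate uncertainty rather than read a single crossing year off the fit.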
Abstract:
With the developments in computing and communication technologies, wireless sensor networks have become popular in a wide range of application areas such as health, military, environment, and habitat monitoring. Moreover, wireless acoustic sensor networks have been widely used for target tracking applications due to their passive nature, reliability, and low cost. Traditionally, acoustic sensor arrays built in linear, circular, or other regular shapes are used for tracking acoustic sources. Maintaining the relative geometry of the acoustic sensors in the array is vital for accurate target tracking, which greatly reduces the flexibility of the sensor network. To overcome this limitation, we propose using only a single acoustic sensor at each sensor node. This design greatly improves the flexibility of the sensor network and makes it possible to deploy the network in remote or hostile regions through air-drop or other stealth approaches. Acoustic arrays are capable of performing target localization or generating bearing estimations on their own. With only a single acoustic sensor, however, sensor nodes cannot generate such measurements. Thus, self-organization of sensor nodes into virtual arrays to perform target localization is essential. We developed an energy-efficient and distributed self-organization algorithm for target tracking using wireless acoustic sensor networks. The major error sources of the localization process were studied, and an energy-aware node selection criterion was developed to minimize target localization errors. Using this criterion, the self-organization algorithm selects a near-optimal localization sensor group to minimize target tracking errors. In addition, a message passing protocol was developed to implement the self-organization algorithm in a distributed manner.
To extend sensor network lifetime, energy conservation was built into the self-organization algorithm through a sleep-wakeup management mechanism with a novel cross-layer adaptive wakeup-probability adjustment scheme. The simulation results confirm that the developed self-organization algorithm provides satisfactory target tracking performance. Moreover, the energy saving analysis confirms the effectiveness of the cross-layer power management scheme in extending sensor network lifetime without degrading target tracking performance.
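A heavily simplified sketch of what an energy-aware node selection criterion might look like (the thesis's actual criterion is not reproduced here): each candidate node is scored by its distance to the current target estimate and its remaining battery, and the k best-scoring nodes form the localization group. All node data and weights are invented.

```python
# Sketch: greedy energy-aware selection of a localization sensor group.
import math

def select_nodes(nodes, target, k, w_dist=1.0, w_energy=1.0):
    """nodes: list of (id, (x, y), energy in [0,1]). Lower score is better."""
    def score(n):
        _, (x, y), e = n
        d = math.hypot(x - target[0], y - target[1])
        return w_dist * d + w_energy * (1.0 - e)  # near + well-charged wins
    return [n[0] for n in sorted(nodes, key=score)[:k]]

nodes = [
    ("a", (0.0, 0.0), 0.9),
    ("b", (1.0, 0.0), 0.2),   # close but nearly drained
    ("c", (0.5, 0.5), 0.8),
    ("d", (5.0, 5.0), 1.0),   # fully charged but far away
]
print(select_nodes(nodes, target=(0.5, 0.0), k=2))  # → ['a', 'c']
```

The real criterion would weigh geometric dilution of precision of the whole group rather than per-node distance, but the trade-off it balances, localization error against residual energy, is the same.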
Abstract:
Drillhole-determined sea-ice thickness was compared with values derived remotely using a portable small-offset loop-loop steady-state electromagnetic (EM) induction device during expeditions to Fram Strait and the Siberian Arctic, under typical winter and summer conditions. Simple empirical transformation equations are derived to convert measured apparent conductivity into ice thickness. Despite the extreme seasonal differences in sea-ice properties revealed by ice core analysis, the transformation equations vary little between winter and summer. Thus, the EM induction technique operated on the ice surface in the horizontal dipole mode yields accurate results, within 5 to 10% of the drillhole-determined thickness over level ice in both seasons. The robustness of the induction method with respect to seasonal extremes is attributed to the low salinity of the brine or meltwater filling the extensive pore space in summer. Accordingly, the average bulk ice conductivity for summer multiyear sea ice derived according to Archie's law amounts to 23 mS/m, compared to 3 mS/m for winter conditions. These mean conductivities cause only minor differences in the EM response, as shown by means of 1-D modeling. However, under summer conditions the range of ice conductivities is wider. Along with the widespread occurrence of surface melt ponds and freshwater lenses underneath the ice, this causes greater scatter in the apparent conductivity/ice thickness relation, which can result in larger deviations between EM-derived and drillhole-determined thicknesses in summer than in winter.
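Empirical EM transformation equations of the kind derived above are commonly negative-exponential fits of apparent conductivity against thickness, inverted to read thickness off a measured conductivity. The sketch below uses invented coefficients, not the study's equations.

```python
# Sketch: invert an assumed empirical fit
#   sigma_a = c0 + c1 * exp(-c2 * z)
# to convert measured apparent conductivity sigma_a (mS/m) into ice
# thickness z (m). Coefficients c0, c1, c2 are hypothetical placeholders.
import math

def thickness_from_conductivity(sigma_a, c0=60.0, c1=800.0, c2=0.8):
    """Solve sigma_a = c0 + c1*exp(-c2*z) for thickness z."""
    return -math.log((sigma_a - c0) / c1) / c2

# Round-trip check with the same invented coefficients:
z_true = 2.5
sigma = 60.0 + 800.0 * math.exp(-0.8 * z_true)
print(round(thickness_from_conductivity(sigma), 2))  # → 2.5
```

In practice separate coefficient sets would be fitted per season against drillhole data; the abstract's point is that the winter and summer fits turn out to be nearly the same.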
Abstract:
The main focus of this thesis is the relative localization problem of a heterogeneous team comprising both ground robots and micro aerial vehicles. This team configuration combines the advantages of the increased accessibility and better perspective provided by aerial robots with the greater computational and sensory resources provided by the ground agents, to realize a cooperative multi-robot system suitable for hostile autonomous missions. However, in such a scenario, the strict constraints on flight time, sensor payload, and computational capability of micro aerial vehicles limit the practical applicability of popular map-based localization schemes for GPS-denied navigation. Therefore, the resource-limited aerial platforms of this team demand simpler localization means for autonomous navigation. Relative localization is the process of estimating the formation of a robot team from acquired inter-robot relative measurements. It allows the team members to know their relative formation even without a global localization reference, such as GPS or a map. Thus a typical robot team would benefit from a relative localization service, since it would allow the team to implement formation control, collision avoidance, and supervisory control tasks independent of a global localization service. More importantly, a heterogeneous team of ground robots and computationally constrained aerial vehicles would benefit from such a service because it provides the localization information crucial for autonomous operation of the weaker agents. This enables less capable robots to assume supportive roles and contribute to the more powerful robots executing the mission. Hence this study proposes a relative localization-based approach for ground and micro aerial vehicle cooperation, and develops the inter-robot measurement, filtering, and distributed computing modules necessary to realize the system.
The research makes three significant contributions. First, the work designs and validates a novel inter-robot relative measurement hardware solution with the accuracy, range, and scalability characteristics necessary for relative localization. Second, it analyzes and designs a novel nonlinear filtering method that allows the implementation of relative localization modules and attitude reference filters on low-cost devices with optimal tuning parameters. Third, it designs and validates a novel distributed relative localization approach that harnesses the distributed computing capability of the team to minimize communication requirements, achieve consistent estimation, and enable efficient data correspondence within the network. The work validates the complete relative localization-based system through multiple indoor experiments and numerical simulations. The relative localization-based navigation concept, with the sensing, filtering, and distributed computing methods introduced in this thesis, complements the system limitations of a ground and micro aerial vehicle team and also targets hostile environmental conditions. The work thus constitutes an essential step towards realizing autonomous navigation of heterogeneous teams in real-world applications.
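As a generic sketch of the inter-robot measurement step (not the thesis's filtering method), a range/bearing measurement can be converted into a relative position in the observing robot's frame and smoothed across updates with a simple first-order filter. All measurement values and the filter gain are hypothetical.

```python
# Sketch: polar inter-robot measurement -> Cartesian relative position,
# smoothed by a first-order low-pass filter (a stand-in for a real
# nonlinear estimator such as an EKF).
import math

def relative_position(rng, bearing_rad):
    """Range/bearing -> (x forward, y left) in the observer's frame."""
    return (rng * math.cos(bearing_rad), rng * math.sin(bearing_rad))

def lowpass(prev, new, alpha=0.3):
    """Blend the previous estimate with the new measurement."""
    return tuple(p + alpha * (n - p) for p, n in zip(prev, new))

est = relative_position(4.0, 0.0)            # initial fix: 4 m dead ahead
for rng, brg in [(4.1, 0.02), (3.9, -0.01), (4.05, 0.0)]:
    est = lowpass(est, relative_position(rng, brg))
print(tuple(round(v, 2) for v in est))       # stays close to (4.0, 0.0)
```

A fixed-gain filter like this ignores measurement covariance and robot motion; the thesis's contribution is precisely a nonlinear filter that handles those properly on low-cost hardware.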