5 results for Subfractals, Subfractal Coding, Model Analysis, Digital Imaging, Pattern Recognition
in Digital Commons - Michigan Tech
Abstract:
Magmatic volatiles play a crucial role in volcanism, from magma production at depth to generation of seismic phenomena to control of eruption style. Accordingly, many models of volcano dynamics rely heavily on the behavior of such volatiles. Yet measurements of emission rates of volcanic gases have historically been limited, which has restricted model verification to processes on the order of days or longer. UV cameras are a recent advancement in the field of remote sensing of volcanic SO2 emissions. They offer enhanced temporal and spatial resolution over previous measurement techniques, but need development before they can be widely adopted and achieve the promise of integration with other geophysical datasets. Large datasets require a means to quickly and efficiently convert imagery into emission rates. We present a suite of programs designed to semi-automatically determine emission rates of SO2 from a series of UV images. Extraction of high temporal resolution SO2 emission rates via this software facilitates comparison of gas data to geophysical data for the purposes of evaluating models of volcanic activity and has already proven useful at several volcanoes. Integrated UV camera and seismic measurements recorded in January 2009 at Fuego volcano, Guatemala, provide new insight into the system's shallow conduit processes. High temporal resolution SO2 data reveal patterns of SO2 emission rate relative to explosions and seismic tremor that indicate tremor and degassing share a common source process. Progressive decreases in emission rate appear to represent inhibition of gas loss from magma as a result of rheological stiffening in the upper conduit. Measurements of emission rate from two closely spaced vents, made possible by the high spatial resolution of the camera, help constrain this model. UV camera measurements at Kilauea volcano, Hawaii, in May 2010 captured two occurrences of lava filling and draining within the summit vent. Accompanying the high lava stands were diminished SO2 emission rates, decreased seismic and infrasonic tremor, minor deflation, and slowed lava lake surface velocity. Incorporation of UV camera data into the multi-parameter dataset gives credence to the likelihood of shallow gas accumulation as the cause of such events.
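To make the emission-rate calculation concrete, here is a minimal sketch of one common approach (not the dissertation's actual software; the function names, units, and synthetic image below are assumptions): integrate calibrated SO2 column densities along a transect perpendicular to plume transport, then multiply by the plume speed.

```python
import numpy as np

def so2_emission_rate(column_density, pixel_width_m, plume_speed_m_s, transect_row):
    """Estimate an SO2 emission rate (kg/s) from one calibrated UV-camera image.

    column_density  : 2-D array of SO2 column densities (kg/m^2), one value per pixel
    pixel_width_m   : ground-projected width of one pixel (m)
    plume_speed_m_s : plume transport speed perpendicular to the transect (m/s)
    transect_row    : image row used as the cross-plume transect
    """
    # Integrate column density across the plume: kg/m^2 * m = kg/m
    integrated_column = np.sum(column_density[transect_row, :]) * pixel_width_m
    # Multiply by plume speed: kg/m * m/s = kg/s
    return integrated_column * plume_speed_m_s

# Synthetic example: a faint Gaussian plume crossing row 120 of a 240x320 image
cols = np.arange(320)
img = np.tile(1e-4 * np.exp(-((cols - 160) ** 2) / (2 * 30 ** 2)), (240, 1))
rate = so2_emission_rate(img, pixel_width_m=2.5, plume_speed_m_s=8.0, transect_row=120)
print(f"Estimated SO2 emission rate: {rate:.2f} kg/s")
```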
Abstract:
The need for a stronger and more durable building material is becoming more important as the structural engineering field expands and challenges the behavioral limits of current materials. One of the demands for stronger material is rooted in the effects that dynamic loading has on a structure. High strain rates on the order of 10¹ s⁻¹ to 10³ s⁻¹, though only a small part of the loading rates that can occur over a structure's life (anywhere between 10⁻⁸ s⁻¹ and 10⁴ s⁻¹), have very important effects when considering dynamic loading on a structure. High strain rates such as these can cause the material and structure to behave differently than at slower strain rates, which necessitates testing of materials under such loading to understand their behavior. Ultra high performance concrete (UHPC), a relatively new material in the U.S. construction industry, exhibits many enhanced strength and durability properties compared to standard normal strength concrete. However, the use of this material for high strain rate applications requires an understanding of UHPC's dynamic properties under corresponding loads. One such dynamic property is the increase in compressive strength under high strain rate load conditions, quantified as the dynamic increase factor (DIF). This factor allows a designer to relate the dynamic compressive strength back to the static compressive strength, which generally is a well-established property. Previous research has established relationships for the use of DIF in design. The generally accepted methodology for obtaining high strain rates to study the enhanced behavior of compressive material strength is the split Hopkinson pressure bar (SHPB). In this research, 83 Cor-Tuf UHPC specimens were tested in dynamic compression using an SHPB at Michigan Technological University. The specimens were separated into two categories, ambient cured and thermally treated, with aspect ratios of 0.5:1, 1:1, and 2:1 within each category. There was no statistically significant difference in mean DIF for the aspect ratios and cure regimes considered in this study. DIFs ranged from 1.85 to 2.09. Failure modes were observed to be mostly Type 2, Type 4, or combinations thereof for all specimen aspect ratios when classified according to ASTM C39 fracture pattern guidelines. The Comite Euro-International du Beton (CEB) model for DIF versus strain rate does not accurately predict the DIF for the UHPC data gathered in this study. Additionally, a measurement system analysis was conducted to observe variance within the measurement system, and a general linear model analysis was performed to examine the interaction and main effects that aspect ratio, cannon pressure, and cure method have on the maximum dynamic stress.
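For readers unfamiliar with the quantity, a minimal sketch of the DIF calculation follows; the strength values are illustrative placeholders, not measurements from this study.

```python
# Dynamic increase factor (DIF): ratio of dynamic to static compressive strength.
# Both values below are made-up placeholders for illustration only.
static_strength_mpa = 150.0    # quasi-static compressive strength of a UHPC specimen
dynamic_strength_mpa = 295.0   # peak stress measured in an SHPB test at high strain rate

dif = dynamic_strength_mpa / static_strength_mpa
print(f"DIF = {dif:.2f}")      # ~1.97, i.e. within the 1.85-2.09 range reported above
```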
Abstract:
The fields of Rhetoric and Communication usually assume a competent speaker who is able to speak well with conscious intent; however, what happens when intent and comprehension are intact but communicative facilities are impaired (e.g., by stroke or traumatic brain injury)? What might a focus on communicative success be able to tell us in those instances? This project considers this question by examining communication disorders through identifying and analyzing patterns of (dis)fluent speech between 10 aphasic and 10 non-aphasic adults. The analysis in this report is centered on a collection of data provided by the Aphasia Bank database. The database's collection protocol guides aphasic and non-aphasic participants through a series of language assessments, and for my re-analysis of the database's transcripts I consider what communicative success is and how it is demonstrated during a re-telling of the Cinderella narrative. I conducted a thorough examination of a set of participant transcripts to understand the contexts in which speech errors occur and how (dis)fluencies may follow from aphasic and non-aphasic participants' speech patterns. An inductive mixed-methods approach, informed by grounded theory and by qualitative and linguistic analyses of the transcripts, functioned as a means to balance the classification of data, providing a foundation for all sampling decisions. A close examination of the transcripts and the codes of the Aphasia Bank database suggests that while the coding is abundant and detailed, further levels of coding and analysis may be needed to reveal underlying similarities and differences in aphasic vs. non-aphasic linguistic behavior. Through four successive levels of increasingly detailed analysis, I found that patterns of repair by aphasics and non-aphasics differed primarily in degree rather than kind. This finding may have therapeutic impact, reassuring aphasics that they are on the right track to achieving communicative fluency.
Abstract:
The main objectives of this thesis are to validate an improved principal components analysis (IPCA) algorithm on images; to design and simulate a digital model for image compression, face recognition, and image detection using both the principal components analysis (PCA) algorithm and the IPCA algorithm; to design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); to establish detection and recognition thresholds for each model; to compare the performance of the PCA algorithm with that of the IPCA algorithm in compression, recognition, and detection; and to compare the performance of the digital model with that of the optical model in recognition and detection. MATLAB software was used to simulate the models. PCA is a technique used for identifying patterns in data and representing the data so as to highlight any similarities or differences. Identifying patterns in high-dimensional data (more than three dimensions) is difficult because graphical representation of the data is impossible; PCA is therefore a powerful method for analyzing such data. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve on PCA. The joint transform correlator (JTC) is an optical correlator used for synthesizing a frequency plane filter for coherent optical systems. The IPCA algorithm, in general, behaves better than the PCA algorithm in most applications. It is better than the PCA algorithm in image compression because it obtains higher compression, more accurate reconstruction, and faster processing speed with acceptable errors; in addition, it is better than the PCA algorithm in real-time image detection because it achieves the smallest error rate as well as remarkable speed. On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition because it offers an acceptable error rate, easy calculation, and a reasonable speed. Finally, in detection and recognition, the performance of the digital model is better than the performance of the optical model.
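As a rough illustration of the PCA step underlying the digital model, here is a generic sketch in Python rather than the thesis's MATLAB implementation; the array shapes, names, and random test data are assumptions.

```python
import numpy as np

def pca_compress(images, n_components):
    """Compress a stack of flattened images with PCA and reconstruct them.

    images       : (n_samples, n_pixels) array, one flattened image per row
    n_components : number of principal components to keep
    """
    mean = images.mean(axis=0)
    centered = images - mean
    # Principal directions from the SVD of the mean-centered data (rows = samples)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]              # (n_components, n_pixels)
    scores = centered @ basis.T            # compressed representation
    reconstructed = scores @ basis + mean  # approximate reconstruction
    return scores, reconstructed

# Example: 50 random 16x16 "images" compressed to 10 components
rng = np.random.default_rng(0)
imgs = rng.random((50, 256))
scores, recon = pca_compress(imgs, n_components=10)
print(scores.shape, np.mean((imgs - recon) ** 2))  # (50, 10) and the reconstruction error
```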
Abstract:
To analyze the characteristics and predict the dynamic behaviors of complex systems over time, comprehensive research is needed to enable the development of systems that can intelligently adapt to evolving conditions and infer new knowledge using algorithms that are not predesigned. This dissertation research studies the integration of techniques and methodologies from the fields of pattern recognition, intelligent agents, artificial immune systems, and distributed computing platforms to create technologies that can more accurately describe and control the dynamics of real-world complex systems. The need for such technologies is emerging in manufacturing, transportation, hazard mitigation, weather and climate prediction, homeland security, and emergency response. Motivated by the ability of mobile agents to dynamically incorporate additional computational and control algorithms into executing applications, mobile agent technology is employed in this research for adaptive sensing and monitoring in a wireless sensor network. Mobile agents are software components that can travel from one computing platform to another in a network, carrying the programs and data states needed to perform their assigned tasks. To support the generation, migration, communication, and management of mobile monitoring agents, an embeddable mobile agent system (Mobile-C) is integrated with sensor nodes. Mobile monitoring agents visit distributed sensor nodes, read real-time sensor data, and perform anomaly detection using the equipped pattern recognition algorithms. The optimal control of agents is achieved by mimicking the adaptive immune response and applying multi-objective optimization algorithms. The mobile agent approach offers the potential to reduce the communication load and energy consumption in monitoring networks. The major research work of this dissertation project includes: (1) studying effective feature extraction methods for time series measurement data; (2) investigating the impact of the feature extraction methods and dissimilarity measures on the performance of pattern recognition; (3) researching the effects of environmental factors on the performance of pattern recognition; (4) integrating an embeddable mobile agent system with wireless sensor nodes; (5) optimizing agent generation and distribution using artificial immune system concepts and multi-objective algorithms; (6) applying mobile agent technology and pattern recognition algorithms to adaptive structural health monitoring and driving cycle pattern recognition; and (7) developing a web-based monitoring network to enable remote visualization and analysis of real-time sensor data. Techniques and algorithms developed in this dissertation project will contribute to research advances in networked distributed systems operating under changing environments.
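As a rough sketch of the anomaly-detection step a monitoring agent might perform on a sensor node (hypothetical features, dissimilarity measure, and threshold; not the Mobile-C code developed in this dissertation), one simple scheme extracts a small feature vector from each window of readings and flags windows whose features are far from a baseline learned under normal conditions.

```python
import numpy as np

def extract_features(window):
    """Simple time-series features for one window of sensor readings."""
    return np.array([window.mean(), window.std(), window.max() - window.min()])

def is_anomalous(window, baseline_features, threshold=3.0):
    """Flag a window whose features are far (Euclidean dissimilarity) from the baseline."""
    distance = np.linalg.norm(extract_features(window) - baseline_features)
    return distance > threshold

# Example: learn a baseline from simulated "healthy" vibration data, then test a new window
rng = np.random.default_rng(1)
baseline = extract_features(rng.normal(0.0, 1.0, 500))
new_window = rng.normal(0.0, 4.0, 500)       # larger variance, e.g. simulated damage
print(is_anomalous(new_window, baseline))    # likely True
```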