8 results for Interval analysis (Mathematics)
in Digital Commons - Michigan Tech
Measuring energy spectra of TeV gamma-ray emission from the Cygnus region of our galaxy with Milagro
Abstract:
High energy gamma rays can provide fundamental clues to the origins of cosmic rays. In this thesis, TeV gamma-ray emission from the Cygnus region is studied. Previously, the Milagro experiment detected five TeV gamma-ray sources in this region along with a significant excess of TeV gamma rays whose origin is still unclear. To better understand the diffuse excess, the separation of sources and diffuse emission is studied using the latest and most sensitive data set of the Milagro experiment. In addition, a newly developed technique is applied that allows the energy spectrum of the TeV gamma rays to be reconstructed from Milagro data. No conclusive statement can be made about the spectrum of the diffuse emission from the Cygnus region because of its low significance of 2.2 σ above the background in the studied data sample. The emission from the entire Cygnus region is best fit by a power law with a spectral index of α=2.40 (68% confidence interval: 1.35-2.92) and an exponential cutoff energy of 31.6 TeV (10.0-251.2 TeV). Under the assumption of a simple power law without a cutoff energy, the best fit yields a spectral index of α=2.97 (68% confidence interval: 2.83-3.10). Neither of these best fits is in good agreement with the data. The best spectral fit to the TeV emission from MGRO J2019+37, the brightest source in the Cygnus region, yields a spectral index of α=2.30 (68% confidence interval: 1.40-2.70) with a cutoff energy of 50.1 TeV (68% confidence interval: 17.8-251.2 TeV), and a spectral index of α=2.75 (68% confidence interval: 2.65-2.85) when no exponential cutoff is assumed. According to the present analysis, MGRO J2019+37 contributes 25% of the differential flux from the entire Cygnus region at 15 TeV.
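As a rough illustration of this kind of spectral fit, the sketch below fits a power law with an exponential cutoff, dN/dE = N0 (E/E0)^-α exp(-E/Ec), to a made-up binned spectrum. It is not the Milagro analysis: the energies, fluxes, errors, and the pivot energy E0 = 15 TeV are hypothetical placeholders.

```python
# Minimal sketch of a power-law-with-cutoff spectral fit, assuming a model
# of the form dN/dE = N0 * (E/E0)^-alpha * exp(-E/Ec). This is NOT the
# Milagro analysis: the energies, fluxes, errors, and pivot E0 = 15 TeV
# below are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def power_law_cutoff(E, N0, alpha, Ec, E0=15.0):
    """Differential flux dN/dE for energy E (TeV), pivot E0 (TeV)."""
    return N0 * (E / E0) ** (-alpha) * np.exp(-E / Ec)

rng = np.random.default_rng(42)
E = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])            # bin centers, TeV
flux = power_law_cutoff(E, 1e-11, 2.4, 31.6) * rng.normal(1.0, 0.1, E.size)
flux_err = 0.1 * flux                                        # 10% errors

popt, pcov = curve_fit(power_law_cutoff, E, flux, sigma=flux_err,
                       p0=[1e-11, 2.0, 30.0], absolute_sigma=True)
N0, alpha, Ec = popt
print(f"alpha = {alpha:.2f}, cutoff energy = {Ec:.1f} TeV")
```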
Abstract:
The Collingwood Member is a mid- to late-Ordovician self-sourced reservoir deposited across the northern Michigan Basin and parts of Ontario, Canada. Although it has been studied previously in Canada, relatively little data have been available from the Michigan subsurface. Recent commercial interest in the Collingwood has resulted in the drilling and production of several wells in the state of Michigan. An analysis of core samples, measured laboratory data, and petrophysical logs has yielded both a quantitative and a qualitative understanding of the formation in the Michigan Basin. The Collingwood is a low-permeability, low-porosity carbonate package that is very high in organic content. It is composed primarily of a uniformly fine-grained carbonate matrix with lesser amounts of kerogen, silica, and clays. The kerogen content of the Collingwood is finely dispersed in the clay and carbonate mineral phases. Geochemical and production data show that both oil and gas phases are present, consistent with regional thermal maturity. The deposit is richest in the north-central part of the basin, where it is thickest and highest in organic content. Because the Collingwood is a fairly thin deposit, vertical fractures may easily extend into the surrounding formations. Completion and treatment techniques should be designed around these parameters to enhance production.
Abstract:
The Pacaya volcanic complex is part of the Central American volcanic arc, which is associated with the subduction of the Cocos tectonic plate beneath the Caribbean plate. Located 30 km south of Guatemala City, Pacaya sits on the southern rim of the Amatitlan Caldera. It is the largest post-caldera volcano and has been one of Central America's most active volcanoes over the last 500 years. Between 400 and 2000 years B.P., Pacaya experienced a major collapse, which resulted in the formation of a horseshoe-shaped scarp that is still visible. In recent years, several smaller collapses (in 1961 and 2010) have been associated with the activity of the volcano, affecting its northwestern flanks; these were likely induced by local and regional stress changes. The similar orientation of dry and volcanic fissures and the distribution of new vents suggest reactivation of the pre-existing stress configuration responsible for the old collapse. This paper presents the first stability analysis of the Pacaya volcanic flank. The inputs for the geological and geotechnical models were defined from stratigraphic, lithological, and structural data and from material properties obtained through field surveys and laboratory tests. According to their mechanical characteristics, three lithotechnical units were defined: Lava, Lava-Breccia, and Breccia-Lava. The Hoek-Brown failure criterion was applied to each lithotechnical unit, and the rock mass friction angle, apparent cohesion, and strength and deformation characteristics were computed over a specified stress range. The stability of the volcano was then evaluated through two-dimensional analyses performed with the Limit Equilibrium Method (LEM, ROCSCIENCE) and the Finite Element Method (FEM, PHASE 2 7.0). The stability analysis focused mainly on the modern Pacaya volcano built inside the collapse amphitheatre of "Old Pacaya". Volcanic instability was assessed in terms of the variability of the safety factor using deterministic, sensitivity, and probabilistic analyses, considering gravitational instability and the effects of external forces such as magma pressure and seismicity as potential triggering mechanisms of lateral collapse. The preliminary results provide two insights: first, the least stable sector is the south-western flank of the volcano; second, even the lowest safety factor indicates that the edifice is stable under gravity alone, so an external triggering mechanism represents the likely destabilizing factor.
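For context on the strength model, the sketch below evaluates the generalized Hoek-Brown criterion, σ1 = σ3 + σci (mb σ3/σci + s)^a, using the standard relations for mb, s, and a from the Geological Strength Index (GSI) and disturbance factor D (Hoek et al., 2002). The input values (σci, GSI, mi) are hypothetical placeholders, not the Pacaya laboratory results.

```python
# Sketch of the generalized Hoek-Brown failure criterion (Hoek et al., 2002):
# sigma1 = sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s)^a, with mb, s, a
# derived from the Geological Strength Index (GSI), intact-rock constant mi,
# and disturbance factor D. The inputs below (sigma_ci, GSI, mi) are
# hypothetical placeholders, not the Pacaya laboratory results.
import numpy as np

def hoek_brown_sigma1(sigma3, sigma_ci, GSI, mi, D=0.0):
    """Major principal stress at failure (MPa) for confinement sigma3 (MPa)."""
    mb = mi * np.exp((GSI - 100.0) / (28.0 - 14.0 * D))
    s = np.exp((GSI - 100.0) / (9.0 - 3.0 * D))
    a = 0.5 + (np.exp(-GSI / 15.0) - np.exp(-20.0 / 3.0)) / 6.0
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# Strength envelope over a specified confinement range for a lava-like unit.
for s3 in np.linspace(0.0, 5.0, 6):
    s1 = hoek_brown_sigma1(s3, sigma_ci=50.0, GSI=55, mi=25)
    print(f"sigma3 = {s3:4.1f} MPa -> sigma1 = {s1:6.1f} MPa")
```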
Abstract:
After teaching regular education secondary mathematics for seven years, I accepted a position in an alternative education high school. Over the next four years, the State of Michigan adopted new graduation requirements, phasing in a mandate for all students to complete Geometry and Algebra 2 courses. Since many of my students were already struggling in Algebra 1, getting them through Geometry and Algebra 2 seemed like a daunting task. To better instruct my students, I wanted to know how other teachers in similar situations were addressing the new High School Content Expectations (HSCEs) in upper-level mathematics. This study examines how thoroughly alternative education teachers in Michigan are addressing the HSCEs in their courses, which approaches they have found most effective, and which issues are preventing teachers and schools from successfully implementing the HSCEs. Twenty-six alternative high school educators completed an online survey that included a variety of questions regarding school characteristics, curriculum alignment, and implementation approaches and issues. Follow-up phone interviews were conducted with four of these participants. The survey responses were used to categorize schools as successful, unsuccessful, or neutral in terms of meeting the HSCEs. Responses from schools in each category were compared to identify common approaches and issues and to identify significant differences between school groups. Data analysis showed that successful schools taught more of the HSCEs through a variety of instructional approaches, with an emphasis on varying the ways students learned the material. Individualized instruction was frequently mentioned by successful schools and was strikingly absent from unsuccessful schools' responses. The main obstacle to successful implementation of the HSCEs identified in the study was gaps in student knowledge, which in turn made the pace of instruction a significant issue. School representatives were largely united in the belief that the Algebra 2 graduation requirement was not appropriate for all alternative education students. Possible implications of these findings are discussed.
Abstract:
The number of record-breaking events expected to occur in a strictly stationary time series depends only on the length of the series, regardless of the underlying distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or from present to past. These symmetries are, however, broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time series? We find a significant decrease in variability over the past century for the Global Historical Climatology Network (GHCN), corresponding to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by the time period considered? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations) and find that the ratio grows monotonically in the GHCN data set but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
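To make the distribution-free property concrete: in an i.i.d. (hence strictly stationary) series of length n, the k-th value is a record high with probability 1/k, so the expected number of record highs is the harmonic number H_n = 1 + 1/2 + ... + 1/n, whatever the distribution. The sketch below is a minimal Monte Carlo check of this fact, not the thesis code.

```python
# Monte Carlo check (not the thesis code) of the distribution-free record
# count: for an i.i.d. series of length n, E[# record highs] = H_n,
# the n-th harmonic number, whatever the distribution.
import numpy as np

def count_record_highs(x):
    """Count running record-breaking highs; the first value is a record."""
    records, best = 1, x[0]
    for v in x[1:]:
        if v > best:
            records, best = records + 1, v
    return records

rng = np.random.default_rng(0)
n, trials = 100, 20000
sim = np.mean([count_record_highs(rng.exponential(size=n)) for _ in range(trials)])
H_n = sum(1.0 / k for k in range(1, n + 1))
print(f"simulated mean records: {sim:.2f}, harmonic number H_{n}: {H_n:.2f}")
```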
Abstract:
Nitrogen and water are essential for plant growth and development. In this study, we designed experiments to produce gene expression data from poplar roots under nitrogen starvation and water deprivation conditions. We found that a low concentration of nitrogen led first to increased root elongation, followed by lateral root proliferation and eventually increased root biomass. To identify genes regulating root growth and development under nitrogen starvation and water deprivation, we designed a series of data analysis procedures through which we successfully identified biologically important genes. Differentially expressed gene (DEG) analysis identified the genes that are differentially expressed under nitrogen starvation or drought. Protein domain enrichment analysis identified enriched themes (within the same domains) that are highly interactive during the treatment. Gene Ontology (GO) enrichment analysis allowed us to identify biological processes that change during nitrogen starvation. Based on the above analyses, we examined the local Gene Regulatory Network (GRN) and identified a number of transcription factors; after testing, one of them proved to be a hierarchically high-ranked transcription factor that affects root growth under nitrogen starvation. Because analyzing gene expression data manually is tedious and time-consuming, we automated the workflow as a computational pipeline that can now identify DEGs and perform protein domain analysis in a single run. It is implemented in Perl and R scripts.
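As a generic illustration of the DEG step only (the thesis pipeline itself is written in Perl and R), the Python sketch below flags a gene as differentially expressed when it passes both a log2 fold-change threshold and a per-gene t-test; the matrix sizes, thresholds, and simulated expression values are all hypothetical, and the thesis may use a different statistical test.

```python
# Generic DEG-calling sketch in Python (the thesis pipeline itself is in
# Perl and R): a gene is flagged when it passes both a |log2 fold-change|
# threshold and a per-gene t-test. Matrix sizes, thresholds, and the
# simulated expression values are all hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# rows = genes, columns = replicates; values assumed already log2-scaled
control = rng.normal(8.0, 0.5, size=(1000, 3))
treated = rng.normal(8.0, 0.5, size=(1000, 3))
treated[:50] += 2.0                      # spike in some truly responsive genes

log2fc = treated.mean(axis=1) - control.mean(axis=1)
t_stat, p_val = stats.ttest_ind(treated, control, axis=1)
deg_mask = (np.abs(log2fc) >= 1.0) & (p_val < 0.05)
print(f"{deg_mask.sum()} genes called differentially expressed")
```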
Abstract:
This thesis develops high-performance real-time signal processing modules for direction of arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for performing subspace decomposition and polynomial rooting, steps that are traditionally implemented using sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, the complexity of the signal processing algorithms increases, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and act as the bottlenecks to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single instruction multiple data (SIMD) hardware, or application-specific integrated circuits (ASICs), which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable, and easy to implement. First, the thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, bringing performance closer to real time. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable frequency of operation, and the system was developed with the objective of achieving high throughput. Various modern cores available in FPGAs were used to maximize performance, and these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in a direction of arrival estimation system by providing simple, high-throughput, parallel algorithms.
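The sketch below shows the generic idea behind such a scheme: running Newton's method from many starting points at once on a complex polynomial, with vectorized updates standing in for hardware parallelism. It is not the thesis's root-MUSIC-specific algorithm; starting points just inside the unit circle mimic the root-MUSIC setting, where signal roots lie near |z| = 1, and the example polynomial z^4 - 1 is a placeholder.

```python
# Generic sketch of data-parallel Newton iterations on a complex polynomial,
# standing in for hardware parallelism; this is NOT the thesis's
# root-MUSIC-specific scheme. Starts just inside the unit circle mimic the
# root-MUSIC setting, where signal roots lie near |z| = 1; the example
# polynomial z^4 - 1 is a placeholder.
import numpy as np

def parallel_newton(coeffs, n_starts=64, iters=50):
    """Newton's method from many starts at once; coeffs highest degree first."""
    p = np.polynomial.Polynomial(coeffs[::-1])   # expects lowest degree first
    dp = p.deriv()
    z = 0.9 * np.exp(2j * np.pi * np.arange(n_starts) / n_starts)
    for _ in range(iters):
        z = z - p(z) / dp(z)                     # vectorized update of all starts
    return z

z = parallel_newton(np.array([1, 0, 0, 0, -1], dtype=complex))
print(np.unique(np.round(z, 6)))                 # expect 1, -1, 1j, -1j
```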
Abstract:
Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydrodynamic systems that dominate various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while helping the user observe and understand the flow field clearly. My research focuses mainly on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines that capture flow patterns and how to pick good viewpoints from which to observe flow fields become critical questions. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using the dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another of my works [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually. When 3D streamlines are projected onto 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we designed FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters and spatiotemporal regions, and of their interconnections, in the transformed space. Most viewpoint selection methods consider only external viewpoints outside the flow field, which do not convey a clear picture when the flow field is cluttered near the boundary. We therefore propose a new way to explore flow fields: selecting several internal viewpoints around the flow features inside the field and generating a B-spline curve path traversing these viewpoints, providing users with close-up views of the flow field for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to handle unsteady flow fields. Beyond flow field visualization, some other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system together with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. We therefore developed a set of visualization tools that give users an intuitive way to learn and understand these algorithms.
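As a small illustration of the internal-viewpoint path idea, and not the implementation from [54], the sketch below fits a cubic B-spline through a handful of made-up internal viewpoints and samples camera positions along it for a fly-through.

```python
# Small illustration, not the implementation from [54]: fit a cubic B-spline
# camera path through a few made-up internal viewpoints and sample positions
# along it for a fly-through.
import numpy as np
from scipy.interpolate import splprep, splev

viewpoints = np.array([[0.1, 0.2, 0.3],    # hypothetical internal viewpoints
                       [0.4, 0.5, 0.2],    # (x, y, z) near flow features
                       [0.6, 0.3, 0.6],
                       [0.8, 0.7, 0.5]])

tck, _ = splprep(viewpoints.T, s=0, k=3)   # s=0 forces interpolation
u = np.linspace(0.0, 1.0, 100)             # path parameter
path = np.stack(splev(u, tck), axis=1)     # 100 camera positions on the path
print(path[:3])
```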