7 results for Non-parametric regression methods
in Digital Commons - Michigan Tech
Abstract:
Remote sensing methods for assessing landscape-scale ecological change are rapidly becoming a dominant force in the natural sciences. Powerful, robust non-parametric statistical methods are also being actively developed to complement the unique characteristics of remotely sensed data. This research uses these remote sensing and statistical approaches to shed light on woody plant encroachment into native grasslands, a troubling ecological phenomenon occurring throughout the world. Specifically, this research investigates western juniper encroachment within the sage-steppe ecosystem of the western USA. Western juniper trees are native to the intermountain west and are ecologically important, providing structural diversity and habitat for many species. However, after nearly 150 years of post-European-settlement changes to this threatened ecosystem, natural ecological processes such as fire regimes no longer limit the range of western juniper to rocky refugia and other areas protected from the short fire-return intervals historically common to the region. Consequently, sage-steppe communities with high juniper densities exhibit negative impacts such as reduced structural diversity, degraded wildlife habitat, and ultimately the loss of biodiversity. Much of today's sage-steppe ecosystem is transitioning to juniper woodland, and the majority of western juniper woodlands have not yet reached their full potential in either range or density. The first section of this research investigates the biophysical drivers responsible for the juniper expansion patterns observed in the sage-steppe ecosystem. The second section is a comprehensive accuracy assessment of classification methods used to identify juniper tree cover from multispectral 1 m spatial resolution aerial imagery.
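The abstract does not specify which accuracy metrics were used; a minimal sketch of a standard thematic-map accuracy assessment (overall accuracy and Cohen's kappa from an error matrix), assuming per-pixel reference labels and hypothetical counts, might look like this in Python:

```python
import numpy as np

def accuracy_assessment(confusion):
    """Overall accuracy and Cohen's kappa from an error matrix.

    confusion[i, j] = number of pixels whose reference class is i
    and whose classified (map) class is j.
    """
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    observed = np.trace(confusion) / n              # overall accuracy
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

# Hypothetical 2-class example: juniper vs. non-juniper cover
errors = np.array([[850, 150],    # reference: juniper
                   [100, 900]])   # reference: non-juniper
oa, kappa = accuracy_assessment(errors)
print(f"overall accuracy = {oa:.3f}, kappa = {kappa:.3f}")
```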
Abstract:
Civil infrastructure provides essential services for the development of both society and the economy. Managing these systems efficiently is critical to ensuring sound performance. However, extracting information from available data is challenging, which necessitates methodologies and frameworks to assist stakeholders in the decision-making process. This research proposes methodologies to evaluate system performance by maximizing the use of available information, in an effort to build and maintain sustainable systems. Under the guidance of the holistic problem formulation proposed by Mukherjee and Muga, this research specifically investigates problem-solving methods that measure and analyze metrics to support decision making. Failures are inevitable in system management. A methodology is developed to describe the arrival pattern of failures in order to assist engineers in failure rescues and budget prioritization, especially when funding is limited. It reveals that blockage arrivals are not totally random; smaller, meaningful subsets show good random behavior. Additionally, the failure rate over time is analyzed by applying existing reliability models and non-parametric approaches, and a scheme is proposed to depict failure rates over the lifetime of a given facility system. Further analysis of sub-data sets is also performed, with a discussion of context reduction. Infrastructure condition is another important indicator of system performance. The challenges in predicting facility condition are the transition probability estimates and model sensitivity analysis. Methods are proposed to estimate transition probabilities by investigating the long-term behavior of the model and the relationship between transition rates and probabilities. To integrate heterogeneities, a model sensitivity analysis is performed for the application of a non-homogeneous Markov chain model. Scenarios are investigated by assuming that transition probabilities follow a Weibull-regressed function or fall within an interval estimate. For each scenario, multiple cases are simulated using Monte Carlo simulation. Results show that variations in the outputs are sensitive to the probability regression, whereas for the interval estimate the outputs show variations similar to the inputs. Life cycle cost analysis and life cycle assessment of a sewer system are performed comparing three pipe types: reinforced concrete pipe (RCP), non-reinforced concrete pipe (NRCP), and vitrified clay pipe (VCP). Life cycle cost analysis covers the material extraction, construction, and rehabilitation phases; in the rehabilitation phase, the Markov chain model is applied to support the rehabilitation strategy. In the life cycle assessment, the Economic Input-Output Life Cycle Assessment (EIO-LCA) tool is used to estimate environmental emissions for all three phases. Emissions are then compared quantitatively among the alternatives to support decision making.
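The abstract does not give the model's exact form; a minimal sketch of a Monte Carlo simulation of a non-homogeneous Markov deterioration chain, assuming a hypothetical five-state condition scale and a per-year transition probability derived from an assumed Weibull hazard, might look like this:

```python
import numpy as np

rng = np.random.default_rng(42)

def weibull_transition_prob(t, shape=2.0, scale=40.0):
    """Hypothetical probability of dropping one condition state in year t,
    derived from a Weibull hazard form (assumed, not from the thesis)."""
    h = (shape / scale) * (t / scale) ** (shape - 1.0)
    return 1.0 - np.exp(-h)

def simulate_condition(horizon=50, n_states=5):
    """One Monte Carlo realization of a non-homogeneous Markov chain.
    State 0 = best condition, n_states - 1 = failed (absorbing)."""
    state = 0
    path = [state]
    for t in range(1, horizon + 1):
        p = weibull_transition_prob(t)      # time-dependent, hence non-homogeneous
        if state < n_states - 1 and rng.random() < p:
            state += 1                      # deteriorate one state
        path.append(state)
    return path

# Monte Carlo: distribution of the condition state at year 30
runs = np.array([simulate_condition()[30] for _ in range(10_000)])
for s in range(5):
    print(f"P(state {s} at year 30) = {(runs == s).mean():.3f}")
```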
Abstract:
Engine manufacturers need computationally efficient and accurate predictive combustion modeling tools that can be integrated into engine simulation software for the assessment of combustion system hardware designs and the early development of engine calibrations. This thesis discusses the development and validation, from experimental data, of a combustion modeling tool for a gasoline direct-injected spark-ignited engine with variable valve timing, lift, and duration valvetrain hardware. Data were correlated and regressed using accepted methods for calculating the turbulent flow and flame propagation characteristics of an internal combustion engine. A non-linear regression modeling method was used to develop a combustion model that determines the fuel mass burn rate at multiple points during the combustion process. The computational fluid dynamics software Converge© was used to simulate and correlate the 3-D combustion system, port, and piston geometry to the turbulent flow development within the cylinder, in order to properly predict the experimental turbulent flow parameters through the intake, compression, and expansion processes. The engine simulation software GT-Power© was then used to determine the 1-D flow characteristics of the engine hardware under test and to correlate the regressed combustion modeling tool against experimental data to determine its accuracy. The results show that the combustion modeling tool accurately captures the trends in combustion sensitivity to turbulent flow, thermodynamic, and internal residual effects with changes in intake and exhaust valve timing, lift, and duration.
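The abstract does not name the burn-rate correlation form; a common choice is the Wiebe function for mass fraction burned, and a minimal sketch of fitting its parameters by non-linear least squares (using SciPy, with synthetic data standing in for the experimental burn profile) might look like this:

```python
import numpy as np
from scipy.optimize import curve_fit

def wiebe(theta, theta0, dtheta, a, m):
    """Wiebe mass-fraction-burned curve vs. crank angle theta (deg).
    theta0: start of combustion, dtheta: burn duration, a, m: shape params."""
    x = np.clip((theta - theta0) / dtheta, 0.0, None)
    return 1.0 - np.exp(-a * x ** (m + 1.0))

# Synthetic stand-in for a measured mass-fraction-burned profile
theta = np.linspace(-20.0, 60.0, 81)                  # crank angle, deg ATDC
true = wiebe(theta, -5.0, 45.0, 5.0, 2.0)
measured = true + np.random.default_rng(0).normal(0.0, 0.01, theta.size)

# Non-linear regression for the burn-rate parameters
p0 = [-2.0, 40.0, 5.0, 2.0]                           # initial guess
popt, _ = curve_fit(wiebe, theta, measured, p0=p0)
print("fitted theta0, dtheta, a, m:", np.round(popt, 2))

# Burn rate, d(MFB)/d(theta), from the fitted curve
mfb = wiebe(theta, *popt)
burn_rate = np.gradient(mfb, theta)
```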
Abstract:
Data on the strength of Earth's magnetic field (paleointensity) in the geological past are crucial for understanding the geodynamo. Conventional paleointensity determination methods require heating a sample to a high temperature in one or more steps; consequently, many rocks are unsuitable for these methods due to heating-induced experimental alteration. Alternative non-heating paleointensity methods are investigated to assess their effectiveness and reliability, using both natural samples from Lemptégy Volcano, France, and synthetic samples. Paleointensity was measured from the natural and synthetic samples using the Pseudo-Thellier, ARM, REM, REMc, REM’, and Preisach methods. For the natural samples, only the Pseudo-Thellier method produced a reasonable paleointensity estimate consistent with previous paleointensity data. The synthetic samples yielded more successful estimates across all the methods, with the Pseudo-Thellier and ARM methods producing the most accurate results. The Pseudo-Thellier method appears to be the best alternative to the heating-based paleointensity methods.
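As a rough illustration (not the thesis's exact procedure), the Pseudo-Thellier method reduces to a linear fit of NRM lost against ARM gained over matching alternating-field steps, with the slope scaled by an empirical calibration factor; a minimal sketch with hypothetical data:

```python
import numpy as np

# Hypothetical stepwise AF data (same AF levels for both treatments)
af_steps = np.array([10, 15, 20, 25, 30, 40, 50])                     # mT
nrm_remaining = np.array([0.92, 0.81, 0.70, 0.58, 0.47, 0.31, 0.20])  # normalized
arm_gained    = np.array([0.15, 0.27, 0.40, 0.53, 0.65, 0.82, 0.93])  # normalized

# NRM lost vs. ARM gained: slope via least squares
nrm_lost = nrm_remaining[0] - nrm_remaining
slope, intercept = np.polyfit(arm_gained, nrm_lost, 1)

# An empirical, laboratory-specific calibration factor (assumed here)
# converts the relative slope to an absolute field estimate.
calibration_uT = 12.0    # hypothetical value
print(f"slope = {slope:.3f}, paleointensity = {slope * calibration_uT:.1f} uT")
```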
Abstract:
Synthetic oligonucleotides and peptides have found wide application in industry and academic research labs. There are ~60 peptide drugs on the market and over 500 under development; global annual sales of peptide drugs in 2010 were estimated at $13 billion. There are three oligonucleotide-based drugs on the market; among them, the newly FDA-approved Kynamro was predicted to reach $100 million in annual sales. Annual sales of oligonucleotides to academic labs were estimated at $700 million. Both bio-oligomers are mostly synthesized on automated synthesizers using solid phase synthesis technology, in which nucleoside or amino acid monomers are added sequentially until the desired full-length sequence is reached. The additions cannot be complete, which generates truncated, undesired failure sequences. For almost all applications, these impurities must be removed. The most widely used purification method is HPLC. However, the method is slow, expensive, labor-intensive, not amenable to automation, difficult to scale up, and unsuitable for high-throughput purification; it requires large capital investment and consumes large volumes of harmful solvents. Purification costs are estimated at more than 50% of total production costs. Other methods for bio-oligomer purification also have drawbacks and are less favored than HPLC for most applications. To overcome the problems of known biopolymer purification technologies, we have developed two non-chromatographic purification methods: (1) catching failure sequences by polymerization, and (2) catching full-length sequences by polymerization. In the first method, a polymerizable group is attached to the failure sequences of the bio-oligomers during automated synthesis; purification is achieved by simply polymerizing the failure sequences into an insoluble gel and extracting the full-length sequences. In the second method, a polymerizable group is attached to the full-length sequences, which are then incorporated into a polymer; impurities are removed by washing, and the pure product is cleaved from the polymer. These methods need no chromatography, so the drawbacks of HPLC no longer apply. Purification is achieved by simple manipulations such as shaking and extraction, making these methods suitable for large-scale purification of oligonucleotide and peptide drugs and ideal for high-throughput purification, which is currently in high demand for research projects involving total gene synthesis. The dissertation presents the details of the development of these techniques. Chapter 1 introduces oligodeoxynucleotides (ODNs) and their synthesis and purification. Chapter 2 describes detailed studies of using the catching failure sequences by polymerization method to purify ODNs. Chapter 3 describes the further optimization of this ODN purification technology to the level of practical use. Chapter 4 presents the use of the catching full-length sequences by polymerization method for ODN purification with an acid-cleavable linker. Chapter 5 introduces peptides and their synthesis and purification. Chapter 6 describes studies using the catching full-length sequences by polymerization method for peptide purification.
Abstract:
Transformer protection is one of the most challenging applications within the power system protective relay field. Transformers with a capacity rating exceeding 10 MVA are usually protected using differential current relays. Transformers are an aging and vulnerable bottleneck in the present power grid; therefore, quick fault detection and corresponding transformer de-energization are the key to minimizing transformer damage. Present differential current relays are based on digital signal processing (DSP), combining DSP phasor estimation with protective-logic-based decision making. The limitations of existing DSP-based differential current relays must be identified to determine the best protection options for sensitive and quick fault detection. The development, implementation, and evaluation of a DSP differential current relay are detailed. The overall goal is to make fault detection faster without compromising secure and safe transformer operation. A detailed background on the DSP differential current relay is provided. Different DSP phasor estimation filters are then implemented and evaluated on their ability to extract the desired frequency components from the measured current signal quickly and accurately; the main focus of this evaluation is the difference between non-recursive and recursive filtering methods. The protective logic of the DSP differential current relay is then implemented, and the required settings are made in accordance with the transformer application. Finally, the DSP differential current relay is evaluated using available transformer models within the ATP simulation environment. Recursive filtering methods were found to have a significant advantage over non-recursive filtering methods, both when evaluated individually and when applied in the DSP differential relay: they can be up to 50% faster than non-recursive methods, but can cause false trips due to overshoot if speed is the only objective. Relay sensitivity, however, is independent of the filtering method and depends on the settings of the relay's differential characteristic (pickup threshold and percent slope).
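The abstract does not list the specific filters compared; one standard pair is the non-recursive full-cycle DFT and its recursive (sliding-window) update. A minimal sketch of both, assuming N samples per fundamental cycle and a hypothetical 60 Hz current:

```python
import numpy as np

N = 16                                   # samples per fundamental cycle (assumed)
n = np.arange(N)
w = np.exp(-2j * np.pi * n / N)          # fundamental-frequency DFT weights

def phasor_nonrecursive(window):
    """Full-cycle DFT phasor from the latest N samples (non-recursive):
    the whole windowed sum is recomputed at every step."""
    return (2.0 / N) * np.sum(window * w)

def phasor_recursive(prev_phasor, x_new, x_old, k):
    """Sliding-window DFT update: fold in the newest sample and drop the
    oldest instead of recomputing the whole sum each step."""
    return prev_phasor + (2.0 / N) * (x_new - x_old) * np.exp(-2j * np.pi * k / N)

# Demo on a 60 Hz current with a -30 degree phase shift
t = np.arange(4 * N) / (60.0 * N)
x = 100.0 * np.cos(2 * np.pi * 60.0 * t - np.pi / 6)

ph = phasor_nonrecursive(x[:N])          # seed with one non-recursive estimate
for k in range(N, 2 * N):                # then update recursively per sample
    ph = phasor_recursive(ph, x[k], x[k - N], k)
print(f"|I| = {abs(ph):.1f} A, angle = {np.degrees(np.angle(ph)):.1f} deg")
```

For a steady sinusoid both paths give the same phasor; the recursive form simply reaches each new estimate with one multiply-add per sample, which is the speed advantage the evaluation weighs against its overshoot during transients.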
Abstract:
The Zagros oak forests of western Iran are critically important to the sustainability of the region, and they have undergone dramatic declines in recent decades. We evaluated the utility of the non-parametric Random Forest classification algorithm for land cover classification of Zagros landscapes and selected the best spatial and spectral predictive variables. The algorithm achieved high overall classification accuracies (>85%) and equivalent accuracies for the datasets from three different sensors. We evaluated the associations between trends in forest area and structure and trends in socioeconomic and climatic conditions to identify the most likely driving forces behind deforestation and landscape structure change, using available socioeconomic (urban and rural population, and rural income) and climatic (mean annual rainfall and mean annual temperature) data for two provinces in northern Zagros. The driving force most correlated with forest area loss was urban population, with climatic variables correlated to a lesser extent; landscape structure changes were more closely associated with rural population. We examined the effects of scale changes on the results of spatial pattern analysis, assessing the impacts of eight years of protection in a protected area in northern Zagros at two different scales (both grain and extent); the effects of protection on the amount and structure of forests were scale dependent. We evaluated the nature and magnitude of changes in forest area and structure over the entire Zagros region from 1972 to 2009. We divided the Zagros region into 167 landscape units and developed two measures, Deforestation Sensitivity (DS) and Connectivity Sensitivity (CS), for each landscape unit, defined as the percent of time steps in which forest area or equivalent connected area (ECA) decreased by more than 10%. A considerable loss in forest area and connectivity was detected, but no sudden (nonlinear) changes were detected at the spatial and temporal scale of the study. Connectivity loss occurred more rapidly than forest loss due to the loss of connecting patches, and more connectivity was lost in southern Zagros due to climatic differences and different forms of traditional land use.
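The abstract does not show the classification workflow; a minimal sketch of a Random Forest land cover classification with scikit-learn, assuming a table of spectral/spatial predictor values and reference class labels per pixel (synthetic data stand in here), might look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for per-pixel predictors (e.g. band reflectances,
# NDVI, texture) and reference land cover labels (0 = forest, 1 = other).
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 6))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0.0, 0.5, 2000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=1)
rf.fit(X_train, y_train)

print(f"OOB accuracy:  {rf.oob_score_:.3f}")
print(f"Test accuracy: {accuracy_score(y_test, rf.predict(X_test)):.3f}")
# Variable importances support selecting the best spectral/spatial predictors
print("importances:", np.round(rf.feature_importances_, 3))
```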