11 results for Open Information Extraction

in Digital Commons - Michigan Tech


Relevance: 80.00%

Abstract:

Civil infrastructure provides essential services for the development of both society and the economy, and managing these systems efficiently is essential to ensure sound performance. However, extracting information from available data remains challenging, which necessitates methodologies and frameworks to assist stakeholders in the decision-making process. This research proposes methodologies to evaluate system performance by maximizing the use of available information, in an effort to build and maintain sustainable systems. Guided by the holistic problem formulation proposed by Mukherjee and Muga, this research specifically investigates problem-solving methods that measure and analyze metrics to support decision making. Failures are inevitable in system management. A methodology is developed to describe the arrival pattern of failures in order to assist engineers in failure response and budget prioritization, especially when funding is limited. It reveals that blockage arrivals are not totally random, while smaller, meaningful subsets show good random behavior. In addition, the failure rate over time is analyzed by applying existing reliability models and non-parametric approaches, and a scheme is proposed to depict failure rates over the lifetime of a given facility system. Further analysis of sub-data sets is also performed, with a discussion of context reduction. Infrastructure condition is another important indicator of system performance. The challenges in predicting facility condition lie in estimating transition probabilities and in model sensitivity analysis. Methods are proposed to estimate transition probabilities by investigating the long-term behavior of the model and the relationship between transition rates and probabilities. To integrate heterogeneities, a sensitivity analysis is performed for the application of a non-homogeneous Markov chain model.
Scenarios are investigated by assuming that transition probabilities follow a Weibull-regressed function and fall within an interval estimate. For each scenario, multiple cases are simulated using Monte Carlo simulation. Results show that variations in the outputs are sensitive to the probability regression, while for the interval estimate the output variations are similar to those of the inputs. Life cycle cost analysis and life cycle assessment of a sewer system are performed comparing three pipe types: reinforced concrete pipe (RCP), non-reinforced concrete pipe (NRCP), and vitrified clay pipe (VCP). Life cycle cost analysis covers the material extraction, construction, and rehabilitation phases. In the rehabilitation phase, a Markov chain model is applied to support the rehabilitation strategy. In the life cycle assessment, the Economic Input-Output Life Cycle Assessment (EIO-LCA) tool is used to estimate environmental emissions for all three phases. Emissions are then compared quantitatively among alternatives to support decision making.
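The Monte Carlo scenario analysis described above can be sketched in miniature. The five-state condition scale, the Weibull parameters, and the one-state-per-step deterioration rule below are illustrative assumptions, not values from the thesis:

```python
import random

# Hypothetical 5-state facility condition scale (1 = best, 5 = worst).
WORST_STATE = 5

def weibull_deterioration_prob(t, shape=1.5, scale=40.0):
    """Probability of dropping one condition state in year t, taken here
    as the Weibull hazard-like term (shape/scale)*(t/scale)**(shape-1),
    capped at 1. Shape and scale are illustrative assumptions."""
    p = (shape / scale) * (t / scale) ** (shape - 1)
    return min(p, 1.0)

def simulate_condition(years=50, seed=None):
    """One Monte Carlo realization of facility condition over time,
    deteriorating at most one state per year (a simplifying assumption)."""
    rng = random.Random(seed)
    state = 1
    history = [state]
    for t in range(1, years + 1):
        if state < WORST_STATE and rng.random() < weibull_deterioration_prob(t):
            state += 1  # deteriorate one condition state
        history.append(state)
    return history

# Simulate many cases and summarize the terminal condition.
runs = [simulate_condition(years=50, seed=i) for i in range(1000)]
avg_final = sum(r[-1] for r in runs) / len(runs)
```

Repeating this with perturbed `shape`/`scale` values (the regression scenario) or with probabilities drawn from an interval would mimic the sensitivity comparison the abstract describes.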

Relevance: 50.00%

Abstract:

In recent years, advanced metering infrastructure (AMI) has been a main research focus because the traditional power grid cannot meet modern development requirements. There has been an ongoing effort to increase the number of AMI devices that provide real-time data readings to improve system observability. AMI deployed across distribution secondary networks provides load and consumption information for individual households, which can improve grid management. The significant cost of retrofitting existing meters with network-capable sensing can be reduced by using image processing methods to extract usage information from images of the existing meters. This thesis presents a new solution that uses online exchange of power consumption information with a cloud server without modifying the existing electromechanical analog meters. In this framework, a systematic approach to extracting energy data from images replaces the manual reading process. A case study compares the digital imaging approach to averages determined by visual readings over a one-month period.
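As an illustration of the final step of such a pipeline: once image processing has recovered the pointer angle of each dial on an electromechanical meter, converting those angles to a numeric reading is plain arithmetic. The dial geometry and angles below are hypothetical, and this is not the thesis's actual extraction method:

```python
def dial_digit(angle_deg, clockwise=True):
    """Convert a pointer angle (degrees from the dial's zero mark) to a
    fractional digit in [0, 10). Adjacent dials on an analog meter rotate
    in alternating directions, so counter-clockwise dials are mirrored."""
    frac = (angle_deg % 360) / 36.0
    return frac if clockwise else (10.0 - frac) % 10.0

def read_meter(fracs):
    """Combine per-dial fractional digits (least significant first) into an
    integer reading. Each more significant dial's pointer sits partway
    between digits in proportion to the lower dials, so subtracting that
    known offset before rounding makes the reading robust to small
    pointer-angle noise."""
    lower = 0
    for i, f in enumerate(fracs):
        d = round(f - lower / 10 ** i) % 10
        lower += d * 10 ** i
    return lower

# Hypothetical 3-dial meter whose true reading is 347 kWh:
angles = [(252.0, True), (190.8, False), (124.92, True)]  # least significant first
reading = read_meter([dial_digit(a, cw) for a, cw in angles])
```

The correction term in `read_meter` is why human readers are told to check the dial to the right before recording a digit; the code encodes the same rule.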

Relevance: 40.00%

Abstract:

In this thesis, I study skin lesion detection and its application to skin cancer diagnosis. A skin lesion detection algorithm based on color information and thresholding is proposed. For the proposed algorithm, several color spaces are studied and the detection results are compared. Experimental results show that the YUV color space achieves the best performance. In addition, I develop a distance-histogram-based threshold selection method, which proves better than other adaptive threshold selection methods for color detection. Beyond the detection algorithms, I also investigate GPU speed-up techniques for skin lesion extraction, and the results show that GPUs have potential for accelerating skin lesion extraction. Based on the proposed detection algorithms, I developed a mobile skin cancer diagnosis application. With this application installed, a user can employ an iPhone as a diagnostic tool to find potential skin lesions on a person's skin and compare the detected lesions with those stored in a database on a remote server.
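A minimal sketch of color-thresholded detection in YUV space follows. The reference skin chrominance and the threshold value stand in for the adaptively selected values in the thesis; both are assumptions, not the author's method:

```python
def rgb_to_yuv(r, g, b):
    """BT.601 RGB -> YUV conversion (inputs in [0, 255])."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b
    v = 0.615 * r - 0.51499 * g - 0.10001 * b
    return y, u, v

def detect_lesion_mask(pixels, ref_uv, threshold):
    """Flag pixels whose chrominance (U, V) distance from a reference
    skin tone exceeds the threshold. Luminance Y is ignored so that
    shading changes do not trigger detections."""
    mask = []
    for r, g, b in pixels:
        _, u, v = rgb_to_yuv(r, g, b)
        dist = ((u - ref_uv[0]) ** 2 + (v - ref_uv[1]) ** 2) ** 0.5
        mask.append(dist > threshold)
    return mask

# Hypothetical skin-tone pixel and darker lesion pixel:
skin, lesion = (220, 180, 160), (80, 50, 40)
_, u0, v0 = rgb_to_yuv(*skin)
mask = detect_lesion_mask([skin, lesion], (u0, v0), threshold=5.0)
```

In the thesis the threshold is chosen from a distance histogram rather than fixed; the structure of the decision (chrominance distance versus a cutoff) is the same.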

Relevance: 30.00%

Abstract:

Bioplastics are polymers (such as polyesters) produced by bacterial fermentation that are biodegradable and nonhazardous. They are produced by a wide variety of bacteria, but only under stress conditions such as low nutrient levels, specifically of nitrogen and oxygen. These stress conditions cause certain bacteria to build up excess carbon deposits as energy reserves in the form of polyhydroxyalkanoates (PHAs). PHAs can be extracted and formed into plastic with the same strength as conventional, synthetic plastics, without relying on foreign petroleum. The overall goal of this project was to select a bacterium that could grow on sugars found in lignocellulosic biomass and induce it to produce PHAs and peptidoglycan. Once this was accomplished, the goal was to extract the PHAs and peptidoglycan and combine them into a co-polymer, yielding a stronger, more rigid plastic. The individual goals of this project were to: (1) select and screen bacteria capable of producing PHAs by utilizing the carbon/energy sources found in lignocellulosic biomass; (2) maximize the utilization of the sugars present in woody biomass in order to produce optimal levels of PHAs; and (3) use room-temperature ionic liquids (RTILs) to separate the cell membrane and peptidoglycan, allowing better extraction of PHAs and more intact peptidoglycan. B. megaterium, a Gram-positive PHA-producing bacterium, was selected for study. It was grown on a variety of substrates in order to maximize both its growth and its production of PHAs. The optimal conditions were found to be 30°C, pH 6.0, and a sugar concentration of 30 g/L of either glucose or xylose. After optimal growth was obtained, both RTILs and enzymatic treatments were used to break the cell wall in order to extract the PHAs and peptidoglycan. PHAs and peptidoglycan were successfully extracted from the cells and will be used in the future to create a new, stronger co-polymer. The peptidoglycan recovery yield was 16% of the cells' dry weight.

Relevance: 30.00%

Abstract:

Supercritical carbon dioxide is used to exfoliate graphite, producing small, several-layer graphitic flakes. Supercritical conditions of 2000, 2500, and 3000 psi and temperatures of 40, 50, and 60°C were used to study the effect of critical density on the sizes and zeta potentials of the treated flakes. Photon correlation spectroscopy (PCS), Brunauer-Emmett-Teller (BET) surface area measurement, field emission scanning electron microscopy (FE-SEM), and atomic force microscopy (AFM) are used to characterize the flakes. N-methyl-2-pyrrolidinone (NMP), dimethylformamide (DMF), and isopropanol are used as co-solvents to enhance the supercritical carbon dioxide treatment. The PCS results show that flakes obtained from high-critical-density treatment (low temperature and high pressure) are more stable, owing to more negative zeta potentials, but are smaller than those from low-critical-density treatment (high temperature and low pressure). However, when an additional 1-hour sonication is applied, the flakes from low-critical-density treatment become smaller than those from high-critical-density treatment, probably because more CO2 molecules are stacked between the layers of the graphitic flakes. The zeta potentials of the sonicated samples were slightly more negative than those of the non-sonicated samples. The NMP and DMF co-solvents maintain stability and prevent reaggregation of the flakes better than isopropanol. The flakes tend to become larger and more stable as the treatment time increases, since a larger flat area of graphite is exfoliated. In these experiments, temperature has more impact on the flakes than pressure. The BET surface area results show that CO2 penetrates the graphite layers more readily than N2. Moreover, the negative surface area of the treated graphite indicates that CO2 molecules may be adsorbed between the graphite layers during supercritical treatment.
The FE-SEM and AFM images show that the flakes have various shapes and sizes. The effect of surfactants can be observed in the FE-SEM images of samples treated in a one percent by weight aqueous solution of sodium dodecylbenzene sulfonate (SDBS), since the SDBS residue covers all of the remaining flakes. The AFM images show that the vertical thickness of the graphitic flakes ranges from several nanometers (less than ten layers thick) to more than a hundred nanometers. In conclusion, supercritical carbon dioxide treatment is a promising alternative to mechanical and chemical exfoliation techniques for the large-scale production of thin graphitic flakes, breaking graphite down into flakes only a few graphene layers thick.

Relevance: 30.00%

Abstract:

Information management is a key aspect of successful construction projects. Inaccurate measurements and conflicting data can lead to costly mistakes, and vague quantities can ruin estimates and schedules. Building information modeling (BIM) augments a 3D model with a wide variety of information, which reduces many sources of error and can detect conflicts before they occur. Because new technology is often more complex, it can be difficult to integrate effectively with existing business practices. In this paper, we answer two questions: how can BIM add value to construction projects, and what lessons can be learned from other companies that use BIM or similar technology? Previous research focused on the technology as if it were simply a tool, observing problems that occurred while integrating new technology into existing practices. Our research instead looks at the flow of information through a company and its network, seeing all the actors as part of an ecosystem. Building upon this idea, we propose the metaphor of an information supply chain to illustrate how BIM can add value to a construction project. This paper concludes with two case studies. The first illustrates a failure in the flow of information that could have been prevented by using BIM. The second profiles a leading design firm that has used BIM products for many years and shows the real benefits of this technology.

Relevance: 30.00%

Abstract:

A post-classification change detection technique based on a hybrid (unsupervised and supervised) classification approach was applied to Landsat Thematic Mapper (TM), Landsat Enhanced Thematic Mapper Plus (ETM+), and ASTER images acquired in 1987, 2000, and 2004, respectively, to map land use/cover changes in the Pic Macaya National Park in the southern region of Haiti. Each image was classified individually into six land use/cover classes: built-up, agriculture, herbaceous, open pine forest, mixed forest, and barren land, using the unsupervised ISODATA and maximum likelihood supervised classifiers with the aid of ground truth data collected in the field. Ground truth information collected in December 2007, including equalized stratified random points that were visually interpreted, was used to assess the accuracy of the classification results. The overall accuracies of the classifications were 82% (1987), 82% (2000), and 87% (2004). A post-classification change detection technique was used to produce change images for 1987 to 2000, 1987 to 2004, and 2000 to 2004. Significant changes in land use/cover occurred over the 17-year period. The results showed increases in built-up (from 10% to 17%) and herbaceous (from 5% to 14%) areas between 1987 and 2004. The increase in herbaceous cover was mostly caused by the abandonment of exhausted agricultural land. Over the same period, open pine forest and mixed forest lost 75% and 83% of their areas, respectively, to other land use/cover types: open pine forest (from 20% to 14%) and mixed forest (from 18% to 12%) were transformed into agricultural areas or barren land. This study illustrates the continuing deforestation, land degradation, and soil erosion in the region, which in turn are leading to a decrease in vegetative cover.
The study also showed the importance of Remote Sensing (RS) and Geographic Information System (GIS) technologies for estimating timely changes in land use/cover and evaluating their causes in order to design an ecologically based management plan for the park.
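The post-classification comparison and accuracy assessment described above can be sketched as follows. The class names and counts are illustrative toy data, not the study's results:

```python
def change_matrix(map_t1, map_t2, classes):
    """Cross-tabulate per-pixel classes at two dates (post-classification
    change detection): entry [i][j] counts pixels of class i at time 1
    that are class j at time 2. Off-diagonal entries are changes."""
    idx = {c: k for k, c in enumerate(classes)}
    m = [[0] * len(classes) for _ in classes]
    for a, b in zip(map_t1, map_t2):
        m[idx[a]][idx[b]] += 1
    return m

def overall_accuracy(confusion):
    """Overall accuracy of a classification from its error matrix:
    correctly classified reference samples (the diagonal) over the total."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Toy example: two classes, three pixels, one forest pixel converted.
classes = ["forest", "agriculture"]
change = change_matrix(["forest", "forest", "agriculture"],
                       ["forest", "agriculture", "agriculture"], classes)
```

The same `overall_accuracy` form underlies the reported 82%/82%/87% figures, with the error matrix built from the visually interpreted stratified random points instead of toy pixels.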

Relevance: 30.00%

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydrodynamic systems that govern various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while helping users observe and understand the flow field clearly. My research focuses on the analysis and visualization of flow fields using various techniques, e.g., information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, selecting good streamlines to capture flow patterns and picking good viewpoints to observe flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using the dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually. When 3D streamlines are projected to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we design FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters, spatiotemporal regions, and their interconnections in the transformed space.
Most viewpoint selection methods consider only external viewpoints outside the flow field, which fails to convey a clear observation when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields: selecting several internal viewpoints around the flow features inside the flow field, then generating a B-spline curve path traversing these viewpoints to provide users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work is also extended to handle unsteady flow fields. Beyond flow field visualization, other visualization topics also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we develop a set of visualization tools to provide users with an intuitive way to learn and understand these algorithms.
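The internal-viewpoint path generation can be illustrated with a uniform cubic B-spline evaluated directly from its basis functions. This is a generic sketch rather than the implementation from [54], and it omits the camera orientation and speed control a real tool would need:

```python
def bspline_point(ctrl, t):
    """Evaluate a uniform cubic B-spline at parameter t in [0, n-3],
    where ctrl is a list of at least four 3D control points (the
    internal viewpoints). Uses the standard cubic basis on one segment."""
    n = len(ctrl)
    seg = min(int(t), n - 4)          # which 4-point segment t falls in
    u = t - seg                        # local parameter in [0, 1]
    b0 = (1 - u) ** 3 / 6
    b1 = (3 * u ** 3 - 6 * u ** 2 + 4) / 6
    b2 = (-3 * u ** 3 + 3 * u ** 2 + 3 * u + 1) / 6
    b3 = u ** 3 / 6
    p = [0.0, 0.0, 0.0]
    for k, w in zip(range(seg, seg + 4), (b0, b1, b2, b3)):
        for d in range(3):
            p[d] += w * ctrl[k][d]
    return tuple(p)

def camera_path(viewpoints, samples=100):
    """Sample a smooth camera path near the given internal viewpoints."""
    tmax = len(viewpoints) - 3
    return [bspline_point(viewpoints, tmax * i / (samples - 1))
            for i in range(samples)]

# Hypothetical viewpoints along a line; the path stays on that line.
path = camera_path([(float(i), 0.0, 0.0) for i in range(5)], samples=11)
```

A B-spline only approximates its control points; clamping or repeating the end viewpoints would make the path interpolate them, a common refinement for camera paths.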

Relevance: 30.00%

Abstract:

Extracellular iron reduction has been suggested as a candidate metabolic pathway that may explain a large proportion of carbon respiration in temperate peatlands. However, the o-phenanthroline colorimetric method commonly employed to quantify iron and partition it between redox species is known to be unreliable in the presence of humic and fulvic acids, both of which represent a considerable proportion of peatland dissolved organic matter. We propose ionic liquid extraction as a more accurate method for iron quantitation and redox speciation in humic-rich peat porewater. We evaluated both o-phenanthroline and ionic liquid extraction in four distinct peatland systems spanning a gradient of physico-chemical conditions to compare the total iron recovery and Fe2+:Fe3+ ratios determined by each method. Ionic liquid extraction was found to provide more accurate iron quantitation and speciation in the presence of dissolved organic matter. A multivariate approach utilizing fluorescence and UV-Vis spectroscopy was used to identify the dissolved organic matter characteristics in peat porewater that lead to poor performance of the o-phenanthroline method. Where these interferences are present, we offer an empirical correction factor for total iron quantitation by o-phenanthroline, as verified by ionic liquid extraction. The written work presented in this thesis is in preparation for submission to Soil Biology and Biochemistry by T.J. Veverica, E.S. Kane, A.M. Marcarelli, and S.A. Green.