3 results for Continuous Markov processes
in the Illinois Digital Environment for Access to Learning and Scholarship Repository
Abstract:
This study presents two novel methods for treating important environmental contaminants in two different wastewater streams. One process utilizes the kinetic advantages and reliability of ion-exchanging clinoptilolite in combination with biological treatment to remove ammonium from municipal sewage. A second process, the HAMBgR (Hybrid Adsorption Membrane Biological Reactor), combines both ion-exchange resin and bacteria in a single reactor to treat perchlorate-contaminated waters. Combining physicochemical adsorptive treatment with biological treatment can provide synergistic benefits to the overall removal process. Ion-exchange removal addresses some of the common operational reliability limitations of biological treatment, such as slow response to environmental changes and leaching. Biological activity can in turn help reduce the economic and environmental challenges of ion-exchange processes, such as regenerant cost and brine disposal. The second section of this study presents continuous-flow column experiments used to demonstrate the ability of clinoptilolite to remove wastewater ammonium, as well as the effectiveness of salt regeneration using highly concentrated sea-salt solutions. The working capacity of clinoptilolite more than doubled over the first few loading cycles, while regeneration recovered more than 98% of the ammonium. Using the regenerant brine for subsequent halotolerant algae growth allowed its repeated use, which could lead to cost savings and production of valuable algal biomass. The algae took up all of the ammonium in solution, and the brine could be reused with no loss in regeneration efficiency. This process has significant advantages over conventional biological nitrification: shorter retention times, a wider range of operational conditions, and higher-quality effluent free of nitrate. Also, because the clinoptilolite is continually regenerated and the regenerant is rejuvenated by algae, overall input costs are expected to be low. The third section of this study introduces the HAMBgR process for the elimination of perchlorate and presents batch isotherm experiments and pilot reactor tests. Results showed that a variety of ion-exchange resins can be effectively and repeatedly regenerated biologically while maintaining an acceptable working capacity. The presence of an adsorbent in the HAMBgR process improved bioreactor performance during operational fluctuations by providing a physicochemical backup to the biological process. Pilot reactor tests showed that the HAMBgR process reduced effluent perchlorate spikes by up to 97% compared with a conventional membrane bioreactor (MBR) subjected to sudden changes in influent conditions. The HAMBgR process also stimulated biological activity and led to higher biomass concentrations during periods of increased contaminant loading. Conventional MBR systems can be converted into HAMBgRs at low cost, easily justified by the realized benefits. The concepts employed in the HAMBgR process can be adapted to treat other target contaminants, not just perchlorate.
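For context, a working capacity of the kind reported above is typically estimated from a column breakthrough curve. The sketch below is illustrative only and is not taken from the study; the function name, the 10% breakthrough criterion, and all numerical values are hypothetical.

    # Illustrative sketch (not from the study): estimating the working capacity of an
    # ion-exchange column from breakthrough data, as in continuous-flow clinoptilolite
    # experiments. All numbers below are hypothetical placeholders.

    def working_capacity(times_h, c_out_mg_L, c_in_mg_L, flow_L_h, media_mass_g,
                         breakthrough_fraction=0.1):
        """Mass of ammonium retained per gram of media up to the breakthrough point,
        where breakthrough is defined as effluent reaching a set fraction of influent."""
        removed_mg = 0.0
        for i in range(1, len(times_h)):
            if c_out_mg_L[i] > breakthrough_fraction * c_in_mg_L:
                break
            dt = times_h[i] - times_h[i - 1]
            # trapezoidal mass balance over the time step (constant influent concentration)
            avg_removed = c_in_mg_L - 0.5 * (c_out_mg_L[i] + c_out_mg_L[i - 1])
            removed_mg += avg_removed * flow_L_h * dt
        return removed_mg / media_mass_g  # mg NH4+-N per g clinoptilolite

    # hypothetical breakthrough curve
    print(working_capacity([0, 1, 2, 3, 4], [0.0, 0.5, 1.0, 3.0, 12.0],
                           c_in_mg_L=30.0, flow_L_h=2.0, media_mass_g=50.0))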
Abstract:
Knowledge is one of the most important assets for surviving in the modern business environment. The effective management of that asset mandates continuous adaptation by organizations and requires employees to strive to improve the company's work processes. Organizations attempt to coordinate their unique knowledge through traditional means as well as in new and distinct ways, and to transform it into innovative resources superior to those of their competitors. As a result, how to manage the knowledge asset has become a critical issue for modern organizations, and knowledge management is considered the most feasible solution. Knowledge management is a multidimensional process that identifies, acquires, develops, distributes, utilizes, and stores knowledge. However, many related studies focus only on fragmented or limited knowledge-management perspectives. To make knowledge management more effective, it is important to identify the qualitative and quantitative issues that underlie the challenge of effective knowledge management in organizations. The main purpose of this study was to integrate these fragmented knowledge management perspectives into a holistic framework, which includes knowledge infrastructure capability (technology, structure, and culture) and knowledge process capability (acquisition, conversion, application, and protection), based on Gold's (2001) study. Additionally, because the effect of incentives, widely acknowledged as a prime motivator in facilitating the knowledge management process, was missing from the original framework, this study incorporated incentives into the knowledge management framework. This study also examined the relationship with organizational performance from the standpoint of the Balanced Scorecard, which includes the customer-related, internal business process, learning & growth, and perceptual financial aspects of organizational performance in the Korean business context. Moreover, this study examined the relationship with objective financial performance by calculating Tobin's q ratio. Lastly, this study compared group differences in knowledge management between larger and smaller organizations and between manufacturing and nonmanufacturing firms. Because this study was conducted in Korea, the original instrument was translated into Korean using the back-translation technique. A confirmatory factor analysis (CFA) was used to examine the validity and reliability of the instrument. To identify the relationship between knowledge management capabilities and organizational performance, structural equation modeling (SEM) and multiple regression analysis were conducted. A Student's t-test was conducted to examine mean differences. The results indicated that there is a positive relationship between effective knowledge management and organizational performance. However, no empirical evidence was found to suggest that knowledge management capabilities are linked to objective financial performance, which remains a topic for future research. Additionally, findings showed that knowledge management is affected by an organization's size, but not by the type of organization. The results of this study are valuable in establishing a valid and reliable survey instrument, providing strong evidence that knowledge management capabilities are essential to improving organizational performance, and making important recommendations for future research.
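The abstract does not state which formulation of Tobin's q was used to proxy objective financial performance; a commonly used approximation is sketched below with hypothetical figures.

    # Hedged sketch: a common approximation of Tobin's q
    # (the study's exact formulation is not given in the abstract).

    def tobins_q(market_value_equity, book_value_liabilities, book_value_total_assets):
        """Approximate Tobin's q = (market value of equity + book value of liabilities)
        divided by the book value of total assets."""
        return (market_value_equity + book_value_liabilities) / book_value_total_assets

    # hypothetical figures, all in the same currency unit
    print(tobins_q(market_value_equity=1_200, book_value_liabilities=800,
                   book_value_total_assets=1_900))  # ~1.05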
Abstract:
Reliability and dependability modeling can be employed during many stages of the analysis of a computing system to gain insights into its critical behaviors. To provide useful results, realistic models of systems are often necessarily large and complex. Numerical analysis of these models presents a formidable challenge because the sizes of their state-space descriptions grow exponentially with the sizes of the models. On the other hand, simulation of the models requires analysis of many trajectories in order to compute statistically correct solutions. This dissertation presents a novel framework for performing both numerical analysis and simulation. The new numerical approach computes bounds on the solutions of transient measures in large continuous-time Markov chains (CTMCs). It extends existing path-based and uniformization-based methods by identifying sets of paths that are equivalent with respect to a reward measure and related to one another via a simple structural relationship. This relationship makes it possible for the approach to explore multiple paths at the same time, thus significantly increasing the number of paths that can be explored in a given amount of time. Furthermore, the use of a structured representation for the state space and the direct computation of the desired reward measure (without ever storing the solution vector) allow it to analyze very large models using a very small amount of storage. Often, path-based techniques must compute many paths to obtain tight bounds. In addition to presenting the basic path-based approach, we also present algorithms for computing more paths and tighter bounds quickly. One resulting approach is based on the concept of path composition, whereby precomputed subpaths are composed to compute whole paths efficiently. Another approach is based on selecting important paths (among a set of many paths) for evaluation. Many path-based techniques suffer from having to evaluate numerous unimportant paths; evaluating only the important ones helps to compute tight bounds efficiently and quickly.
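As background for the uniformization-based methods referenced above, the sketch below shows standard uniformization for computing a transient reward measure of a small CTMC. It is not the dissertation's path-based bounding approach; the function name and the two-state availability example are hypothetical.

    # Background sketch: standard uniformization for a transient reward measure
    # on a small CTMC with generator matrix Q (rows sum to zero).
    import numpy as np

    def transient_reward(Q, pi0, rewards, t, tol=1e-12):
        """E[reward at time t] via uniformization:
        P = I + Q/Lambda with Lambda >= max |Q_ii|, and
        pi(t) = sum_k PoissonPMF(k; Lambda*t) * pi0 * P^k."""
        Q = np.asarray(Q, dtype=float)
        Lam = max(-np.diag(Q)) * 1.05 + 1e-12          # uniformization rate
        P = np.eye(Q.shape[0]) + Q / Lam               # DTMC of the uniformized chain
        weight = np.exp(-Lam * t)                      # Poisson(0; Lambda*t)
        vec = np.array(pi0, dtype=float)
        acc = weight * vec
        k, remaining = 0, 1.0 - weight
        while remaining > tol:                         # truncate when Poisson tail is tiny
            k += 1
            vec = vec @ P
            weight *= Lam * t / k                      # Poisson(k; Lambda*t)
            acc += weight * vec
            remaining -= weight
        return float(acc @ np.asarray(rewards, dtype=float))

    # hypothetical two-state availability model: up (reward 1) <-> down (reward 0)
    Q = [[-0.01, 0.01], [1.0, -1.0]]
    print(transient_reward(Q, pi0=[1.0, 0.0], rewards=[1.0, 0.0], t=10.0))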