968 results for Solution-process
Abstract:
This paper presents a methodology to determine the parameters used in the simulation of delamination in composite materials using decohesion finite elements. A closed-form expression is developed to define the stiffness of the cohesive layer. A novel procedure that allows the use of coarser meshes of decohesion elements in large-scale computations is proposed. The procedure ensures that the energy dissipated by the fracture process is computed correctly. It is shown that coarse-meshed models defined using the proposed approach yield the same results as the models with the finer meshes normally used in the simulation of fracture processes.
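For orientation, closed-form relations of the kind this abstract refers to are often written in the following general form. This is a hedged sketch with assumed symbols (alpha and M are dimensionless parameters, E3 the through-thickness modulus, t the sublaminate thickness, Gc the fracture toughness, tau0 the interfacial strength, le the element length, Ne the desired number of elements in the cohesive zone); the exact expressions derived in the paper may differ:

```latex
% Hedged sketch of typical cohesive-element parameter relations
% (not necessarily the exact expressions of the paper).
K = \frac{\alpha E_3}{t}, \qquad \alpha \gg 1
\quad\text{(stiffness of the cohesive layer)}

l_{cz} = M\,\frac{E_3\,G_c}{(\tau^0)^2}
\quad\text{(length of the cohesive zone)}

\bar{\tau} = \sqrt{\frac{M\,E_3\,G_c}{N_e\,l_e}}
\quad\text{(strength lowered so that $N_e$ elements of size $l_e$ span the cohesive zone)}
```

Lowering the strength in this way enlarges the numerical cohesive zone so that even a coarse mesh resolves it, while the dissipated energy Gc is preserved.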
Abstract:
In many industries, such as petroleum production and the petrochemical, metal, food and cosmetics industries, wastewaters containing an emulsion of oil in water are often produced. The emulsions consist of water (up to 90%), oils (mineral, animal, vegetable and synthetic), surfactants and other contaminants. In view of its toxic nature and its deleterious effects on the surrounding environment (soil, water), such wastewater needs to be treated before release into natural waterways. Membrane-based processes have been successfully applied in industrial applications and are considered possible candidates for the treatment of oily wastewaters. Easy operation, lower cost and, in some cases, the ability to reduce contaminants below existing pollution limits are the main advantages of these systems. The main drawback of membranes is flux decline due to fouling and concentration polarisation. The complexity of oil-containing systems demands complementary studies on issues related to the mitigation of fouling and concentration polarisation in membrane-based ultrafiltration. In this thesis the effect of different operating conditions (factors) on the ultrafiltration of oily water is studied. Important factors are normally correlated and, therefore, their effects should be studied simultaneously. This work uses a novel approach to study different operating conditions, such as pressure, flow velocity and temperature, and solution properties, such as oil concentration (cutting oil, diesel, kerosene), pH and salt concentration (CaCl2 and NaCl), in the ultrafiltration of oily water, simultaneously and in a systematic way using an experimental design approach. A hypothesis is developed to describe the interaction between the oil drops, the salt and the membrane surface. The optimum conditions for ultrafiltration and the contribution of each factor in the ultrafiltration of oily water are evaluated. It is found that the effect of the various factors studied on permeate flux depends strongly on the type of oil, the type of membrane and the amount of salts. The thesis demonstrates that a system containing oil is very complex, and that fouling and flux decline can be observed even at very low pressures. This means that only the weak form of the critical flux exists for such systems. The cleaning of the fouled membranes and the influence of different parameters (flow velocity, temperature, time, pressure, and chemical concentration (SDS, NaOH)) were evaluated in this study. It was observed that fouling, and consequently cleaning, behaved differently for the studied membranes. Of the membranes studied, the membrane with the lowest propensity for fouling and the most easily cleaned was the regenerated cellulose membrane (C100H). In order to obtain more information about the interaction between the membrane and the components of the emulsion, a streaming potential study was performed on the membrane. The experiments were carried out at different pH values and oil concentrations. It was seen that oily water changed the surface charge of the membrane significantly. The surface charge and the streaming potential during different stages of filtration were measured and analysed, a new approach to characterizing oil fouling introduced in this thesis. The surface charge varied in the different stages of filtration. It was found that the surface charge of a cleaned membrane was not the same as initially; however, the permeability was equal to that of a virgin membrane.
The effect of filtration mode was studied by performing the filtration in both cross-flow and dead-end mode. The effect of salt on performance was considered in both studies. It was found that salt decreased the permeate flux even at low concentration. To test the effect of a change in hydrophilicity, the commercial membranes used in this thesis were modified by grafting PNIPAAm onto their surfaces. A new technique (corona treatment) was used for this modification. The effect of the modification on permeate flux and retention was evaluated. The modified membranes changed their pore size around 33 °C, resulting in different retention and permeability. The results obtained in this thesis can be applied to optimise the operation of a membrane plant under normal or shock conditions, or to modify the process so that it becomes more efficient or effective.
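As an illustration of the experimental design approach described in this abstract, the following is a minimal sketch of a two-level factorial analysis of main effects; the factor names, coded levels and flux values are made-up assumptions, not data from the thesis.

```python
# Hedged sketch: main-effect estimation from a two-level factorial screening design,
# the style of analysis used to study correlated ultrafiltration factors simultaneously.
# All factor levels and flux values below are illustrative assumptions.
import pandas as pd

runs = pd.DataFrame({
    "pressure": [-1,  1, -1,  1, -1,  1, -1,  1],   # coded low (-1) / high (+1) levels
    "velocity": [-1, -1,  1,  1, -1, -1,  1,  1],
    "salt":     [-1, -1, -1, -1,  1,  1,  1,  1],
    "flux":     [42, 55, 48, 63, 30, 38, 35, 44],   # permeate flux, L/(m2 h) (made up)
})

# Main effect of a factor = mean flux at its high level minus mean flux at its low level.
for factor in ("pressure", "velocity", "salt"):
    effect = (runs.loc[runs[factor] == 1, "flux"].mean()
              - runs.loc[runs[factor] == -1, "flux"].mean())
    print(f"{factor}: main effect = {effect:+.2f}")
```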
Abstract:
A 1 µs molecular dynamics simulation was performed with a realistic model system of sodium dodecyl sulfate (SDS) micelles in aqueous solution, comprising 360 DS-, 360 Na+ and 90,000 water particles. After 300 ns, three micelles of different shapes and sizes (41, 68 and 95 monomers) were observed. The process led to a stabilization of the total number of SDS clusters and an increase in the micellar radius to 2.23 nm, in agreement with experimental results. An important conclusion is that simulations built around a single aggregate should be regarded as constrained; the size and shape distributions must be analyzed.
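The aggregate analysis described here (counting clusters and estimating micellar radii) can be sketched as follows. This is a hedged illustration assuming headgroup coordinates are available as an N×3 array; the 0.65 nm cutoff and the neglect of periodic boundaries are simplifying assumptions, not details from the paper.

```python
# Hedged sketch: group DS- monomers into micelles by headgroup proximity and report
# cluster sizes and radii of gyration. The cutoff is illustrative and periodic
# boundary conditions are ignored for brevity.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def micelle_statistics(head_xyz, cutoff=0.65):
    """head_xyz: (N, 3) array of DS- headgroup positions in nm for one frame."""
    dist = squareform(pdist(head_xyz))            # all pairwise headgroup distances
    adjacency = csr_matrix(dist < cutoff)         # monomers within the cutoff are linked
    n_clusters, labels = connected_components(adjacency, directed=False)
    sizes, radii = [], []
    for k in range(n_clusters):
        members = head_xyz[labels == k]
        sizes.append(len(members))
        com = members.mean(axis=0)
        # Radius of gyration of the headgroups, a rough proxy for the micellar radius.
        radii.append(float(np.sqrt(((members - com) ** 2).sum(axis=1).mean())))
    return sizes, radii

# Example call with random stand-in coordinates for 360 monomers:
sizes, radii = micelle_statistics(np.random.rand(360, 3) * 10.0)
print(len(sizes), max(sizes), max(radii))
```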
Abstract:
This research was motivated by the need to examine the potential application areas of process intensification technologies at Neste Oil Oyj. In line with the company's interests, membrane reactor technology was chosen and the applicability of this technology in the refining industry was investigated. Moreover, Neste Oil suggested a project related to CO2 capture from the fluid catalytic cracking (FCC) unit flue gas stream. The flue gas flow rate is 180 t/h and it consists of approximately 14% CO2 by volume. A membrane-based absorption process (membrane contactor) was chosen as a potential technique to model CO2 capture from the FCC unit effluent. In the design of the membrane contactor, a mathematical model was developed to describe CO2 absorption from a gas mixture using an aqueous monoethanolamine (MEA) solution. According to the results of the literature survey, approximately 99% of the CO2 can be removed in a hollow fiber contactor under laminar flow conditions using a 20 cm long polyvinylidene fluoride (PVDF) membrane. Furthermore, the design of the whole process was performed using the PRO/II simulation software, and the CO2 removal efficiency of the whole process was obtained as 97%. Technical and economic comparisons with existing MEA absorption processes were performed to determine the advantages and disadvantages of membrane contactor technology.
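As a rough orientation to the kind of contactor model mentioned, the sketch below uses a simplified plug-flow mass balance with an overall mass-transfer coefficient. It is not the model developed in the thesis, and all parameter names and values are illustrative assumptions.

```python
# Hedged sketch: plug-flow estimate of CO2 removal along a hollow-fiber membrane
# contactor, removal = 1 - exp(-K_ov * a * L / u). Not the model of the thesis;
# every value below is an illustrative assumption.
import math

def co2_removal(K_ov, a, u, L):
    """K_ov: overall mass-transfer coefficient (m/s); a: gas-side contact area per
    unit volume (m2/m3); u: gas velocity (m/s); L: fiber length (m)."""
    return 1.0 - math.exp(-K_ov * a * L / u)

# A 0.20 m module (cf. the 20 cm membrane mentioned in the abstract):
print(f"CO2 removal: {co2_removal(K_ov=1e-3, a=3000.0, u=0.1, L=0.20):.1%}")
```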
Abstract:
The pollution and toxicity problems posed by arsenic in the environment have long been established. Hence, removal and recovery remedies have been sought, bearing in mind the efficiency, cost effectiveness and environmental friendliness of the methods employed. The sorption kinetics and intraparticulate diffusivity of As(III) bioremediation from aqueous solution using modified and unmodified coconut fiber were investigated. The amount adsorbed increased with time, reaching equilibrium at about 60 minutes. The kinetic studies showed that the sorption rates could be described by both pseudo-first-order and pseudo-second-order processes, with the latter showing a better fit, with a rate constant of 1.16 × 10⁻⁴ min⁻¹ for the three adsorbent types. The mechanism of sorption was found to be particle-diffusion controlled. The diffusion and boundary layer effects were also investigated. Therefore, the results show that coconut fiber, both modified and unmodified, is an efficient sorbent for the removal of As(III) from industrial effluents, with particle diffusion as the predominant mechanism.
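The pseudo-first-order versus pseudo-second-order comparison described here can be sketched as a simple curve fit. The uptake data below are made up for illustration, and the standard model forms q_t = q_e(1 − e^(−k1·t)) and q_t = k2·q_e²·t/(1 + k2·q_e·t) are assumed.

```python
# Hedged sketch: fitting pseudo-first-order (PFO) and pseudo-second-order (PSO)
# kinetic models and comparing their goodness of fit. The uptake data are
# illustrative, not taken from the study.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([5, 10, 20, 30, 45, 60, 90], dtype=float)   # contact time, min (made up)
q = np.array([1.2, 2.0, 2.9, 3.4, 3.7, 3.9, 3.95])        # uptake q_t, mg/g (made up)

def pfo(t, qe, k1):
    # q_t = q_e (1 - exp(-k1 t))
    return qe * (1.0 - np.exp(-k1 * t))

def pso(t, qe, k2):
    # q_t = k2 q_e^2 t / (1 + k2 q_e t)
    return k2 * qe**2 * t / (1.0 + k2 * qe * t)

for name, model in (("PFO", pfo), ("PSO", pso)):
    params, _ = curve_fit(model, t, q, p0=(4.0, 0.05))
    residuals = q - model(t, *params)
    r2 = 1.0 - np.sum(residuals**2) / np.sum((q - q.mean())**2)
    print(f"{name}: qe={params[0]:.2f}, k={params[1]:.4f}, R2={r2:.4f}")
```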
Abstract:
The need to clean up heavy-metal-contaminated environments cannot be overemphasized. This paper describes adsorption isotherm studies of Cd(II), Pb(II) and Zn(II) ions from aqueous solution using unmodified and EDTA-modified maize cob. Maize cob was found to be an excellent adsorbent for the removal of these metal ions. The amount of metal ions adsorbed increased as the initial concentration increased. Also, EDTA modification enhanced the adsorption capacity of maize cob, probably due to the chelating ability of EDTA. Among the three adsorption isotherms tested, the Dubinin-Radushkevich isotherm gave the best fit, with R² values ranging from 0.9539 to 0.9973 and an average of 0.9819. This was followed by the Freundlich isotherm (average 0.9783) and then the Langmuir isotherm (average 0.7637). The sorption was found to be a physisorption process, as seen from the apparent energy of adsorption, which ranged from 2.05 kJ/mol to 4.56 kJ/mol. Therefore, this study demonstrates that maize cob, itself an environmental pollutant, can be used to adsorb heavy metals, thereby also abating the environmental nuisance caused by discarded maize cobs.
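For reference, the three isotherms compared here are commonly written in the forms below, with the apparent adsorption energy obtained from the Dubinin-Radushkevich constant. This is a hedged sketch; the notation may differ from the paper's.

```latex
% Hedged sketch of the standard isotherm forms (q_e: amount adsorbed at equilibrium,
% C_e: equilibrium concentration; the remaining symbols are model constants).
\text{Langmuir:}\qquad q_e = \frac{q_m K_L C_e}{1 + K_L C_e}

\text{Freundlich:}\qquad q_e = K_F\, C_e^{1/n}

\text{Dubinin--Radushkevich:}\qquad
\ln q_e = \ln q_m - \beta\,\varepsilon^2,
\qquad \varepsilon = RT\,\ln\!\left(1 + \frac{1}{C_e}\right),
\qquad E = \frac{1}{\sqrt{2\beta}}
```

An apparent energy E below about 8 kJ/mol is conventionally read as physisorption, which is consistent with the 2.05-4.56 kJ/mol range reported here.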
Abstract:
A company’s capability to map out its cost position compared to other market players is important for competitive decision making. One aspect of cost position is direct product cost, which reflects the cost efficiency of a company’s product designs. If a company can evaluate and compare its own and other market players’ direct product costs, it can make better decisions in product development and management, manufacturing, sourcing, etc. The main objective of this thesis was to develop a cost evaluation process for competitors’ products. This objective includes a process description and an analysis tool for cost evaluations. Process implementation is also discussed. The main result of this thesis was a process description consisting of sixteen steps and an Excel-based analysis tool. Since the literature in this field was quite limited, the solution proposal was assembled from many different theoretical concepts. It includes influences from reverse engineering, product cost assessment, benchmarking and cost-based decision making. This solution proposal will lead to more systematic and standardized cost position analyses and result in better cost transparency in decision making.
Abstract:
The main outcome of this master’s thesis is an innovative solution that can support the choice of a business process modeling methodology. Potential users of this tool are people with a background in business process modeling and the ability to collect the required information about an organization’s business processes. The thesis establishes the importance of business process modeling in the implementation of an organization’s strategic goals by revealing the place of the concept in Business Process Management (BPM) and its particular case, Business Process Reengineering (BPR). In order to support the theoretical outcomes of the thesis, a case study of the Northern Dimension Research Centre (NORDI) at Lappeenranta University of Technology was conducted. This example shows how to apply business process modeling methodologies in practice, in which ways business process models can be useful for BPM and BPR initiatives, and how to apply the proposed innovative solution when choosing a business process modeling methodology.
Abstract:
Information technology service management has become an important task in delivering information for management purposes. In particular, applications containing information for decision making need to be agile and ready for change. The aim of this study was to find a solution for the successful implementation of an ITIL-based change management process for the enterprise resource management applications managed by the target organization. The literature review introduces frameworks that are important for the success of an IT project implementation; in addition, an overview of ITIL and the ITIL-based change management process is presented. The result of the study was a framework of actions that need to be accomplished to succeed in change management process implementation. It was noticed that defining success criteria, critical success factors and success measures is important for achieving the goals of the implementation project.
Abstract:
This study examined solution business models and how they could be applied to the energy efficiency business. The aim of this study was to find out what a functional solution business model applied to energy efficiency improvement projects is like. The term “functionality” was used to refer not only to economic viability but also to environmental and legal aspects, the implementation of Critical Success Factors (CSFs) and the ability to overcome the most important market barriers and risks. This thesis is based on a comprehensive literature study on solution business, business models and the energy efficiency business. The literature review was used as the foundation for an energy efficiency solution business model scheme. The created scheme was tested in a case study which examined two different energy efficiency improvement projects, illustrated the functionality of the created business model and evaluated their potential as customer targets. The solution approach was found to be suitable for the energy efficiency business. The most important characteristics of a good solution business model were identified as the relationship between the supplier and the customer, a proper network, knowledge of the customer’s process and superior technological expertise. Thus the energy efficiency solution business was recognized to be particularly suitable for, for example, energy suppliers or technological equipment suppliers. Because the case study was not executed from a particular company’s point of view, the most important factors, such as relationships and the availability of funding, could not be evaluated. Although the energy efficiency business is recognized to be economically viable, the most important factors influencing the profitability and success of an energy efficiency solution business model were identified as proper risk management, the ability to overcome market barriers and the realization of CSFs.
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety and the fact that practically every software development organization has its own unique set of development processes and methods have created the profession of software process engineers. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way which allows easy management of processes, for example process dissemination, process tailoring and process enactment. The process modeling languages are usually used as a tool for process engineering where the main focus is on the processes themselves. This dissertation has a different emphasis. The dissertation analyses modern software development process modeling from the software developers’ point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling during an approximately five-year period. The research follows the classical engineering research discipline in which the current situation is analyzed, a potentially better solution is developed and finally its implications are analyzed. The research applies a variety of different research techniques ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to more easily compare different software development situations and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work.
However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies where the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
Abstract:
The purpose of this thesis is twofold. The first and major part is devoted to sensitivity analysis of various discrete optimization problems, while the second part addresses methods applied for calculating measures of solution stability and for solving multicriteria discrete optimization problems. Despite numerous approaches to the stability analysis of discrete optimization problems, two major directions can be singled out: quantitative and qualitative. Qualitative sensitivity analysis is conducted for multicriteria discrete optimization problems with minisum, minimax and minimin partial criteria. The main results obtained here are necessary and sufficient conditions for different stability types of optimal solutions (or of a set of optimal solutions) of the considered problems. Within the framework of the quantitative direction, various measures of solution stability are investigated. A formula for a quantitative characteristic called the stability radius is obtained for the generalized equilibrium situation invariant to changes of game parameters in the case of the Hölder metric. The quality of a problem solution can also be described in terms of robustness analysis. In this work the concepts of accuracy and robustness tolerances are presented for a strategic game with a finite number of players where the initial coefficients (costs) of linear payoff functions are subject to perturbations. The investigation of the stability radius also aims to devise methods for its calculation. A new metaheuristic approach is derived for calculating the stability radius of an optimal solution to the shortest path problem. The main advantage of the developed method is that it is potentially applicable to calculating stability radii of NP-hard problems. The last chapter of the thesis focuses on deriving innovative methods based on an interactive optimization approach for solving multicriteria combinatorial optimization problems. The key idea of the proposed approach is to utilize a parameterized achievement scalarizing function for solution calculation and to direct the interactive procedure by changing the weighting coefficients of this function. In order to illustrate the introduced ideas, a decision-making process is simulated for a three-objective median location problem. The concepts, models and ideas collected and analyzed in this thesis create good and relevant grounds for developing more complicated and integrated models of postoptimal analysis and for solving the most computationally challenging problems related to it.
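For context, a parameterized achievement scalarizing function of the kind referred to here is commonly written in the following reference-point form. This is a hedged sketch with assumed symbols (reference point z^ref, weights w_i, augmentation coefficient rho); the exact variant used in the thesis may differ.

```latex
% Hedged sketch: a standard parameterized achievement scalarizing function for k
% objectives f_1,...,f_k, minimized over the feasible set X.
s\bigl(f(x)\bigr) \;=\; \max_{i=1,\dots,k}\; w_i\bigl(f_i(x) - z_i^{\mathrm{ref}}\bigr)
\;+\; \rho \sum_{i=1}^{k} w_i\bigl(f_i(x) - z_i^{\mathrm{ref}}\bigr),
\qquad \rho > 0 \text{ small}

\min_{x \in X}\; s\bigl(f(x)\bigr)
% The interactive procedure steers the search by updating the weights w_i
% (and/or the reference point) between iterations.
```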
Abstract:
Operating in business-to-business markets requires an in-depth understanding of business networks. Actions and reactions made to compete in markets are fundamentally based on managers’ subjective perceptions of the network. However, the amalgamation of these individual perceptions, termed network pictures, into a common company-level shared understanding of that network, known as network insight, is found to be a substantial challenge for companies. A company’s capability to enhance common network insight is even argued to lead to competitive advantage. Especially companies with value-creating logics that require wide comprehension of, and collaboration in, networks, such as solution business, need to develop advanced network insight. According to the extant literature, dispersed pieces of atomized network pictures can be unified into a common network insight through a process of amalgamation that comprises barriers/drivers of multilateral exchange, manifold rationality, and recursive time. However, the extant body of literature appears to lack an understanding of the role of internal communication in the development of network insight. Nonetheless, the extant understanding of the amalgamation process indicates that internal communication plays a substantial role in the development of company-level network insight. The purpose of the present thesis is to enhance understanding of internal communication in the amalgamation of network pictures to develop network insight in the solution business setting, which was chosen to represent a business-to-business value-creating logic that emphasizes the capability to understand and utilize networks. Thus, in solution business the importance of succeeding in the amalgamation process is expected to be emphasized. The study combines qualitative and quantitative research by means of various analytical methods, including multiple case analysis, simulation, and social network analysis. Approaching the nascent research topic from differing perspectives and with differing means provides a broader insight into the phenomenon. The study provides empirical evidence from Finnish business-to-business companies which operate globally. The empirical data comprise interviews (n=28) with managers of three case companies. In addition, the data include a questionnaire (n=23) collected mainly for the purpose of social network analysis. The thesis also includes a simulation study, carried out by means of agent-based modeling. The findings of the thesis shed light on the role of internal communication in the amalgamation process, contributing to the emergent discussion of network insight and thus to industrial marketing research. In addition, the thesis increases understanding of internal communication in the change process towards solution business, a supplier’s internal communication in its matrix organization structure during a project sales process, the key barriers and drivers that influence internal communication in project sales networks, perceived power within industrial project sales, and the revisioning of network pictures. According to the findings, internal communication plays a substantial role in the amalgamation process. First, it is suggested that internal communication is a basis of multilateral exchange. Second, it is suggested that internal communication intensifies and maintains manifold rationality.
Third, internal communication is needed to explicate the usually differing time perspectives of others, and thus it is suggested that internal communication has a role as the explicator of recursive time. Furthermore, the role of an efficient amalgamation process is found to be emphasized in solution business, as it requires a more advanced network insight for cross-functional collaboration. Finally, the thesis offers several managerial implications for industrial suppliers to enhance the amalgamation process when operating in solution business.
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains are becoming more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process, which is the handling of invoices. Efficient invoice processing can have an impact on an organization’s working capital management and thus provide companies with better readiness to face the challenges related to cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling processes of four different organizations. The invoice data were collected from each organization’s invoice processing system. The sample included all the invoices the organizations had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included looking into the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results of the research indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices. It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automated steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions on how to improve the process included increasing invoice matching, reducing manual steps and leveraging different value-added services such as an invoice validation service, mobile solutions and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
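The lead-time analysis described here can be illustrated with a minimal process-mining-style sketch over an event log; the column names, activities and timestamps below are assumptions for illustration, not data from the research.

```python
# Hedged sketch: compute lead times between consecutive steps of an invoice handling
# process from an event log, then average them per transition. All columns and rows
# are illustrative assumptions.
import pandas as pd

log = pd.DataFrame({
    "invoice_id": [1, 1, 1, 2, 2],
    "activity":   ["Received", "Matched", "Approved", "Received", "Approved"],
    "timestamp":  pd.to_datetime([
        "2012-03-01 09:00", "2012-03-02 10:30", "2012-03-05 08:00",
        "2012-03-01 11:00", "2012-03-09 16:00",
    ]),
}).sort_values(["invoice_id", "timestamp"])

# Pair each event with the next event of the same invoice to form transitions.
log["next_activity"] = log.groupby("invoice_id")["activity"].shift(-1)
log["lead_time"] = log.groupby("invoice_id")["timestamp"].shift(-1) - log["timestamp"]

# Mean lead time per transition highlights the slowest hand-offs in the process.
print(log.dropna().groupby(["activity", "next_activity"])["lead_time"].mean())
```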
Abstract:
Process planning is a very important industrial activity, since it determines how a part or a product is manufactured. Process planning decisions include machine selection, tool selection and the determination of cutting conditions, which makes it a complex activity. In the presence of unstable demand, flexibility has become a very important characteristic of today's successful industries, for which Flexible Manufacturing Systems (FMSs) have been proposed as a solution. However, we believe that FMS control software is not flexible enough to adapt to different manufacturing system conditions with the aim of increasing the system's efficiency. One means of overcoming this limitation is to include pre-planned alternatives in the process plan; planning decisions are then made by the control system in real time to select the most appropriate alternative according to the conditions of the shop floor. Some of the advantages of this approach reported in the literature are the reduction of the number of tool setups and the selection of a replacement machine for executing an operation. To verify whether the presence of alternatives in process plans actually increases the efficiency of the manufacturing system, an investigation was carried out using simulation and design of experiments techniques for alternative plans on a single machine. The proposed methodology and the results are discussed in this paper.