Abstract:
Meeting the needs of both present and future generations forms the foundation of sustainable development. Concern about food demand is increasing alongside the continuously growing population. In the pursuit of food security, preventing food waste is one solution, as it avoids the negative environmental impacts that result from producing food unnecessarily. Packages offer one answer to preventing food waste, as they 1) preserve and protect food, 2) introduce the user to the correct way to handle and use the food and package, and 3) allow the user to consume the food in its entirety. This thesis aims to enhance the sustainability of food packages, with special emphasis on preventing food waste. The focus of this thesis is to assist the packaging designer in taking into account the requirements for the sustainability of food packages and in integrating these requirements into the product development process. In addition, life cycle methods that can be used as tools in the packaging design process or in assessing the sustainability of finished food-packaging combinations are evaluated. The methods of life cycle costing (LCC) and life cycle working environment (LCWE) are briefly discussed. The method of life cycle assessment (LCA) is examined more thoroughly through a literature review of food-package LCA case studies published in the 21st century in three relevant journals. Based on this review and on experience gained from conducting LCAs, recommendations are given as to how the LCA practitioner should conduct a food-packaging study to make the most of the results. Two case studies are presented in this thesis. The first relates the results of a life cycle assessment conducted for three food items (cold cut (ham), sliced dark bread (rye) and Soygurt drink) and the alternative packaging options of each. Results of this study show that the packaging constitutes only 1–12 % of the total environmental impacts of the food-packaging combination. The greatest effect derives from the food itself and from the wasted food: even a small percentage of wasted food causes more environmental impacts than the packaging does. The second case study presents the results of an LCC and LCWE analysis of fruit and vegetable transport packages. Here the specific results of the study itself are not the focus; rather, the study methods and scope are analysed in terms of how they complement the sustainability assessment of food packages. This thesis presents reasons why prevention of food waste should be more thoroughly taken into account in food packaging design. In addition, the task of the packaging designer is facilitated by the requirements of sustainable food packaging, by methods and step-by-step guidance on how to integrate sustainability issues into the design process, and by recommendations on how to assess the sustainability of food packages. The intention of this thesis is to articulate the issues that are important in the food packaging industry. Having recognised and implemented these issues, businesses can better manage the risks that could follow from neglecting these sustainability aspects.
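A short worked example clarifies why even a small share of wasted food can outweigh the packaging's entire impact. The figures below are illustrative assumptions, not results from the thesis:

```latex
% Illustrative arithmetic (assumed shares, not the thesis's data).
% Suppose packaging accounts for 5 % of the combination's impact,
% so producing the food accounts for the remaining 95 %:
\[
E_{\text{total}} = E_{\text{food}} + E_{\text{pack}},
\qquad \frac{E_{\text{pack}}}{E_{\text{total}}} = 0.05
\]
% If a fraction w of the food is wasted, the impact embodied in the
% wasted food is w \, E_{\text{food}} = 0.95\, w\, E_{\text{total}}.
% Already at w = 6 % of the food wasted:
\[
0.95 \times 0.06 \approx 0.057 > 0.05
\]
% the wasted food alone exceeds the full impact of the packaging.
```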
Abstract:
The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming an ever more competitive and efficient service provider while still being able to provide unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore an emerging phenomenon in the service context is information awareness. Terms like big data and the Internet of things are not only modern buzzwords; they also describe urgent requirements for a new type of competences and solutions. When the amount of information increases and the systems processing information become more efficient and intelligent, it is the human understanding and objectives that may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: What kind of information is created, possessed and utilized in the service context, and even more importantly, what information exists but is not acknowledged or used? In this dissertation the focus is on the relationship between service design and service operations. Reframing this relationship refers to viewing the service system from the architectural perspective. The selected perspective allows analysing the relationship between design activities and operational activities as an information system while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations, and the testing of the design artifact components in real service contexts. The relationship between design and operations is analysed in health care and social care service systems. Existing contributions in service research tend to abstract services and service systems as value creation, working or interactive systems. This dissertation adds an important information processing system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than the existing scientific and managerial models tend to assume. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unknown managerial task. On the architectural level, service-system-specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also provide very practical implications for service providers. Personal, visual and hidden activities of service, and more importantly all changes that take place in any service system, also have an information dimension. Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to improve activities and to offer a new type of service potential for customers.
Abstract:
Advancements in IC processing technology have led to the innovation and growth happening in the consumer electronics sector and to the evolution of the IT infrastructure supporting this exponential growth. One of the most difficult obstacles to this growth is the removal of the large amount of heat generated by the processing and communicating nodes of the system. The scaling down of technology and the increase in power density have a direct and consequential effect on the rise in temperature. This has increased cooling budgets and affects both the lifetime reliability and the performance of the system. Hence, reducing on-chip temperatures has become a major design concern for modern microprocessors. This dissertation addresses the thermal challenges at different levels for both 2D planar and 3D stacked systems. It proposes a self-timed thermal monitoring strategy based on the liberal use of on-chip thermal sensors, making use of noise-variation-tolerant and leakage-current-based thermal sensing for monitoring purposes. In order to study thermal management issues from the early design stages, accurate thermal modeling and analysis at design time is essential. In this regard, the spatial temperature profile of the global Cu nanowire for on-chip interconnects has been analyzed. The dissertation presents a 3D thermal model of a multicore system in order to investigate the effects of hotspots and of the placement of silicon die layers on the thermal performance of a modern flip-chip package. For a 3D stacked system, the primary design goal is to maximise performance within the given power and thermal envelopes. Hence, a thermally efficient routing strategy for 3D NoC-Bus hybrid architectures has been proposed to mitigate on-chip temperatures by herding most of the switching activity to the die closest to the heat sink. Finally, an exploration of various thermal-aware placement approaches for both 2D and 3D stacked systems is presented. Various thermal models have been developed and thermal control metrics have been extracted. An efficient thermal-aware application mapping algorithm for a 2D NoC is presented. It is shown that the proposed mapping algorithm reduces the effective area subjected to high temperatures when compared to the state of the art.
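The dissertation's mapping algorithm itself is not reproduced in the abstract; the sketch below only illustrates the general idea of thermal-aware mapping, greedily placing the most active tasks on the die layer closest to the heat sink. Task activities, layer capacities and the layer ordering are assumptions of this example:

```python
# Illustrative sketch of a greedy thermal-aware mapping heuristic.
# This is NOT the dissertation's algorithm; task activities, layer
# capacities and the cost model are invented for the example.

def thermal_aware_map(task_activity, layer_capacity):
    """Assign tasks to die layers, hottest tasks nearest the heat sink.

    task_activity:  dict task_id -> switching activity (higher = hotter)
    layer_capacity: list of free slots per layer; index 0 is the layer
                    closest to the heat sink (best cooled)
    Returns dict task_id -> layer index.
    """
    mapping = {}
    free = list(layer_capacity)
    # Place the most active (hottest) tasks first ...
    for task in sorted(task_activity, key=task_activity.get, reverse=True):
        # ... on the best-cooled layer that still has room.
        layer = next(i for i, slots in enumerate(free) if slots > 0)
        mapping[task] = layer
        free[layer] -= 1
    return mapping

# Example: 6 tasks on a 3-layer stack with 2 cores per layer.
activity = {"t0": 0.9, "t1": 0.2, "t2": 0.7, "t3": 0.4, "t4": 0.8, "t5": 0.1}
print(thermal_aware_map(activity, [2, 2, 2]))
# Hottest tasks t0 and t4 land on layer 0, next to the heat sink.
```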
Abstract:
The Swedish public health care organisation could very well be undergoing its most significant change since its specialisation during the late 19th and early 20th century. At the heart of this change is a move from manual patient journals to electronic health records (EHR). EHR are complex, integrated, organisation-wide information systems (IS) that promise great benefits and value while also presenting great challenges to the organisation. The Swedish public health care is not the first organisation to implement integrated IS, and it is by no means alone in its quest for realising the potential benefits and value that they have to offer. As organisations invest in IS they embark on a journey of value creation and capture: a journey where a cost-based approach towards IS investments is replaced with a value-centric focus, and where the main challenges lie in the practical day-to-day task of finding ways to intertwine technology, people and business processes. This has, however, proven to be a problematic task. The problematic situation arises from a shift of perspective regarding how to manage IS in order to gain value: a shift from technology delivery to benefits delivery, from an IS implementation plan to a change management plan. The shift gives rise to challenges related to the intangibility of IS benefits and the elusiveness of value. As a response to these challenges the field of IS benefits management has emerged, offering a framework and a process for better understanding and formalising benefits realisation activities. In this thesis the benefits realisation efforts of three Swedish hospitals within the same county council are studied. The thesis focuses on the participants of benefits analysis projects: their perceptions, judgments, negotiations and descriptions of potential benefits. The purpose is to address the process whereby organisations seek to identify which potential IS benefits to pursue and realise, in order to better understand what affects the process, so that realisation actions for potential IS benefits can be supported. A qualitative case study research design is adopted and provides a framework for sample selection, data collection and data analysis. It also provides a framework for discussions of validity, reliability and generalisability. The findings displayed a benefits fluctuation, which showed that participants' perception of what constituted potential benefits and value changed throughout the formal benefits management process. Issues like structure, knowledge, expectation and experience affected perception differently, and this in the end changed the amount and composition of potential benefits and value. Five dimensions of benefits judgment were identified and used by participants when finding accommodations of potential benefits and value to pursue. The identified dimensions affected participants' perceptions, which in turn affected the amount and composition of potential benefits. During the formal benefits management process participants shifted between judgment dimensions; these movements emerged through debates and interactions between participants. Judgments based on what was perceived as expected due to one's role, and on what was perceived as best for the organisation as a whole, were the two dominant benefits judgment dimensions. A benefits negotiation was identified. Negotiations were divided into two main categories, rational and irrational, depending on participants' drive when initiating and participating in negotiations. In each category three different types of negotiations were identified, having different characteristics and generating different outcomes. A benefits negotiation process was also identified that displayed management challenges corresponding to its five phases. A discrepancy was also found between how IS benefits are spoken of and how actions of IS benefits realisation are understood: a discrepancy between an evaluation focus and a realisation focus towards IS value creation. An evaluation focus describes IS benefits as well-defined and measurable effects, whereas a realisation focus speaks of establishing and managing an ongoing place of value creation. The notion of a valuescape was introduced in order to describe and support the understanding of IS value creation. The valuescape corresponds to a realisation focus and outlines a value configuration consisting of activities, logic, structure, drivers and the role of IS.
Abstract:
The Laboratory of Intelligent Machines researches and develops energy-efficient power transmissions and automation for mobile construction machines and industrial processes. The laboratory's particular areas of expertise include mechatronic machine design using virtual technologies and simulators, and demanding industrial robotics. The laboratory has collaborated extensively with industrial actors and has participated in significant international research projects, particularly in the field of robotics. For years, dSPACE tools were the only hardware used in the lab to develop different real-time control algorithms. dSPACE's hardware systems are in widespread use in the automotive industry and are also employed in drives, aerospace, and industrial automation. But new competitors are developing sophisticated systems whose features convinced the laboratory to test new products. One of these competitors is National Instruments (NI). In order to get to know the specifications and capabilities of NI tools, an agreement was made to test an NI evaluation system. This system is used to control a 1-D hydraulic slider. The objective of this research project is to develop a control scheme for the teleoperation of a hydraulically driven manipulator, to implement a control algorithm covering both human-machine interaction and machine-task environment interaction on the NI and dSPACE systems simultaneously, and to compare the results.
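The abstract does not give the control scheme itself; as a minimal sketch of the kind of closed-loop position control a 1-D hydraulic slider involves, a saturated proportional controller might look as follows. Gains, limits and the I/O stubs are assumptions, not the thesis's NI or dSPACE implementation:

```python
# Minimal sketch of a proportional position controller for a 1-D slider.
# Gains, limits and the read/write interface are assumptions for
# illustration, not the thesis's actual NI/dSPACE implementation.
import time

KP = 4.0          # proportional gain (assumed)
U_MAX = 10.0      # valve command saturation, e.g. +/-10 V (assumed)
DT = 0.01         # 100 Hz control loop (assumed)

def read_position():
    """Stand-in for the slider's position sensor."""
    return 0.0  # replace with real sensor I/O

def write_valve(u):
    """Stand-in for the proportional valve command output."""
    pass        # replace with real actuator I/O

def control_loop(setpoint, steps=200):
    for _ in range(steps):
        error = setpoint - read_position()       # master-slave tracking error
        u = max(-U_MAX, min(U_MAX, KP * error))  # saturate the command
        write_valve(u)
        time.sleep(DT)

control_loop(setpoint=0.25)  # drive the slider toward 0.25 m (assumed units)
```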
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behaviour of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfil the service goals. Designing and developing such services for advanced scenarios with REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behaviour. Systems that can be trusted for their behaviour can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology for designing behavioural REST web service interfaces and their compositions. The behavioural interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioural models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; this is required for authenticating service requests by authorised actors, since not all types of users have access to all resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioural REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the web ontology language OWL 2, which can be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners. The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata of the UML-based service design models are generated with our transformation tool and verified for basic characteristics like deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON the service implementation is validated at runtime against its specification. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioural REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The pre-conditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the post-conditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
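The generated skeletons themselves are derived from the UML models; purely as an illustration of the idea, a REST handler with a pre-condition guarding the stateful invocation and a post-condition constraining the implemented functionality could look like the following sketch. The booking resource, its states and the checks are invented here:

```python
# Illustrative sketch of a REST handler skeleton with method pre- and
# post-conditions, in the spirit of the generated code skeletons.
# The booking resource, its states and the checks are invented here.

class PreconditionFailed(Exception):
    pass

class Booking:
    def __init__(self):
        self.state = "available"   # simple behavioural state

    def put_reservation(self, guest):
        # Pre-condition: the stateful service may only be invoked
        # in the right state (room not already reserved).
        if self.state != "available":
            raise PreconditionFailed("room is not available")  # HTTP 412

        # --- developer fills in the actual functionality here ---
        self.guest = guest
        self.state = "reserved"
        # ---------------------------------------------------------

        # Post-condition: constrains the developer to implement
        # the right effect before the response is returned.
        assert self.state == "reserved" and self.guest == guest
        return {"status": 201, "state": self.state}

booking = Booking()
print(booking.put_reservation("alice"))   # ok: available -> reserved
# booking.put_reservation("bob")          # would raise PreconditionFailed
```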
Abstract:
This study concentrates on Product Data Management (PDM) systems and on sheet metal design features and classification. In this thesis, PDM is seen as an individual system which handles all product-related data and information. The purpose of relevant data is to take the manufacturing process further with fewer errors. The features of sheet metals give more information and value to the designed models. The possibility of implementing PDM together with sheet metal feature recognition is the core of this study; their integration should make the design process faster and manufacturing-friendly products easier to design. The triangulation method is the basis for this research. The sections of this triangle are: a scientific literature review, interviews using the Delphi method, and the author's experience and observations. The main findings of this study are: (1) the area of focus in the triangle (the triangle of three different points of view: business, information exchange and technical) depends on the person's background and role in the company; (2) the classification in the PDM system (and also in the CAD system) should be done using the materials, tools and machines that are in use in the company (see the sketch below); (3) the design process has to be made more effective because of the increase in industrial production, in sheet metal blank production, and in the designer's time spent on actual design; and (4) because Design For Manufacture (DFM) integration can be done with CAD programs, DFM integration with the PDM system should also be possible.
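As a small illustration of finding (2), a classification record keyed to the company's own materials, tools and machines might be modelled as below; all catalogue values are invented for the sketch:

```python
# Sketch of a PDM classification record per finding (2): classify sheet
# metal parts by the materials, tools and machines the company actually
# uses. All catalogue values below are invented for illustration.
from dataclasses import dataclass, field

MATERIALS = {"DC01", "S355", "AISI304"}       # company's stock materials
MACHINES = {"laser_cutter", "press_brake"}    # machines on the shop floor

@dataclass
class SheetMetalPart:
    part_id: str
    material: str
    thickness_mm: float
    features: list = field(default_factory=list)  # e.g. "bend", "hole"
    machines: list = field(default_factory=list)

    def validate(self):
        """Reject classifications the company cannot manufacture."""
        if self.material not in MATERIALS:
            raise ValueError(f"{self.material} is not a stocked material")
        for m in self.machines:
            if m not in MACHINES:
                raise ValueError(f"no machine {m} in this factory")

part = SheetMetalPart("BRKT-001", "S355", 2.0,
                      features=["bend", "hole"],
                      machines=["laser_cutter", "press_brake"])
part.validate()   # passes: everything matches the company's catalogue
```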
Abstract:
Cotton is highly susceptible to interference from the weed community, so it is essential to adopt control measures that ensure crop yield. Herbicides are the primary method of weed control in large-scale production areas, and usually more than one herbicide application is necessary due to the extensive crop cycle. This study aimed to evaluate the selectivity of different chemical weed control systems for conventional cotton. The experiment took place in the field in a randomized block design, with twenty-nine treatments and four replications in a split-plot layout (adjacent double check). Results showed that triple mixtures in pre-emergence increased the chance of observing reductions in cotton yield. To avoid reductions in crop yield, users should apply at most a mixture of two herbicides in pre-emergence, followed by S-metolachlor over the top, followed by one post-emergence mixture application of pyrithiobac-sodium + trifloxysulfuron-sodium.
Influence of surface functionalization on the behavior of silica nanoparticles in biological systems
Abstract:
Personalized nanomedicine has been shown to provide advantages over traditional clinical imaging, diagnosis, and conventional medical treatment. Using nanoparticles can enhance and sharpen clinical targeting and imaging, leading agents exactly to the place in the body that is the goal of treatment. At the same time, one can reduce the side effects that usually occur in the parts of the body that are not targets for treatment. Nanoparticles are of a size that can penetrate into cells. Their surface functionalization offers a way to increase their sensitivity when detecting target molecules. In addition, it increases the potential for flexibility in particle design, in their therapeutic function, and in variation possibilities for diagnostics. Mesoporous nanoparticles of amorphous silica have attractive physical and chemical characteristics, such as particle morphology, controllable pore size, and high surface area and pore volume. Additionally, the surface functionalization of silica nanoparticles is relatively straightforward, which enables optimization of the interaction between the particles and the biological system. The main goal of this study was to prepare traceable and targetable silica nanoparticles for medical applications, with a special focus on particle dispersion stability, biocompatibility, and targeting capability. Nanoparticle properties are highly particle-size dependent, and good dispersion stability is a prerequisite for active therapeutic and diagnostic agents. The study showed that traceable streptavidin-conjugated silica nanoparticles exhibiting good dispersibility could be obtained by a suitable choice of surface functionalization route. Theranostic nanoparticles should exhibit sufficient hydrolytic stability to effectively carry the medicine to the target cells, after which they should disintegrate and dissolve. Furthermore, the surface groups should stay at the particle surface until the particle has been internalized by the cell, in order to optimize cell specificity. Model particles with fluorescently labeled regions were tested in vitro using light microscopy and image processing technology, which allowed a detailed study of the disintegration and dissolution process. The study showed that nanoparticles degrade more slowly outside the cell than inside it. The main advantage of theranostic agents is their successful targeting in vitro and in vivo. Non-porous nanoparticles using monoclonal antibodies as guiding ligands were tested in vitro in order to follow their targeting ability and internalization. In addition to the successful targeting, a specific internalization route for the particles could be detected. In the last part of the study, the objective was to clarify the feasibility of traceable mesoporous silica nanoparticles, loaded with a hydrophobic cancer drug, for targeted drug delivery in vitro and in vivo. The particles were provided with a small-molecule targeting ligand. A significantly higher therapeutic effect could be achieved with the nanoparticles than with the free drug. The nanoparticles were biocompatible and stayed in the tumor longer than the free drug did, before being eliminated by renal excretion. Overall, the results showed that mesoporous silica nanoparticles are biocompatible, biodegradable drug carriers and that cell specificity can be achieved both in vitro and in vivo.
Abstract:
Technological innovations, the development of the internet, and globalization have increased the number and complexity of web applications. As a result, keeping web user interfaces understandable and usable (in terms of ease of use, effectiveness, and satisfaction) is a challenge. As part of this, designing user-intuitive interface signs (i.e., the small elements of a web user interface, e.g., navigational links, command buttons, icons, small images, thumbnails, etc.) is an issue for designers. Interface signs are key elements of web user interfaces because they act as communication artefacts to convey web content and system functionality, and because users interact with systems by means of interface signs. In the light of the above, applying semiotic concepts (semiotics being the study of signs) to web interface signs will contribute to discovering new and important perspectives on web user interface design and evaluation. The thesis mainly focuses on web interface signs and uses the theory of semiotics as a background theory. The underlying aim of this thesis is to provide valuable insights for designing and evaluating web user interfaces from a semiotic perspective in order to improve overall web usability. The fundamental research question is formulated as: What do practitioners and researchers need to be aware of from a semiotic perspective when designing or evaluating web user interfaces to improve web usability? From a methodological perspective, the thesis follows a design science research (DSR) approach. A systematic literature review and six empirical studies are carried out in this thesis. The empirical studies are carried out with a total of 74 participants in Finland. The steps of the design science research process are followed while the studies are designed and conducted; these include (a) problem identification and motivation, (b) definition of the objectives of a solution, (c) design and development, (d) demonstration, (e) evaluation, and (f) communication. The data is collected using observations in a usability testing lab, by analytical (expert) inspection, with questionnaires, and in structured and semi-structured interviews. User behaviour analysis, qualitative analysis and statistics are used to analyse the study data. The results are summarized as follows and have led to the following contributions. Firstly, the results present the current status of semiotic research in UI design and evaluation and highlight the importance of considering semiotic concepts in UI design and evaluation. Secondly, the thesis explores interface sign ontologies (i.e., the sets of concepts and skills that a user should know to interpret the meaning of interface signs) by providing a set of ontologies used to interpret the meaning of interface signs, and a set of features related to ontology mapping in interpreting the meaning of interface signs. Thirdly, the thesis explores the value of integrating semiotic concepts in usability testing. Fourthly, the thesis proposes a semiotic framework (Semiotic Interface sign Design and Evaluation, SIDE) for interface sign design and evaluation in order to make signs intuitive for end users and to improve web usability. The SIDE framework includes a set of determinants and attributes of user-intuitive interface signs, and a set of semiotic heuristics for designing and evaluating interface signs. Finally, the thesis assesses (a) the quality of the SIDE framework in terms of performance metrics (e.g., thoroughness, validity, effectiveness, reliability, etc.) and (b) the contributions of the SIDE framework from the evaluators' perspective.
Abstract:
Liposomes (lipid-based vesicles) have been widely studied as drug delivery systems due to their relative safety, their structural versatility concerning size, composition and bilayer fluidity, and their ability to incorporate almost any molecule regardless of its structure. Liposomes are successful in inducing potent in vivo immunity to incorporated antigens and are now being employed in numerous immunization procedures. This is a brief overview of the structural, biophysical and pharmacological properties of liposomes and of the current strategies in the design of liposomes as vaccine delivery systems.
Abstract:
Preparative liquid chromatography is one of the most selective separation techniques in the fine chemical, pharmaceutical, and food industries. Several process concepts have been developed and applied to improve the performance of classical batch chromatography. The most powerful approaches include various single-column recycling schemes, counter-current and cross-current multi-column setups, and hybrid processes in which chromatography is coupled with other unit operations such as crystallization, a chemical reactor, and/or a solvent removal unit. To fully utilize the potential of stand-alone and integrated chromatographic processes, efficient methods for selecting the best process alternative and optimal operating conditions are needed. In this thesis, a unified method is developed for the analysis and design of the following single-column fixed bed processes and the corresponding cross-current schemes: (1) batch chromatography, (2) batch chromatography with an integrated solvent removal unit, (3) mixed-recycle steady state recycling chromatography (SSR), and (4) mixed-recycle steady state recycling chromatography with solvent removal from the fresh feed, the recycle fraction, or the column feed (SSR–SR). The method is based on the equilibrium theory of chromatography, with the assumption of negligible mass transfer resistance and axial dispersion. The design criteria are given in a general, dimensionless form that is formally analogous to that applied widely in the so-called triangle theory of counter-current multi-column chromatography. Analytical design equations are derived for binary systems that follow the competitive Langmuir adsorption isotherm model. For this purpose, the existing analytic solution of the ideal model of chromatography for binary Langmuir mixtures is completed by deriving the missing explicit equations for the height and location of the pure first-component shock in the case of a small feed pulse. It is thus shown that the entire chromatographic cycle at the column outlet can be expressed in closed form. The developed design method allows predicting the feasible range of operating parameters that lead to the desired product purities. It can be applied for calculating first estimates of optimal operating conditions, for analysing process robustness, and for the early-stage evaluation of different process alternatives. The design method is used to analyse the possibility of enhancing the performance of conventional SSR chromatography by integrating it with a solvent removal unit. It is shown that the amount of fresh feed processed during a chromatographic cycle, and thus the productivity of the SSR process, can be improved by removing solvent. The maximum solvent removal capacity depends on the location of the solvent removal unit and on the physical solvent removal constraints, such as solubility, viscosity, and/or osmotic pressure limits. Usually, the most flexible option is to remove solvent from the column feed. The applicability of the equilibrium design to real, non-ideal separation problems is evaluated by means of numerical simulations. Due to the assumption of infinite column efficiency, the developed design method is most applicable to high-performance systems where thermodynamic effects are predominant, while significant deviations are observed under highly non-ideal conditions. The findings based on equilibrium theory are applied to develop a shortcut approach for the design of chromatographic separation processes under strongly non-ideal conditions with significant dispersive effects. The method is based on a simple procedure applied to a single conventional chromatogram. The applicability of the approach to the design of batch and counter-current simulated moving bed processes is evaluated with case studies. It is shown that the shortcut approach works better the higher the column efficiency and the lower the purity constraints.
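For reference, the competitive Langmuir adsorption isotherm underlying the analytical design equations has the standard binary form below; the notation is assumed here and the thesis may use different symbols:

```latex
% Competitive Langmuir isotherm for a binary mixture (i = 1, 2):
% q_i = adsorbed-phase concentration, c_i = fluid-phase concentration,
% a_i = Henry constant, b_i = equilibrium parameter of component i.
\[
q_i = \frac{a_i \, c_i}{1 + b_1 c_1 + b_2 c_2}, \qquad i = 1, 2
\]
```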
Abstract:
Environmental threats are growing and have become global issues. People around the world try to face these issues by two means: remediating environments that are already affected and protecting those that are not. This thesis describes the design, implementation, and evaluation of an online water quality monitoring system in Lake Saimaa, Finland. The water quality in Lake Saimaa needs to be monitored in order to provide responsible bodies with valuable information which allows them to act quickly to prevent any negative impact on the lake's environment. The objectives were to design a suitable system, implement the system in Lake Saimaa, and then evaluate the applicability and reliability of such systems for this environment. The needs for the system were first identified; then the design, the needed modifications, and the construction of the system took place. After that, the system was tested in Lake Saimaa at two locations near the city of Mikkeli. The last step was to evaluate the whole system. The main results were that applying online water quality monitoring systems in Lake Saimaa offers many advantages, such as reducing the required manpower, time, and running costs. However, the unreliability of the exact measured values of some parameters remains the drawback of such systems; this can be remedied by using more advanced equipment with more sophisticated features designed specifically for monitoring at the predefined location.
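As a minimal sketch of what the core of such an online monitoring station does, the loop below reads sensors, checks them against limits and reports alerts. The parameters, thresholds and I/O stubs are assumptions, not the thesis's implementation:

```python
# Minimal sketch of an online water-quality monitoring loop.
# Sensor interface, parameters and alert thresholds are assumed
# for illustration; they are not the thesis's actual configuration.
import time

# Acceptable ranges per parameter (assumed values).
LIMITS = {"pH": (6.0, 9.0), "temperature_C": (0.0, 25.0),
          "dissolved_oxygen_mgL": (5.0, 15.0)}

def read_sensors():
    """Stand-in for the station's sensor readout."""
    return {"pH": 7.1, "temperature_C": 12.4, "dissolved_oxygen_mgL": 9.8}

def report(sample, alerts):
    """Stand-in for transmitting data to the responsible bodies."""
    print(sample, "ALERTS:" if alerts else "ok", *alerts)

def monitor(interval_s=600, cycles=3):
    for _ in range(cycles):
        sample = read_sensors()
        alerts = [p for p, (lo, hi) in LIMITS.items()
                  if not lo <= sample[p] <= hi]
        report(sample, alerts)      # alert fast so bodies can act quickly
        time.sleep(interval_s)

monitor(interval_s=1, cycles=2)     # short interval just for the demo
```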
Abstract:
Nowadays, computer-based systems tend to become more complex and to control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment; therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, industrial standards prescribe the use of rigorous techniques for the development and verification of such systems: the more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on the system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements must be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for integrating formal verification results into safety cases. This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates the elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
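For orientation, in Event-B an event with guard G acting on variables v must preserve every model invariant I; the standard invariant-preservation proof obligation, written here in generic notation rather than for any of the thesis's case studies, is:

```latex
% Invariant preservation (INV) proof obligation in Event-B:
% for an event with guard G(v) and before-after predicate BA(v, v'),
% every invariant I must still hold in the after-state v'.
\[
I(v) \;\wedge\; G(v) \;\wedge\; \mathit{BA}(v, v') \;\Rightarrow\; I(v')
\]
```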
Abstract:
In this work the separation of multicomponent mixtures in counter-current columns with supercritical carbon dioxide has been investigated using a process design methodology. First the separation task is defined; then phase equilibrium experiments are carried out, and the data obtained are correlated with thermodynamic models or empirical functions. Mutual solubilities, K_i values, and separation factors α_ij are determined. Based on these data, possible operating conditions for further extraction experiments can be determined. A separation analysis using graphical methods is performed to optimize the process parameters. Hydrodynamic experiments are carried out to determine the flow capacity diagram. Laboratory-scale extraction experiments are planned and carried out in order to determine HETP values, to validate the simulation results, and to provide new materials for the additional phase equilibrium experiments needed to determine the dependence of the separation factors on concentration. Numerical simulation of the separation process and auxiliary systems is carried out to optimize the number of stages, the solvent-to-feed ratio, product purity, yield, and energy consumption. Scale-up and cost analysis close the process design. The separation of palmitic acid and (oleic + linoleic) acids from PFAD (Palm Fatty Acids Distillate) was used as a case study.
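For clarity, the distribution coefficients and separation factors mentioned above have the standard definitions below; the phase labels are assumptions of this note:

```latex
% Distribution coefficient of component i and separation factor of the
% pair (i, j); y_i and x_i are the equilibrium fractions of component i
% in the extract (CO2-rich) and raffinate phases, respectively.
\[
K_i = \frac{y_i}{x_i}, \qquad
\alpha_{ij} = \frac{K_i}{K_j}
\]
```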