983 results for DESIGN BASIS ACCIDENTS
Abstract:
The purpose of this thesis is to analyse activity-based costing (ABC) and possible modified versions of it in the engineering design context. Design engineers need cost information at their decision-making level, and the cost information should also have a strong future orientation. These demands are high because traditional management accounting has concentrated on the direct actual costs of products. However, cost accounting has progressed: ABC was introduced in the late 1980s and adopted widely by companies in the 1990s. ABC has been a success, but it has also attracted criticism. In some cases ambitious ABC systems have become too complex to build, use and update. This study can be called an action-oriented case study with some normative features. In this thesis theoretical concepts are assessed and allowed to unfold gradually through interaction with data from three cases. The theoretical starting points are ABC and the theory of the engineering design process (chapter 2). Concepts and research results from these theoretical approaches are summarized in two hypotheses (chapter 2.3). The hypotheses are analysed with two cases (chapter 3). After the two case analyses, the ABC part is extended to cover other modern cost accounting methods, e.g. process costing and feature costing (chapter 4.1). The ideas from this second theoretical part are operationalized with the third case (chapter 4.2). The knowledge from the theory and the three cases is summarized in the created framework (chapter 4.3). With the created framework it is possible to analyse ABC and its modifications in the engineering design context. The framework collects the factors that guide the choice of the costing method to be used in engineering design. It also illuminates the contents of various ABC-related costing methods. However, the framework needs to be tested further.
On the basis of the three cases it can be said that ABC should be used cautiously when formulating cost information for engineering design. It is suitable when manufacturing can be considered simple, when the design engineers are not cost conscious, and in the beginning of the design process when doing adaptive or variant design. If the design engineers need cost information for embodiment or detailed design, if manufacturing can be considered complex, or when the design engineers are cost conscious, ABC must always be evaluated critically.
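As a concrete illustration of the costing logic discussed above, the sketch below shows how ABC traces overhead to a product variant through activity cost drivers rather than a single volume-based rate. All activity names, cost pools and driver quantities are hypothetical, not taken from the thesis cases.

```python
# Minimal activity-based costing sketch: overhead is assigned to products
# through activity cost drivers. All figures below are hypothetical.

activity_pools = {          # total overhead cost per activity (EUR)
    "machine setups": 20_000.0,
    "quality inspections": 8_000.0,
    "material handling": 12_000.0,
}
driver_volumes = {          # total driver quantity per activity
    "machine setups": 100,        # number of setups
    "quality inspections": 400,   # number of inspections
    "material handling": 600,     # number of material moves
}

# Driver rate = activity cost pool / total driver quantity
rates = {a: activity_pools[a] / driver_volumes[a] for a in activity_pools}

def product_overhead(driver_usage):
    """Overhead assigned to one product from its driver consumption."""
    return sum(rates[a] * q for a, q in driver_usage.items())

# A hypothetical product variant and the drivers it consumes
variant_a = {"machine setups": 4, "quality inspections": 10, "material handling": 25}
print(rates["machine setups"])       # 200.0 EUR per setup
print(product_overhead(variant_a))   # 4*200 + 10*20 + 25*20 = 1500.0
```

The point relevant to design work is visible even in this toy example: a variant that triggers many setups or inspections carries more overhead than a volume-based rate would reveal.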
Abstract:
It is generally accepted that between 70 and 80% of manufacturing costs can be attributed to design. Nevertheless, it is difficult for the designer to estimate manufacturing costs accurately, especially when alternative constructions are compared at the conceptual design phase, because of the lack of cost information and appropriate tools. In general, previous reports concerning the optimisation of welded structures have used the mass of the product as the basis for the cost comparison. However, it can easily be shown with a simple example that the use of product mass as the sole manufacturing cost estimator is unsatisfactory. This study describes a method of formulating welding time models for cost calculation, and presents the results of the models for particular sections, based on typical costs in Finland. This was achieved by collecting information concerning welded products from different companies. The data included 71 different welded assemblies taken from the mechanical engineering and construction industries. The welded assemblies contained in total 1 589 welded parts, 4 257 separate welds, and a total welded length of 3 188 metres. The data were modelled for statistical calculations, and models of welding time were derived by using linear regression analysis. The models were tested by using appropriate statistical methods, and were found to be accurate. General welding time models have been developed, valid for welding in Finland, as well as specific, more accurate models for particular companies. The models are presented in such a form that they can be used easily by a designer, enabling the cost calculation to be automated.
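The modelling approach described above can be illustrated with a minimal sketch: fitting welding time to welded length by ordinary least squares. The data points and resulting coefficients are synthetic; the thesis's own models, derived from the 71 assemblies, are not reproduced here.

```python
# Sketch of deriving a simple welding-time model by least-squares linear
# regression. The data points are synthetic illustration, not thesis data.

# (weld length in metres, measured welding time in minutes)
data = [(0.5, 2.3), (1.0, 4.1), (1.5, 6.2), (2.0, 7.9),
        (2.5, 10.3), (3.0, 12.0), (3.5, 14.2), (4.0, 16.1)]

n = len(data)
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
sxy = sum(x * y for x, y in data)

# Ordinary least-squares slope and intercept for t = b0 + b1 * length
b1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b0 = (sy - b1 * sx) / n

def welding_time(length_m):
    """Predicted welding time (min) for a given welded length."""
    return b0 + b1 * length_m

print(round(b1, 2), round(b0, 2))   # slope (min per metre) and intercept
print(round(welding_time(2.2), 1))  # predicted time for a 2.2 m weld
```

In a designer-facing tool the same coefficients would simply be multiplied by the weld lengths read from the model, which is what makes automated cost calculation straightforward once the regression is in place.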
Abstract:
Electronic commerce and banking services have raised a question that is highly critical for business continuity: how can these services be protected against organised crime and various forms of abuse?
Abstract:
This thesis examined object-oriented design patterns in the EPOC operating system. Both general design patterns and the object structures found in the EPOC environment were studied, together with the requirements they place on applications and the benefits gained from using them. As part of the work, an EPOC software design was carried out applying design patterns and principles. Object-oriented design patterns have become considerably more common in recent years. Design patterns build on both general and environment-specific design principles and rules. They are part of a larger family of models that covers process, analysis, architecture and other models. Object-oriented design patterns speed up and simplify design and improve reusability at a higher level of abstraction. EPOC is among the most common operating systems for future mobile devices. EPOC is object-based throughout and contains numerous object structures, the understanding of which is vital for application development. Because the environments in which EPOC is used are usually resource-constrained, general design patterns must be applied with care: EPOC requires modifications to some general patterns and rules out the use of others entirely.
Abstract:
The purpose of this work is to determine how operational knowledge is utilised in process design. The goal is to find ways to improve the management of operational knowledge during the design process and to determine whether this affects the quality of process design. The quality of process design is assessed with seven criteria: investment costs, operating costs, safety, environmental impact, operability, innovativeness and schedule. The design process is divided into three phases: conceptual design, basic design and detailed design. Process design, the investment project, the quality criteria of process design, the phases of the design process and the classification of operational knowledge are reviewed in general terms. The utilisation of operational knowledge was studied at Kemira. First, general statements about the utilisation of operational knowledge were formulated on the basis of interviews with experts from various fields outside Kemira. Kemira's process designers then evaluated the statements. Based on these evaluations, conclusions were drawn about the utilisation of operational knowledge in process design in general. Next, people involved in two case projects of different types were interviewed, and the general statements were adapted to these projects. The people involved in the projects evaluated the statements, and the projects were compared on the basis of these evaluations. Finally, conclusions based on the evaluations of all the statements are presented. It can be concluded that operational knowledge can be utilised in all phases of design, but the greatest benefit is obtained in basic and detailed design. Operational knowledge can influence some quality criteria of process design, such as operability and safety, more than others. Kemira is recommended to develop its current knowledge management practices so that the availability and transfer of operational knowledge improve.
Abstract:
This thesis considers aspects related to the design and standardisation of transmission systems for wireless broadcasting, comprising terrestrial and mobile reception. The purpose is to identify which factors influence the technical decisions and which issues could be better considered in the design process in order to assess different use cases, service scenarios and end-user quality. Further, the necessity of cross-layer optimisation for efficient data transmission is emphasised and means to take this into consideration are suggested. The work is mainly related to terrestrial and mobile digital video broadcasting systems, but many of the findings can also be generalised to other transmission systems and design processes. The work has led to three main conclusions. First, it is discovered that there are no sufficiently accurate error criteria for measuring the subjectively perceived audiovisual quality that could be utilised in transmission system design. Means for designing new error criteria for mobile TV (television) services are suggested, and similar work related to other services is recommended. Second, it is suggested that in addition to commercial requirements there should be technical requirements setting the framework for the design process of a new transmission system. The technical requirements should include the assessed reception conditions, technical quality of service and service functionalities. Reception conditions comprise radio channel models, receiver types and antenna types. Technical quality of service consists of bandwidth, timeliness and reliability. Of these, the thesis focuses on radio channel models and error criteria (reliability) as two of the most important design challenges, and provides means to optimise transmission parameters based on these. Third, the thesis argues that the most favourable development for wireless broadcasting would be a single system suitable for all scenarios of wireless broadcasting.
It is claimed that there are no major technical obstacles to achieving this and that the recently published second-generation digital terrestrial television broadcasting system provides a good basis. The challenges and opportunities of a universal wireless broadcasting system are discussed mainly from technical, but briefly also from commercial and regulatory, aspects.
Abstract:
Fatal and permanently disabling accidents form only one per cent of all occupational accidents, but in many branches of industry they account for more than half of the accident costs. Furthermore, the human suffering of the victim and his family is greater in severe accidents than in slight ones. For both human and economic reasons, severe accident risks should be identified before injuries occur. It is for this purpose that different safety analysis methods have been developed. This study presents two new possible approaches to the problem. The first is the hypothesis that it is possible to estimate the potential severity of accidents independently of their actual severity. The second is the hypothesis that when workers are also asked to report near accidents, they are particularly prone to report potentially severe near accidents on the basis of their own subjective risk assessment. A field study was carried out in a steel factory. The results supported both hypotheses. The reliability and validity of post-incident estimates of an accident's potential severity were reasonable. About 10 % of accidents were estimated to be potentially critical; they could have led to death or very severe permanent disability. Reported near accidents were significantly more severe: about 60 % of them were estimated to be critical. Furthermore, the validity of the workers' subjective risk assessment, manifested in the near accident reports, proved to be reasonable. The new methods studied require further development and testing. They could be used both routinely in workplaces and in research for identifying and prioritising accident risks.
Abstract:
The proposal to work on this final project came after several discussions held with Dr. Elzbieta Malinowski Gadja, who in 2008 published the book entitled Advanced Data Warehouse Design: From Conventional to Spatial and Temporal Applications (Data-Centric Systems and Applications). The project was carried out under the technical supervision of Dr. Malinowski, and the direct beneficiary was the University of Costa Rica (UCR), where Dr. Malinowski is a professor at the Department of Computer Science and Informatics. The purpose of this project was twofold: first, to translate chapter III of said book with the intention of generating educational material for the use of the UCR and, second, to venture into the field of technical translation related to data warehousing. For the first component, the goal was to generate a final product that would eventually serve as an educational tool for the post-graduate courses of the UCR. For the second component, this project allowed me to acquire new skills and put into practice techniques that have helped me not only to perform better in my current job as an Assistant Translator at the Inter-American Bank (IDB), but also to use them in similar projects. The process was lengthy and required thorough research and constant communication with the author. The investigation focused on the search for terms and definitions to prepare the glossary, which was the basis for starting the translation project. The translation process itself was carried out in phases, so that comments and corrections by the author could be taken into account in subsequent stages. Later, based on the glossary and the translated text, the illustrations that had been created in the Visio software were translated. In addition to the technical revision by the author, professor Carme Mangiron was in charge of revising the non-technical text.
The result was a high-quality document that is currently used as reference and study material by the Department of Computer Science and Informatics of Costa Rica.
Abstract:
The objective of the pilotage effectiveness study was to come up with a process description of the pilotage procedure, to design performance indicators based on this process description to be used by Finnpilot, and to work out a preliminary plan for the implementation of the indicators within the Finnpilot organisation. The theoretical aspects of pilotage as well as the guidelines and standards used were determined through a literature review. Based on the literature review, a process flow model with the following phases was created: the planning of pilotage, the start of pilotage, the act of pilotage, the end of pilotage and the closing of pilotage. The model based on the literature review was tested through interviews and observation of pilotage. At the same time, an e-mail survey directed at foreign pilotage organisations was conducted, including a questionnaire concerning their standards and management systems, operating procedures, measurement tools and their attitude to passage planning. The main issues in the observations and interviews were the passage plan and bridge team co-operation. The phases of the pilotage process model emerged in both the pilotage activities and the interviews, whereas bridge team co-operation was relatively marginal. Most of the pilotage organisations that responded to the query also use some standard-based management system. All organisations that answered the survey use some sort of pilotage process model. According to the query, the main measuring tools for pilotage are statistical information concerning pilotage and the organisations, customer feedback surveys, and financial results. Attitudes towards passage planning were mostly positive among the organisations. A workshop with pilotage experts was arranged where the process model constructed on the basis of the literature review was tuned to match practical pilotage.
In the workshop it was determined that certain phases and the corresponding tasks, through which pilotage can be described as a process, were identifiable in all pilotage. The result of the workshop was a complemented process model, which separates incoming and outgoing traffic, as well as fairway pilotage and harbour pilotage, from each other. Additionally, indicators divided according to the data gathering method were defined. Data concerning safety and traffic flow is gathered in the form of customer feedback. The pilots' own perceptions of the pilotage process are gathered through self-assessment. The measurement data connected to the phases of the pilotage process is generated e.g. by gathering statistics on the success of pilot dispatches, the accuracy of the pilotage, and the incidents that occurred during pilotage: near misses, deviations and accidents. The measurement data is collected via PilotWeb at the closing of the pilotage. A separate project, with a project group in which pilots also participate, will be established for the deployment of the performance indicators. The phases of the project are the definition phase, the implementation phase and the deployment phase. The purpose of the definition phase is to prepare questions for ship commanders for the customer feedback questionnaire and also to work out the self-assessment queries and the queries concerning the process indicators.
Abstract:
Parameters such as tolerance, scale and agility utilized in data sampling for use in Precision Agriculture have required a considerable number of studies and the development of techniques and instruments for automation. Notable among these is the employment of remote sensing methodologies coupled to a Geographic Information System (GIS), adapted or developed for agricultural use. In this context, the application of Agricultural Mobile Robots is a strong tendency, mainly in the European Union, the USA and Japan. In Brazil, research is needed on the development of robotic platforms to serve as a basis for semi-autonomous and autonomous navigation systems. The aim of this work is to describe the design of an experimental platform for data acquisition in the field, for the study of spatial variability and the development of agricultural robotics technologies to operate in agricultural environments. The proposal is based on a systematization of scientific work to choose the design parameters utilized for the construction of the model. The kinematic study of the mechanical structure was carried out by virtual prototyping, based on modelling and simulation of the stresses applied to the frame.
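A hedged sketch of the kind of kinematic model commonly used when analysing a wheeled field platform of this type: differential-drive forward kinematics for pose propagation. The wheel radius, track width and speeds below are assumed values for illustration, not parameters of the platform described.

```python
# Differential-drive forward kinematics sketch for a wheeled platform.
# R (wheel radius) and B (track width) are assumed, illustrative values.
import math

R = 0.25    # wheel radius (m), assumed
B = 1.20    # track width between drive wheels (m), assumed

def step(x, y, theta, wl, wr, dt):
    """Advance pose (x, y, heading) over dt given wheel angular speeds (rad/s)."""
    v = R * (wr + wl) / 2.0      # forward speed (m/s)
    omega = R * (wr - wl) / B    # yaw rate (rad/s)
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

# Equal wheel speeds -> straight-line motion along the current heading
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = step(*pose, wl=2.0, wr=2.0, dt=0.1)
print(pose)  # ≈ (0.5, 0.0, 0.0): 0.25 m/s forward for 1 s
```

Unequal wheel speeds in the same model produce the curved trajectories needed for row following, which is why this minimal form is a common starting point for semi-autonomous navigation work.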
Abstract:
Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of the materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features that are present in a complete solid/liquid separation process. Furthermore, most filtration theories require experimental work to be performed in order to obtain the critical parameters required by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is practically impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed, it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables. It also helps to perform experimental work more economically. Design of experiments is a prerequisite for creating empirical models that can describe how the measured response is related to changes in the values of the variables. A software package was developed that provides a filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with a graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are the automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter and ceramic capillary action disc filter.
It is also possible to create experimental designs for cases where the variables are totally user-defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data and to create models for each of the measured responses. Introducing the structure of the software in more detail and showing some of the practical applications is the main part of this thesis. This approach to the study of cake filtration processes, as presented in this thesis, has been shown to have good practical value when performing filtration tests.
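To illustrate what a design-of-experiments module such as the LTDoE described above might generate, the sketch below enumerates a two-level full factorial design and estimates the main effects from a measured response. The factor names and response values are hypothetical, not outputs of the actual software.

```python
# Two-level full factorial design (2^3) and main-effect estimation.
# Factor names and responses are hypothetical, for illustration only.
from itertools import product

factors = ["pressure", "slurry concentration", "filtration time"]
design = list(product([-1, +1], repeat=len(factors)))  # 2^3 = 8 runs

# Synthetic responses, e.g. cake moisture (%) measured for each run,
# in the same order as the enumerated design points
response = [23.1, 20.4, 22.8, 19.9, 21.5, 18.6, 21.0, 18.2]

def main_effect(i):
    """Main effect of factor i: mean response at +1 minus mean at -1."""
    hi = [y for run, y in zip(design, response) if run[i] == +1]
    lo = [y for run, y in zip(design, response) if run[i] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for i, name in enumerate(factors):
    print(f"{name}: {main_effect(i):+.2f}")
```

The same coded (-1/+1) matrix doubles as the model matrix for the linear regression step, which is why generating the design and fitting the response models fit naturally into two cooperating modules.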
Abstract:
This study concentrates on Product Data Management (PDM) systems and on sheet metal design features and classification. In this thesis, PDM is seen as an individual system which handles all product-related data and information. The purpose of the relevant data is to carry the manufacturing process further with fewer errors. Sheet metal features give more information and value to the designed models. The possibility of implementing PDM and sheet metal feature recognition is the core of this study. Their integration should make the design process faster and make manufacturing-friendly products easier to design. The triangulation method is the basis for this research. The sections of this triangle are: a scientific literature review, interviews using the Delphi method, and the author's experience and observations. The main findings of this study are: (1) the area of focus in the triangle (the triangle of three different points of view: business, information exchange and technical) depends on the person's background and role in the company; (2) the classification in the PDM system (and also in the CAD system) should be done using the materials, tools and machines that are in use in the company; (3) the design process has to be made more effective because of the increase in industrial production, sheet metal blank production and the designer's time spent on actual design; and (4) because Design For Manufacture (DFM) integration can be done with CAD programs, DFM integration with the PDM system should also be possible.
Abstract:
Oligonucleotides have a wide range of applications in fields such as biotechnology, molecular biology, diagnosis and therapy. However, the spectrum of uses can be broadened by introducing chemical modifications into their structures. The most prolific field in the search for new oligonucleotide analogs is the antisense strategy, where chemical modifications confer appropriate characteristics such as hybridization, resistance to nucleases, cellular uptake, selectivity and, in general, good pharmacokinetic and pharmacodynamic properties. Combinatorial technology is another research area where oligonucleotides and their analogs are extensively employed. Aptamers and new catalytic ribozymes and deoxyribozymes are RNA or DNA molecules isolated from a randomly synthesized library on the basis of a particular property. They are identified by repeated cycles of selection and amplification, using PCR technologies. Modified nucleotides can be introduced either during the amplification procedure or after selection.
Abstract:
An electric system based on renewable energy faces challenges concerning the storage and utilization of energy due to the intermittent and seasonal nature of renewable energy sources. Wind and solar photovoltaic power production is variable and difficult to predict, and thus electricity storage will be needed in the case of basic power production. Hydrogen's energetic potential lies in its ability and versatility to store chemical energy, to serve as an energy carrier and as feedstock for various industries. Hydrogen is also used, e.g., in the production of biofuels. The amount of energy produced during hydrogen combustion is higher than that of any other fuel on a mass basis, with a higher heating value of 39.4 kWh/kg. However, even though hydrogen is the most abundant element in the universe, on Earth most hydrogen exists in molecular forms such as water. Therefore, hydrogen must be produced, and there are various methods to do so. Today, the majority of hydrogen comes from fossil fuels, mainly from steam methane reforming, and only about 4 % of global hydrogen comes from water electrolysis. The combination of electrolytic production of hydrogen from water and a supply of renewable energy is attracting more interest due to the sustainability and the increased flexibility of the resulting energy system. The preferred option for intermittent hydrogen storage is pressurization in tanks, since at ambient conditions the volumetric energy density of hydrogen is low, and pressurized tanks are efficient and affordable when the cycling rate is high. Pressurized hydrogen enables energy storage in larger capacities compared to battery technologies, and additionally the energy can be stored for longer periods of time, on a time scale of months. In this thesis, the thermodynamics and electrochemistry associated with water electrolysis are described. The main water electrolysis technologies are presented with state-of-the-art specifications.
Finally, a Power-to-Hydrogen infrastructure design for Lappeenranta University of Technology is presented. A laboratory setup for water electrolysis is specified, and factors affecting its commissioning in Finland are presented.
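A back-of-the-envelope sketch tying together the figures above: the energy content of hydrogen stored in a pressurized tank, using the stated higher heating value of 39.4 kWh/kg. The tank volume, pressure and electrolyser efficiency are assumed values for illustration, and the ideal-gas approximation overestimates the stored mass at 200 bar (real hydrogen has a compressibility factor above one there).

```python
# Pressurized hydrogen storage, back-of-the-envelope. Tank volume,
# pressure and electrolyser efficiency are assumed, illustrative values.

HHV = 39.4          # kWh/kg, higher heating value of hydrogen (from text)
R = 8.314           # J/(mol K), universal gas constant
M_H2 = 2.016e-3     # kg/mol, molar mass of H2

def h2_mass_ideal(volume_m3, pressure_pa, temp_k=293.15):
    """Hydrogen mass in a tank, ideal-gas approximation (optimistic at high p)."""
    n = pressure_pa * volume_m3 / (R * temp_k)   # amount of substance, mol
    return n * M_H2

mass = h2_mass_ideal(volume_m3=1.0, pressure_pa=200e5)  # 1 m^3 at 200 bar
stored_kwh = mass * HHV

# Electricity needed to produce that hydrogen at an assumed 65 % (HHV basis)
# electrolyser efficiency
input_kwh = stored_kwh / 0.65

print(round(mass, 1), "kg H2")
print(round(stored_kwh), "kWh stored,", round(input_kwh), "kWh electricity in")
```

Even this rough estimate shows the scale argument made above: a single cubic metre at 200 bar holds hundreds of kilowatt-hours, well beyond typical battery racks of similar size, and the energy keeps indefinitely once the tank is sealed.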