944 results for Development tools
Abstract:
Microreactors have proven to be versatile tools for process intensification. Over recent decades, they have increasingly been used for product and process development in the chemical industries. Enhanced heat and mass transfer in the reactors, due to the extremely high surface-area-to-volume ratio and interfacial area, allows chemical processes to be operated under extreme conditions. Safety is improved by the small holdup volume of the reactors and by effective control of pressure and temperature. Hydrogen peroxide is a powerful green oxidant that is used in a wide range of industries. The reduction and auto-oxidation of anthraquinones is currently the main process for hydrogen peroxide production. Direct synthesis is a green alternative and has potential for on-site production. However, it faces two limitations: safety concerns, because the hydrogen-oxygen gas mixture involved is explosive, and the low selectivity of the process. The aim of this thesis was to develop a process for the direct synthesis of hydrogen peroxide utilizing microreactor technology. Experimental and numerical approaches were applied to the development of the microreactor. Development of the novel microreactor commenced with a study of the hydrodynamics and mass transfer in prototype microreactor plates. The prototypes were designed and fabricated with the assistance of CFD modeling to optimize the shape and size of the microstructure. Empirical correlations for the mass transfer coefficient were derived. The pressure drop in micro T-mixers was investigated experimentally and numerically. Correlations describing the friction factor for different flow regimes were developed, and the predicted values were in good agreement with experimental results. Experimental studies were conducted to develop a highly active and selective catalyst in a form suited to the microreactor. Pd catalysts supported on activated carbon cloths were prepared using different treatments during catalyst preparation. A variety of characterization methods were used to investigate the catalysts. The surface chemistry of the support and the oxidation state of the metallic phase in the catalyst play important roles in catalyst activity and selectivity for the direct synthesis. The direct synthesis of hydrogen peroxide was investigated in a bench-scale continuous process using the novel microreactor developed. The microreactor was fabricated on the basis of the hydrodynamic and mass transfer studies and provided a high interfacial area and a high mass transfer coefficient. The catalysts were prepared under optimum treatment conditions, and the direct synthesis was conducted under various conditions. The thesis represents a step towards commercially viable direct synthesis. The focus is on the two main challenges: mitigating the safety problem by utilizing microprocess technology and improving the selectivity through catalyst development.
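A hedged note on the generic form such friction-factor correlations take (illustrative constants, not the thesis's fitted values): in the laminar regime the friction factor typically scales as $f = C/\mathrm{Re}$, with a weaker power law $f = C'\,\mathrm{Re}^{-n}$ in other regimes, and the frictional pressure drop follows $\Delta p = f\,(L/D_h)\,(\rho u^2/2)$, where $\mathrm{Re}$ is the Reynolds number, $L$ the channel length, $D_h$ the hydraulic diameter, $u$ the mean velocity and $\rho$ the fluid density.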
Abstract:
Because of the increased availability of different kinds of business intelligence technologies and tools, it is easy to fall into the illusion that new technologies will automatically solve a company's data management and reporting problems. Management is not only about the management of technology but also the management of processes and people. This thesis focuses on traditional data management and on the performance management of production processes, both of which can be seen as prerequisites for long-lasting development. Some operative BI solutions are also considered in the ideal state of the reporting system. The objectives of this study are to examine what requirements effective performance management of production processes places on a company's data management and reporting, and to see how these requirements affect its efficiency. The research is executed as a theoretical literature review of the subjects and as a qualitative case study of a reporting development project at Finnsugar Ltd. The case study is examined through theoretical frameworks and by active participant observation. To get a better picture of the ideal state of the reporting system, simple investment calculations are performed. According to the results of the research, the requirements for effective performance management of production processes are automation of data collection, integration of operative databases, usage of efficient data management technologies such as ETL (Extract, Transform, Load) processes, a data warehouse (DW) and Online Analytical Processing (OLAP), and efficient management of processes, data and roles.
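To make the ETL step concrete, below is a minimal, hypothetical Python sketch (illustrative field names, with an in-memory SQLite table standing in for the data warehouse; not Finnsugar's actual pipeline):

import sqlite3

def extract():
    # In a real pipeline this would read from operative databases or files.
    return [
        {"batch_id": "B001", "throughput_kg": "5200"},
        {"batch_id": "B002", "throughput_kg": "4875"},
    ]

def transform(rows):
    # Cleanse and standardize units (kg -> tonnes) before loading.
    return [(r["batch_id"], float(r["throughput_kg"]) / 1000.0) for r in rows]

def load(records):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE production (batch_id TEXT, throughput_t REAL)")
    con.executemany("INSERT INTO production VALUES (?, ?)", records)
    return con.execute("SELECT * FROM production").fetchall()

print(load(transform(extract())))  # [('B001', 5.2), ('B002', 4.875)]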
Abstract:
This thesis was carried out for Kone Industrial Oy's Major Projects unit, in the quality department. The Kone Major Projects unit focuses on special and large elevator and escalator projects. The objective of the thesis was to create a harmonized process for the quality control of elevator components and to examine and compare the cost savings that this new process can achieve. The target was an 80 percent reduction in quality costs through the new quality process. The background and research problem of the thesis are the growing number of special projects and the correspondingly increased need for quality control. The core problem in quality control was the lack of a harmonized and clear process for the manufacture of C-process components. In addition, a central quality control tool, the CTQ tool, was created during the development process on the basis of existing tools. The thesis first presents Kone as a company and outlines Kone's key processes as background. The theoretical part covers theories of process development and general quality concepts, and presents theories on the role of quality today. Finally, the theory of cost of quality (COQ) is discussed and the PAF analysis used in the thesis to compare quality costs in a case example is presented. The thesis describes the creation of the CTQ process from start to finish, and the new CTQ process is tested in a pilot project by means of a case example. In this case example, a project bracket (a guide rail fastening clip) is produced using the new quality process, and a cost comparison is made with another bracket from the same project produced before the implementation of the new quality process. As a result of the work, the CTQ process was created and could be tested in practice through the case example. Based on the results, it can be said that using the CTQ process reduces quality costs considerably and facilitates quality management in the production of C-process components.
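A hedged note on the PAF framework mentioned above (the standard textbook decomposition, not figures from the thesis): the cost of quality is typically split as $COQ = C_{\text{prevention}} + C_{\text{appraisal}} + C_{\text{internal failure}} + C_{\text{external failure}}$, so a harmonized quality process pays off when the added prevention and appraisal costs are outweighed by the reduction in failure costs.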
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Concentrated solar power (CSP) is a renewable energy technology which could contribute to overcoming global problems related to pollution emissions and increasing energy demand. CSP utilizes solar irradiation, which is a variable source of energy. In order to utilize CSP technology in energy production and to reliably operate a solar field including a thermal energy storage system, dynamic simulation tools are needed to study the dynamics of the solar field, optimize production and develop control systems. The objective of this Master's thesis is to compare different concentrated solar power technologies and to configure a dynamic solar field model of one selected CSP field design in the dynamic simulation program Apros, owned by VTT and Fortum. The configured model is based on German Novatec Solar's linear Fresnel reflector design. Solar collector components, including dimensions and performance calculations, were developed, as well as a simple solar field control system. The preliminary results of two simulation cases under clear-sky conditions were good: the desired, stable superheated steam conditions were maintained in both cases, while, as expected, the amount of steam produced was reduced in the case with lower irradiation. As a result of the model development process, it can be concluded that the configured model works successfully and that Apros is a very capable and flexible tool for configuring new solar field models and control systems and for simulating solar field dynamic behaviour.
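A hedged sketch of the steady-state energy balance underlying such collector performance calculations (generic symbols, not Apros model variables): the thermal power delivered by a linear Fresnel field can be approximated as $\dot{Q} = \mathrm{DNI} \cdot A_{ap} \cdot \eta_{opt}(\theta) - \dot{Q}_{loss}$, where $\mathrm{DNI}$ is the direct normal irradiance, $A_{ap}$ the mirror aperture area, $\eta_{opt}(\theta)$ the incidence-angle-dependent optical efficiency and $\dot{Q}_{loss}$ the receiver heat loss; a dynamic model adds thermal inertia and steam-side behaviour on top of this balance.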
Abstract:
Press forming is nowadays one of the most common industrial methods for producing deeper trays from paperboard. Demands on material properties such as recyclability and sustainability have increased in the packaging industry as well, but there are still limitations related to the formability of paperboard. A majority of recent studies have focused on material development, but the potential of the package manufacturing process can also be improved by developing tooling and process control. In this study, advanced converting tools (die cutting tools and the press forming mould) are created for production-scale paperboard tray manufacturing. Monitoring methods are also developed that enable the production of paperboard trays with enhanced quality and that can be utilized in process control. The principles for tray blank preparation, including creasing pattern and die cutting tool design, are introduced. The mould heating arrangement and the determination of mould clearance are investigated to improve the quality of the press-formed trays. The effect of the spring-back of the tray walls on the tray dimensions can be managed by adjusting the heat-related process parameters and by estimating it at the mould design stage. This enables production speed optimization, as the process parameters can be adjusted more freely. Real-time monitoring of pressing force, using multiple force sensors embedded in the mould structure, can be utilized to evaluate material characteristics on modified production machinery. Comprehensive process control can be achieved by combining measurement of the outer dimensions of the trays with pressing force monitoring. The control method enables the detection of defects and the tracking of changes in material properties. The optimized converting tools provide a basis for effective operation of the control system.
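As a hedged illustration of how multi-sensor pressing-force monitoring can feed process control (hypothetical limits and readings in Python, not the thesis's actual instrumentation):

# Flag forming strokes whose force level or distribution is off.
FORCE_LIMIT_KN = 120.0   # assumed upper limit for acceptable peak force
SPREAD_LIMIT_KN = 8.0    # assumed maximum allowed sensor-to-sensor spread

def check_stroke(readings_kn):
    peak = max(readings_kn)
    spread = peak - min(readings_kn)
    if peak > FORCE_LIMIT_KN:
        return "reject: overload"
    if spread > SPREAD_LIMIT_KN:
        return "reject: uneven force (possible misaligned blank or worn tool)"
    return "ok"

print(check_stroke([101.2, 99.8, 103.5, 100.9]))   # -> ok
print(check_stroke([101.2, 91.8, 103.5, 100.9]))   # -> reject: uneven force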
Abstract:
The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.
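As a hedged, minimal example of one step in such a workflow, the Python sketch below runs a per-gene two-sample t-test on synthetic log-scale expression data (illustrative only; the thesis's tools and pipelines are more comprehensive):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control = rng.normal(5.0, 1.0, size=(100, 4))   # 100 genes x 4 control samples
treated = rng.normal(5.3, 1.0, size=(100, 4))   # 100 genes x 4 treated samples

# Per-gene two-sample t-test across the sample axis.
t, p = stats.ttest_ind(control, treated, axis=1)
significant = np.where(p < 0.05)[0]
print(f"{significant.size} genes pass p < 0.05 (before multiple-testing correction)")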
Abstract:
The Chinese welding industry is growing every year due to the rapid development of the Chinese economy. Increasingly, companies around the world are looking to use Chinese enterprises as their cooperation partners. However, the Chinese welding industry also has its weaknesses, such as relatively low quality and weak management. A modern, advanced welding management system appropriate for local socio-economic conditions is required to enable Chinese enterprises to further enhance their business development. The thesis researches the design and implementation of a new welding quality management system for China. This new system is called the 'welding production quality control management model in China' (WQMC). Constructed on the basis of analysis of a survey and in-company interviews, the welding management system comprises the following elements and perspectives: a 'localized congenital existing problem resolution strategies' (LCEPRS) database, a 'human factor designed training' (HFDT) strategy, the theory of modular design, ISO 3834 requirements, total welding management (TWM), and lean manufacturing (LEAN) theory. The methods used in the research are literature review, questionnaires, interviews, and the author's model design experiences and observations, i.e. the approach is primarily qualitative and phenomenological. The thesis describes the design and implementation of an HFDT strategy in Chinese welding companies. Such training is an effective way to increase employees' awareness of quality and of issues associated with quality assurance. The study identified widely existing problems in the Chinese welding industry and constructed a LCEPRS database that can be used in efforts to mitigate and avoid common problems. The work uses the theory of modular design, TWM and LEAN as tools for the implementation of the WQMC system.
Abstract:
The Arctic environment is changing constantly. Several factors contribute to the rate and magnitude of this development. The region differs from the surrounding markets that most of the countries in the region are used to. The purpose of the study was therefore to understand how the political environment affects Finnish companies' strategies and business operations. The issues analyzed were the political environment in the region, the business environment and economic development, and the opportunities and threats that Finnish companies face in the Arctic. The main theories were drawn from strategic management and market analysis tools. The different theories and definitions were reviewed in order to understand the context of the study. This is a qualitative study that uses content analysis as its main method of analyzing the data. The data was gathered from existing material and analyzed until the saturation point was reached; this was done to minimize the risks related to using secondary data. The collected data was then categorized into themes. First, the general political environment in the Arctic was studied, especially the Arctic Council and its work as the main political entity. From there, the focus shifted to the business environment and the general opportunities and threats found in Arctic economic development. China offered another point of view, representing a non-Arctic state with a keen interest in the region. Lastly, the two previous objectives were combined and examined from a Finnish perspective. Finnish companies have a great starting point for Arctic business, and the operational business environment gives them the framework within which they have to operate. In conclusion, three main factors lead Arctic economic development: climate change, the development of technology, and the political environment. These set the framework with which companies operating in the region must comply. The industry most likely to lead the development is the marine industry. Furthermore, it became evident that Finnish companies operating in the Arctic face many opportunities as well as threats, which can be utilized or managed through effective strategic management. The key characteristics needed in the region are openness, understanding of the challenging environment, and the ability to face and manage the arising challenges.
Abstract:
This thesis contributes to a general theory of project design. Set within a demand shaped by the stakes of sustainable development, the main objective of this research is to contribute a theoretical model of design that makes it possible to better situate the use of tools and standards for assessing the sustainability of a project. The fundamental principles of these normative instruments are analyzed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects linked, in particular, to compliance with these standards confirm the need for a theory of qualitative judgment. Our main hypothesis builds on the conceptual framework offered by the notion of the 'precautionary principle', whose first formulations date back to the early 1970s and which was precisely intended to remedy the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design thinking, it concentrates on the evolution of the ways in which sustainability has been taken into account. From this perspective, we observe that the theories of 'green design' dating from the early 1960s and the theories of 'ecological design' of the 1970s and 1980s eventually converged with the recent theories of 'sustainable design' that emerged in the early 1990s. The different approaches to the 'precautionary principle' are then examined with regard to the question of project sustainability. Risk assessment standards are compared with approaches based on the precautionary principle, revealing certain limits in project design. A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched. This model offers a global vision for judging a project that integrates principles of sustainable development, and it presents itself as an alternative to traditional risk assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of 'prudence' as it was historically used to guide architectural judgment. What, then, of the challenges posed by the judgment of architectural projects amid the rise of standardized assessment methods (e.g. Leadership in Energy and Environmental Design, LEED)? The thesis proposes a reinterpretation of design theory, as formulated by Donald A. Schön, as a way of taking assessment tools such as LEED into account. This exercise, however, reveals an epistemological obstacle that must be addressed in a reformulation of the model. In line with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED-standardized sustainability assessment method. A preliminary series of 'tensions' is identified in the process of designing and judging the projects.
These tensions are then categorized into their conceptual counterparts, constructed at the intersection of the precautionary principle and theories of design. They fall into four categories: (1) conceptualization: analogical/logical; (2) uncertainty: epistemological/methodological; (3) comparability: interpretive/analytical; and (4) proposition: universality/contextual relevance. These conceptual tensions are treated as vectors that correlate with the theoretical model, which they help to enrich without constituting validations in the positivist sense of the term. These confrontations with the real make it possible to better define the epistemological obstacle identified earlier. This thesis thus highlights the generally underestimated impacts of environmental standards on the process of designing and judging projects, taking as its example, in a non-restrictive way, the examination of Canadian architecture competitions for public buildings. The conclusion underlines the need for a new form of 'reflexive prudence' as well as a more critical use of current sustainability assessment tools. It calls for an instrumentation founded on the global integration, rather than the opposition, of environmental approaches.
Abstract:
To address the interindividual variability observed in the pharmacokinetic response to many drugs, we created a custom genotyping panel using unique assay design and development methods. Its primary purpose is to capture the genetic variation present in the key genes involved in the absorption, distribution, metabolism and excretion (ADME) of many therapeutic agents. Although these genes and pathways are involved in several well-known pharmacokinetic mechanisms, there has so far been little effort toward the simultaneous evaluation of a large number of these genes with a single experimental tool. Pharmacogenomic research can be carried out using two approaches: 1) functional markers can be used to pre-select or stratify patient populations on the basis of known metabolic states; 2) tag markers can be used to discover new genotype-phenotype correlations. At present, there is a need for a research tool that covers a large number of ADME genes and variants and whose content is applicable to both study designs. In this thesis, we developed a genotyping panel of 3,000 ADME genetic markers that can meet this need. As part of this project, the genes and markers associated with the ADME family were selected in collaboration with several groups from academia and the pharmaceutical industry. Over three phases of development of this genotyping assay, the conversion rate for the 3,000 markers was improved from 83% to 97.4% through the incorporation of new strategies for overcoming zones of genomic interference, including homologous regions and polymorphisms underlying the regions of interest. The accuracy of the genotyping panel was validated by evaluating more than 200 samples with known genotypes, for which we obtained concordance above 98%. In addition, a cross-comparison between data from this assay and data obtained on several commercially available technology platforms revealed an overall concordance above 99.5%. The effectiveness of our design strategy was demonstrated by the successful use of this assay in several research projects in which more than 1,000 samples were tested. Among other things, we successfully evaluated 150 liver samples that had been extensively characterized for several phenotypes. In these samples, we were able to validate 13 ADME genes with previously reported cis-eQTLs and to discover 13 additional ADME genes with cis-eQTLs that had not been observed using standard methods. Finally, in support of this work, a software tool, Optimus Primer, was developed to assist in assay development. The software was also used to assist in the enrichment of genomic targets for sequencing experiments. The content as well as the design, optimization and validation of our panel clearly distinguish it from the commercial assays currently available, which either include functional markers for only a small number of genes or do not offer adequate coverage of the known ADME genes.
We can thus conclude that the assay we developed is, and will certainly continue to be, a highly useful tool for future studies and clinical trials in the field of pharmacokinetics that would benefit from the evaluation of a comprehensive list of ADME genes.
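As a hedged illustration of the kind of concordance check described above (toy genotype calls in Python, not the study's data; 'NC' marks a no-call):

def concordance(calls_a, calls_b):
    # Fraction of identical genotype calls among markers called on both platforms.
    both = [(a, b) for a, b in zip(calls_a, calls_b) if a != "NC" and b != "NC"]
    agree = sum(1 for a, b in both if a == b)
    return agree / len(both)

platform_1 = ["AA", "AG", "GG", "NC", "AG", "AA"]
platform_2 = ["AA", "AG", "GG", "AG", "AG", "AT"]
print(f"concordance = {concordance(platform_1, platform_2):.1%}")  # 4/5 = 80.0%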
Abstract:
The shift from print to digital information has had a high impact on all components of the academic library system in India, especially the users, services and the staff. Though information is considered an important resource, the use of ICT tools to collect and disseminate information has proceeded at a slow pace in the majority of university libraries. This may be due to various factors like insufficient funds, inadequate staff trained in handling computers and software packages, administrative concerns, etc. In Kerala, automation has been initiated in almost all university libraries using library automation software and is at different stages of completion. Not many studies have been conducted on the effects of information communication technologies on the professional activities of library professionals in the universities in Kerala. It is important to evaluate whether progress in ICT has had any impact on the library profession in these highest educational institutions. The aim of the study is to assess whether developments in information communication technologies have any influence on library professionals' professional development and their need for further education and training, and to evaluate their skills in handling developments in ICT. The total population of the study is 252, comprising the permanently employed professional library staff in the central libraries and departmental libraries on the main campuses of the universities under study; this is almost a census study of the defined population. The questionnaire method was adopted for data collection, supplemented by interviews with librarians to gather additional information. Library professionals have a positive approach towards ICT applications and services in libraries, but the majority do not have opportunities to develop their skills and competencies in their work environment. To develop competitive personnel in a technologically advanced world, university administrators and library associations must give high priority to developing competence in ICT applications, library management and soft skills among library professionals. Library science schools and teaching departments across the country have to take significant steps to revise the library science curriculum and incorporate significant changes to meet the demands and challenges of the library science profession.
Abstract:
Sonar signal processing comprises a large number of signal processing algorithms for implementing functions such as target detection, localisation, classification, tracking and parameter estimation. Current implementations of these functions rely on conventional techniques largely based on Fourier methods, primarily meant for stationary signals. Interestingly enough, the signals received by sonar sensors are often non-stationary, and hence processing methods capable of handling the non-stationarity will definitely fare better than Fourier-transform-based methods. Time-frequency methods (TFMs) are known as some of the best DSP tools for non-stationary signal processing, with which one can analyze signals in the time and frequency domains simultaneously. However, other than the STFT, TFMs have been largely limited to academic research because of the complexity of the algorithms and the limitations of computing power. With the availability of fast processors, many applications of TFMs have been reported in the fields of speech and image processing and biomedical applications, but not many in sonar processing. A structured effort to fill these lacunae by exploring the potential of TFMs in sonar applications is the net outcome of this thesis. To this end, four TFMs have been explored in detail, viz. the Wavelet Transform, the Fractional Fourier Transform, the Wigner-Ville Distribution and the Ambiguity Function, and their potential in implementing five major sonar functions has been demonstrated with very promising results. What has been conclusively brought out in this thesis is that there is no "one best TFM" for all applications, but there is "one best TFM" for each application. Accordingly, the TFM has to be adapted and tailored in many ways in order to develop specific algorithms for each of the applications.
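As a minimal illustration of why time-frequency methods suit non-stationary signals (a generic STFT of a chirp using SciPy, not one of the thesis's sonar algorithms):

import numpy as np
from scipy.signal import chirp, stft

fs = 8000.0
t = np.arange(0, 1.0, 1 / fs)
x = chirp(t, f0=100.0, f1=2000.0, t1=1.0, method="linear")  # frequency sweeps with time

f, tau, Zxx = stft(x, fs=fs, nperseg=256)
ridge = f[np.abs(Zxx).argmax(axis=0)]  # dominant frequency per time frame
print(ridge[:3], "...", ridge[-3:])    # rises roughly from 100 Hz to 2000 Hz

A single global Fourier transform of x would show energy smeared across the whole 100-2000 Hz band, hiding exactly the time evolution that the STFT ridge recovers.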
Abstract:
Hevea latex is a natural biological liquid of very complex composition. Besides rubber hydrocarbons, it contains many proteinous and resinous substances, carbohydrates, inorganic matter, water, and others. The dry rubber content (DRC) of latex varies according to season, tapping system, weather, soil conditions, clone, age of the tree, etc. The true DRC of the latex must be determined to ensure fair prices for the latex during commercial exchange. The DRC of Hevea latex is a very familiar term to all in the rubber industry. It has been the basis for incentive payments to tappers who bring in more than the daily agreed poundage of latex. It is an important parameter for the rubber and latex processing industries for automation and various decision-making processes. This thesis embodies my efforts to determine the DRC of rubber latex using different analytical tools, namely MIR absorption, thermal analysis, dielectric spectroscopy and NIR reflectance. The rubber industry is still looking for a compact instrument that is accurate, economical, easy to use and environment-friendly. I hope the results presented in this thesis will help realise this goal in the near future.
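As a hedged sketch of the calibration idea behind such instrument development (synthetic single-wavelength data in Python; a practical NIR calibration would be multivariate and referenced to oven-dried DRC values):

import numpy as np

reflectance = np.array([0.42, 0.47, 0.51, 0.56, 0.60])  # assumed NIR measurements
drc_percent = np.array([28.0, 31.5, 34.2, 37.8, 40.1])  # assumed lab reference DRC

# Fit a linear calibration curve, then predict DRC for a new reading.
slope, intercept = np.polyfit(reflectance, drc_percent, 1)

def predict(r):
    return slope * r + intercept

print(f"DRC estimate at R = 0.53: {predict(0.53):.1f} %")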
Abstract:
Antennas are necessary and vital components of communication and radar systems, but sometimes their inability to adjust to new operating scenarios can limit system performance. Reconfigurable antennas can adjust to changing system requirements or environmental conditions and provide additional levels of functionality that may result in wider instantaneous frequency bandwidths, more extensive scan volumes, and radiation patterns with more desirable side lobe distributions. Their agility and diversity have created new horizons for different types of applications, especially in cognitive radio, multiple-input multiple-output (MIMO) systems, satellites and many other applications. Reconfigurable antennas satisfy the requirements for increased functionality, such as direction finding, beam steering, radar, and command and control, within a confined volume. The intelligence associated with reconfigurable antennas revolves around the switching mechanisms utilized. In the present work, we have investigated frequency-reconfigurable polarization diversity antennas using two methods: first, by using low-loss, high-isolation switches such as PIN diodes, the antenna can be structurally reconfigured to maintain the elements near their resonant dimensions for different frequency bands and/or polarizations; and second, by incorporating variable capacitors, or varactors, to overcome many of the problems faced in using switches and their biasing. The performance of these designs has been studied using standard simulation tools used in industry and academia, and it has been experimentally verified. Antenna design guidelines are also deduced by accounting for the resonances. One of the major contributions of the thesis lies in the analysis of the designed antennas using FDTD-based numerical computation to validate their performance.
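As a hedged, bare-bones illustration of the FDTD method named above (a 1-D Yee scheme in normalized units, written in Python; a real antenna analysis requires full 3-D FDTD with feed models, materials and absorbing boundaries):

import numpy as np

nz, nt = 200, 500
ez = np.zeros(nz)          # electric field samples
hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
for n in range(nt):
    hy += 0.5 * (ez[1:] - ez[:-1])                    # update H from the curl of E
    ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])              # update E from the curl of H
    ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)    # soft Gaussian source at the center
print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")

The 0.5 factor is a Courant number below the 1-D stability limit; the fixed zero values at the grid ends act as simple reflecting (PEC) boundaries, which full solvers replace with absorbing layers.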