958 results for Production methods


Relevance:

30.00%

Publisher:

Abstract:

The most common reason for a low-voltage induction motor breakdown is a bearing failure. Along with the increasing popularity of modern frequency converters, bearing failures have become the most important motor fault type. Conditions in which bearing currents are likely to occur are generated as a side effect of fast du/dt switching transients. Once present, different types of bearing currents can accelerate the mechanical wear of bearings by causing deformation of metal parts in the bearing and degradation of the lubricating oil properties. The bearing current phenomena are well known, and several bearing current measurement and mitigation methods have been proposed. Nevertheless, in order to develop more feasible methods to measure and mitigate bearing currents, better knowledge of the phenomena is required. When mechanical wear is caused by bearing currents, the resulting aging impact has to be monitored and dealt with. Moreover, because of the stepwise aging mechanism, periodically executed condition monitoring measurements have been found ineffective. Thus, there is a need for feasible bearing current measurement methods that can be applied in parallel with the normal operation of series production drive systems. In order to reach the objectives of feasibility and applicability, nonintrusive measurement methods are preferred. In this doctoral dissertation, the characteristics and conditions of bearings that are related to the occurrence of different kinds of bearing currents are studied. Further, the study introduces some nonintrusive radio-frequency-signal-based approaches to detect and measure parameters that are associated with the accelerated bearing wear caused by bearing currents.
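
As a rough illustration of the switching-transient mechanism, the sketch below computes the current driven through a bearing's stray capacitance by a fast common-mode voltage edge, i_b ≈ C_b · du/dt. The capacitance and slew-rate figures are illustrative assumptions, not values from the dissertation.

```python
# Capacitive bearing-current sketch: a fast inverter voltage edge (du/dt)
# drives a current through the bearing's stray capacitance, i_b = C_b * du/dt.
# Both numbers below are assumed, order-of-magnitude values.

C_BEARING = 200e-12  # assumed bearing stray capacitance [F] (hundreds of pF)

def capacitive_bearing_current(du_dt: float, c_bearing: float = C_BEARING) -> float:
    """Peak capacitive bearing current [A] for a given voltage slew rate [V/s]."""
    return c_bearing * du_dt

# A modern IGBT edge, e.g. 500 V in 100 ns, gives du/dt = 5e9 V/s
print(f"{capacitive_bearing_current(5e9):.2f} A")  # ~1 A peak through the bearing
```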

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies the possibility of using lean tools and methods in a quotation process carried out in an office environment. The aim of the study was to identify and test the relevant lean tools and methods that can help to balance and standardize the quotation process and reduce the variance in quotation lead times and quality. Seminal works, research papers, and guidebooks related to the topic were used as the basis for the theory development. Based on the literature review and the case company's own lean experience, the applicable lean tools and methods were selected to be tested by a sales support team. Leveling production, by means of product categorization and value stream mapping, was a key method used to balance the quotation process. The 5S method was started concurrently to standardize the work. Results of the testing period showed that lean tools and methods are applicable in an office process, and the selected tools and methods helped to balance and standardize the quotation process. The case company's sales support team decided to implement the new lean-based quotation process model.
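
One common building block of production leveling is the takt-time calculation: available working time divided by demand per product category. A minimal sketch under assumed figures follows; the categories, weekly volumes, and working hours are hypothetical, not the case company's data.

```python
# Takt-time sketch for leveling a quotation process by product category.
# Categories, demand figures and working hours are hypothetical assumptions.

WORK_MINUTES_PER_DAY = 7.5 * 60  # assumed net working time per day

weekly_demand = {"standard": 40, "configured": 15, "engineered": 5}  # quotations/week

for category, per_week in weekly_demand.items():
    per_day = per_week / 5                  # five working days per week
    takt = WORK_MINUTES_PER_DAY / per_day   # minutes available per quotation
    print(f"{category:>10}: {takt:.0f} min/quotation")
```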

Relevance:

30.00%

Publisher:

Abstract:

An electric system based on renewable energy faces challenges concerning the storage and utilization of energy due to the intermittent and seasonal nature of renewable energy sources. Wind and solar photovoltaic power production are variable and difficult to predict, and thus electricity storage will be needed for base power production. Hydrogen's energetic potential lies in its ability and versatility to store chemical energy, to serve as an energy carrier, and to act as feedstock for various industries. Hydrogen is also used, e.g., in the production of biofuels. The amount of energy produced during hydrogen combustion is higher than that of any other fuel on a mass basis, with a higher heating value of 39.4 kWh/kg. However, even though hydrogen is the most abundant element in the universe, on Earth most hydrogen exists in molecular forms such as water. Therefore, hydrogen must be produced, and there are various methods to do so. Today, the majority of hydrogen comes from fossil fuels, mainly from steam methane reforming, and only about 4 % of global hydrogen comes from water electrolysis. The combination of electrolytic hydrogen production from water and renewable energy supply is attracting more interest due to the sustainability and increased flexibility of the resulting energy system. The preferred option for intermittent hydrogen storage is pressurization in tanks, since at ambient conditions the volumetric energy density of hydrogen is low, and pressurized tanks are efficient and affordable when the cycling rate is high. Pressurized hydrogen enables energy storage in larger capacities than battery technologies, and additionally the energy can be stored for longer periods of time, on a time scale of months. In this thesis, the thermodynamics and electrochemistry associated with water electrolysis are described. The main water electrolysis technologies are presented with state-of-the-art specifications. Finally, a Power-to-Hydrogen infrastructure design for Lappeenranta University of Technology is presented. A laboratory setup for water electrolysis is specified, and factors affecting its commissioning in Finland are presented.
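
To connect the quoted heating value to storage capacity, here is a back-of-the-envelope sketch of the chemical energy held in a pressurized hydrogen tank, using the ideal gas law for the mass estimate (real hydrogen deviates noticeably above roughly 200 bar). The tank volume and pressure are hypothetical.

```python
# Stored chemical energy of pressurized H2: ideal-gas mass estimate times the
# higher heating value quoted in the abstract (39.4 kWh/kg). Tank size and
# pressure are assumed example values.

R = 8.314        # J/(mol K)
M_H2 = 2.016e-3  # kg/mol
HHV = 39.4       # kWh/kg

def stored_energy_kwh(volume_m3: float, pressure_bar: float, temp_k: float = 293.15) -> float:
    """Approximate chemical energy [kWh] of H2 stored in a tank (ideal gas)."""
    moles = pressure_bar * 1e5 * volume_m3 / (R * temp_k)
    return moles * M_H2 * HHV

# Hypothetical 1 m3 tank at 200 bar: ~16.5 kg of H2, ~650 kWh of chemical energy
print(f"{stored_energy_kwh(1.0, 200):.0f} kWh")
```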

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this Master's thesis is to develop a cost allocation model for a leading food industry company in Finland. The goal is to develop an allocation method for the fixed overhead expenses incurred in a specific production unit and create a plausible tracking system for product costs. The second objective is to construct the allocation model so that it can be modified to suit other units as well. Costs, activities, drivers, and appropriate allocation methods are studied. The thesis starts with a literature review of existing activity-based costing (ABC) theory, followed by inspection of the cost information and interviews with officials to get a general view of the requirements for the model to be constructed. Familiarization with the company started with the existing cost accounting methods. The main proposals for a new allocation model emerged from the interviews and were used in setting targets for the new allocation method. As a result of this thesis, an Excel-based model was created based on the theoretical and empirical data. The new system is able to handle overhead costs in more detail, improving cost awareness and transparency in cost allocations and sharpening the products' cost structure. The improved cost awareness is achieved by selecting the most suitable cost drivers for this situation. Capacity changes are also taken into consideration; for example, the use of practical or normal capacity instead of theoretical capacity is suggested. Finally, some recommendations for further development are made concerning capacity handling and cost collection.
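
A minimal sketch of the driver-based overhead allocation such a model performs is shown below. The cost pools, driver capacities, and products are hypothetical placeholders; note that rating pools at practical rather than theoretical capacity leaves the cost of idle capacity visible instead of loading it onto products.

```python
# Driver-based overhead allocation sketch (ABC style). All figures are
# hypothetical; real driver quantities would come from the unit's production
# data, with rates based on practical (not theoretical) capacity.

cost_pools = {"machine_hours": 120_000.0, "setups": 45_000.0}  # EUR per period
driver_capacity = {"machine_hours": 4_000.0, "setups": 300.0}  # practical capacity

driver_usage = {  # driver units consumed per product
    "product_A": {"machine_hours": 1_500.0, "setups": 120.0},
    "product_B": {"machine_hours": 2_000.0, "setups": 100.0},
}

rates = {d: cost_pools[d] / driver_capacity[d] for d in cost_pools}  # EUR per driver unit

for product, usage in driver_usage.items():
    allocated = sum(rates[d] * q for d, q in usage.items())
    print(f"{product}: {allocated:,.0f} EUR")
# Unused driver units remain as a visible cost of idle capacity.
```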

Relevance:

30.00%

Publisher:

Abstract:

Recent studies have reported that exogenous gangliosides, the sialic acid-containing glycosphingolipids, are able to modulate many cellular functions. We examined the effect of micelles of mono- and trisialoganglioside GM1 and GT1b on the production of reactive oxygen species by stimulated human polymorphonuclear neutrophils using different spectroscopic methods. The results indicated that exogenous gangliosides did not influence extracellular superoxide anion (O2·−) generation by polymorphonuclear neutrophils activated by the receptor-dependent stimulus formyl-methionyl-leucyl-phenylalanine. However, when neutrophils were stimulated by receptor-bypassing phorbol 12-myristate 13-acetate (PMA), gangliosides above their critical micellar concentrations prolonged the lag time preceding the production in a concentration-dependent way, without affecting total extracellular O2·− generation detected by superoxide dismutase-inhibitable cytochrome c reduction. The effect of ganglioside GT1b (100 µM) on the increase in lag time was shown to be significant by means of both the superoxide dismutase-inhibitable cytochrome c reduction assay and electron paramagnetic resonance spectroscopy (P < 0.0001 and P < 0.005, respectively). The observed phenomena can be attributed to the ability of ganglioside micelles attached to the cell surface to slow down PMA uptake, thus increasing the diffusion barrier and consequently delaying the membrane events responsible for PMA-stimulated O2·− production.
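
One simple way to quantify the lag time reported above is to read, from the kinetic trace of the cytochrome c reduction assay, the time at which the signal first rises a set fraction above baseline. The sketch below applies this criterion to a synthetic sigmoidal trace; both the threshold choice and the trace are assumptions for illustration only.

```python
# Lag-time estimate from a kinetic trace (signal vs. time): the time at which
# the signal first exceeds a small fraction of its total rise. The sigmoidal
# trace below is synthetic, standing in for an absorbance recording.

import numpy as np

def lag_time(t: np.ndarray, signal: np.ndarray, frac: float = 0.05) -> float:
    """Time at which `signal` first exceeds `frac` of its final rise."""
    baseline, final = signal[0], signal[-1]
    threshold = baseline + frac * (final - baseline)
    return float(t[np.argmax(signal >= threshold)])

t = np.linspace(0, 600, 601)                     # s
trace = 1.0 / (1.0 + np.exp(-(t - 240) / 30.0))  # synthetic O2·− burst curve
print(f"lag ≈ {lag_time(t, trace):.0f} s")
```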

Relevance:

30.00%

Publisher:

Abstract:

Intercellular adhesion molecule-1 (ICAM-1) is an important factor in the progression of inflammatory responses in vivo. To develop a new anti-inflammatory drug to block the biological activity of ICAM-1, we produced a monoclonal antibody (Ka = 4.19×10⁻⁸ M) against human ICAM-1. The anti-ICAM-1 single-chain variable antibody fragment (scFv) was expressed at a high level as inclusion bodies in Escherichia coli. We refolded the scFv (Ka = 2.35×10⁻⁷ M) by ion-exchange chromatography, dialysis, and dilution. The results showed that column chromatography refolding on high-performance Q Sepharose had remarkable advantages over the conventional dilution and dialysis methods. Furthermore, this method gave a higher anti-ICAM-1 scFv yield of about 60 mg/L. The purity of the final product was greater than 90%, as shown by denaturing gel electrophoresis. Enzyme-linked immunosorbent assay, cell culture, and animal experiments were used to assess the immunological properties and biological activities of the renatured scFv.

Relevance:

30.00%

Publisher:

Abstract:

Failure Mode and Effect Analysis (FMEA) was applied for risk assessment of confectionary manufacturing in which traditional methods and equipment were intensively used in production. Potential failure modes and effects, as well as their possible causes, were identified in the process flow. Processing stages that involve intensive handling of food by workers had the highest risk priority numbers (RPN = 216 and 189), followed by chemical contamination risks in different stages of the process. The application of corrective actions substantially reduced the RPN values. Therefore, the implementation of the FMEA model in confectionary manufacturing improved the safety and quality of the final products.
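
The RPN figures quoted above follow the standard FMEA arithmetic, RPN = severity × occurrence × detection, each factor scored on a 1-10 scale. In the sketch below, the scores are illustrative choices that merely reproduce the quoted values (216 and 189); they are not the study's actual ratings.

```python
# Standard FMEA risk priority number: RPN = severity * occurrence * detection.
# The stage names and 1-10 scores are illustrative, chosen to reproduce the
# RPN values quoted in the abstract.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    return severity * occurrence * detection

stages = {
    "manual handling, stage A": (6, 6, 6),  # -> 216
    "manual handling, stage B": (7, 3, 9),  # -> 189
}

for stage, (s, o, d) in stages.items():
    print(f"{stage}: RPN = {rpn(s, o, d)}")
```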

Relevance:

30.00%

Publisher:

Abstract:

Harmful algal blooms (HABs) are events caused by the massive proliferation of microscopic, often photosynthetic organisms that inhabit both fresh and marine waters. Although HABs are essentially a natural phenomenon, they now cause worldwide concern. Recent anthropogenic effects, such as climate change and eutrophication via nutrient runoff, can be seen in their increased prevalence and severity. Cyanobacteria and dinoflagellates are often the causative organisms of HABs. In addition to the adverse effects caused by the sheer biomass, certain species produce highly potent toxic compounds: hepatotoxic microcystins are produced exclusively by cyanobacteria, and neurotoxic saxitoxins, also known as paralytic shellfish toxins (PSTs), by both cyanobacteria and dinoflagellates. Specific biosynthetic genes in the cyanobacterial genomes direct the production of microcystins and paralytic shellfish toxins. Recently, the first paralytic shellfish toxin gene sequences from dinoflagellate genomes have also been elucidated. The public health risks presented by HABs are evident, but the monitoring and prediction of toxic events is challenging. Characterization of the genetic background of toxin biosynthesis, including that of microcystins and paralytic shellfish toxins, has made it possible to develop highly sensitive molecular tools which have shown promise in the monitoring and study of potentially toxic microalgae. In this doctoral work, toxin-specific genes were targeted in the developed PCR and qPCR assays for the detection and quantification of potentially toxic cyanobacteria and dinoflagellates in the environment. The correlation between the copy numbers of the toxin biosynthesis genes and toxin production was investigated to assess whether the developed methods could be used to predict toxin concentrations. The nature of the correlation between gene copy numbers and the amount of toxin produced varied depending on the targeted gene and the producing organism. The combined mcyB copy numbers of three potentially microcystin-producing cyanobacterial genera showed a significant positive correlation with the observed total toxin production. However, the presence of the PST-specific sxtA, sxtG, and sxtB genes of cyanobacterial origin was found to be a poor predictor of toxin production in the studied area. Conversely, the dinoflagellate sxtA4 was a good qualitative indicator of a neurotoxic bloom both in the laboratory and in the field, and population densities reflected well the observed toxin concentrations. In conclusion, although the specificity of each potential targeted toxin biosynthesis gene must be assessed individually during method development, the results obtained in this doctoral study support the use of quantitative PCR-based approaches in the monitoring of toxic cyanobacteria and dinoflagellates.
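
A correlation of the kind described above is typically examined with a log-log linear fit of toxin concentration against qPCR gene copy numbers (e.g. mcyB). A minimal sketch follows; the data points are synthetic placeholders, not measurements from the study.

```python
# Log-log regression of toxin concentration on toxin-gene copy numbers,
# the kind of analysis used to test whether qPCR can predict toxin levels.
# The five data points are synthetic placeholders.

import numpy as np
from scipy import stats

copies = np.array([1e3, 5e3, 2e4, 8e4, 3e5])  # gene copies / mL (synthetic)
toxin = np.array([0.1, 0.4, 1.2, 5.5, 18.0])  # microcystin, ug/L (synthetic)

fit = stats.linregress(np.log10(copies), np.log10(toxin))
print(f"slope = {fit.slope:.2f}, r = {fit.rvalue:.2f}, p = {fit.pvalue:.4f}")
```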

Relevance:

30.00%

Publisher:

Abstract:

Global warming is one of the most alarming problems of this century. Initial scepticism concerning its validity is currently dwarfed by the intensification of extreme weather events, whilst the gradually rising level of anthropogenic CO2 is pointed out as its main driver. Most greenhouse gas (GHG) emissions come from large point sources (heat and power production and industrial processes), and the continued use of fossil fuels requires quick and effective measures to meet the world's energy demand whilst (at least) stabilizing atmospheric CO2 levels. The framework known as Carbon Capture and Storage (CCS), or Carbon Capture Utilization and Storage (CCUS), comprises a portfolio of technologies applicable to large-scale GHG sources for preventing CO2 from entering the atmosphere. Amongst them, CO2 capture and mineralisation (CCM) presents the highest potential for CO2 sequestration, as the predicted carbon storage capacity (as mineral carbonates) far exceeds the estimated levels of the worldwide identified fossil fuel reserves. The work presented in this thesis aims to take a step forward towards the deployment of an energy- and cost-effective process for simultaneous capture and storage of CO2 in the form of thermodynamically stable and environmentally friendly solid carbonates. R&D work on the process considered here began in 2007 at Åbo Akademi University in Finland. It involves the processing of magnesium silicate minerals with recyclable ammonium salts for extraction of magnesium at ambient pressure and 400–440 °C, followed by aqueous precipitation of magnesium in the form of hydroxide, Mg(OH)2, and finally Mg(OH)2 carbonation in a pressurised fluidised bed reactor at ~510 °C and ~20 bar CO2 partial pressure to produce high-purity MgCO3. Rock material taken from the Hitura nickel mine, Finland, and serpentinite collected from Bragança, Portugal, were tested for magnesium extraction with both ammonium sulphate and bisulphate (AS and ABS) to determine the optimal operating parameters, primarily reaction time, reactor type, and presence of moisture. Typical efficiencies range from 50 to 80% magnesium extraction at 350–450 °C. In general, ABS performs better than AS, showing comparable efficiencies at lower temperatures and shorter reaction times. The best experimental results so far obtained include 80% magnesium extraction with ABS at 450 °C in a laboratory-scale rotary kiln and 70% Mg(OH)2 carbonation in the PFB at 500 °C and 20 bar CO2 pressure for 15 minutes. The extraction reaction with ammonium salts is not at all selective towards magnesium. Other elements, such as iron, nickel, chromium, and copper, are also co-extracted. Their separation, recovery, and valorisation are addressed as well and found to be of great importance. The assessment of the exergetic performance of the process was carried out using Aspen Plus® software and pinch analysis technology. The choice of fluxing agent and its recovery method has a decisive influence on the performance of the process: AS is recovered by crystallisation, and in general the whole process then requires more exergy (2.48–5.09 GJ/t CO2 sequestered) than with ABS (2.48–4.47 GJ/t CO2 sequestered) when ABS is recovered by thermal decomposition. However, the corrosive nature of molten ABS and the operational problems inherent to thermal regeneration of ABS prohibit this route. Regeneration of ABS through addition of H2SO4 to AS (followed by crystallisation) results in an overall negative exergy balance (mainly at the expense of low-grade heat) but will flood the system with sulphates. Although the ÅA route is still energy intensive, its performance is comparable to conventional CO2 capture methods using alkanolamine solvents. An energy-neutral process depends on the availability and quality of nearby waste heat, and economic viability might be achieved with magnesium extraction and carbonation levels ≥ 90%, the processing of CO2-containing flue gases (eliminating the expensive capture step), and the production of marketable products.
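
For scale, the carbonation step binds CO2 according to Mg(OH)2 + CO2 → MgCO3 + H2O. The sketch below applies this stoichiometry at the quoted 70% conversion; the upstream serpentinite-to-Mg(OH)2 yield is not included.

```python
# Stoichiometric sketch for the carbonation step: Mg(OH)2 + CO2 -> MgCO3 + H2O
# at the 70 % conversion quoted in the abstract. Molar masses in g/mol.

M_MGOH2, M_CO2 = 58.32, 44.01

def co2_bound_t(mg_oh2_tonnes: float, conversion: float = 0.70) -> float:
    """Tonnes of CO2 mineralised per tonne of Mg(OH)2 fed to the reactor."""
    return mg_oh2_tonnes * conversion * M_CO2 / M_MGOH2

print(f"{co2_bound_t(1.0):.2f} t CO2 per t Mg(OH)2")  # ~0.53 t
```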

Relevance:

30.00%

Publisher:

Abstract:

The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics, and engineering to study and process biological data. The need is also increasing for tools that can be used by the biological researchers themselves, who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows, and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods, and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.

Relevance:

30.00%

Publisher:

Abstract:

Most soybean pathogens are seed-transmitted; among them, the fungus Sclerotinia sclerotiorum deserves emphasis, as it has shown worrying levels of field incidence in some soybean cropping areas in several Brazilian states. The objective of this study was to verify the efficiency of different methods for detecting S. sclerotiorum on soybean seeds artificially infected in the laboratory and on seeds from field production areas with a history of disease incidence. Seed samples of seven different cultivars collected from naturally infested fields and one seed sample artificially inoculated in the laboratory were used. The following detection methods recommended in the literature were compared: Blotter test at 7 °C, 14 °C, and 21 °C; Rolled Paper; and Neon-S. Results demonstrated that these methods showed no repeatability and had a low sensitivity for detecting the pathogen in seeds from areas with disease incidence. They were effective, however, for detecting the pathogen on artificially inoculated seeds. In the Blotter test method at 7 °C, there was a lower incidence of other fungi considered undesirable during seed analysis.

Relevance:

30.00%

Publisher:

Abstract:

In any manufacturing system, many factors affect and limit the capacity of the entire system. This thesis addressed how to improve the production capacity of a Finnish company (Viljavuuspalvelu Oy) through methods such as bottleneck analysis, Overall Equipment Effectiveness (OEE), and Just in Time production. Four analysis methods were studied in order to detect the bottleneck machines at Viljavuuspalvelu Oy. The results show that the bottleneck machine constraining production in the industrial area is the grinding machine, while the bottleneck machine in the laboratory section is the photometry machine. In addition, the Overall Equipment Effectiveness of the entire system was calculated, and it was found that the OEE of Viljavuuspalvelu Oy is 35.75%. Moreover, two methods for increasing the OEE were studied; it was shown that either the total output of the company should be 1254 samples/shift or the ideal run rate should be 1.45 pieces/minute in order to reach an OEE of around 85%, which is considered world class. In addition, some realistic methods based on the findings of this thesis were applied to increase the OEE factor in the company, and one of these methods increased the OEE to 62.59%. Finally, how to implement Just in Time production at Viljavuuspalvelu Oy was studied.
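
The OEE figure above follows the standard decomposition OEE = availability × performance × quality. In the sketch below, the three factor values are hypothetical inputs chosen only to reproduce the reported 35.75%; the thesis derives the actual factors from the company's shift data.

```python
# Standard OEE arithmetic: OEE = availability * performance * quality.
# The factor values are assumed, chosen to reproduce the reported figure;
# they are not Viljavuuspalvelu Oy's measured factors.

def oee(availability: float, performance: float, quality: float) -> float:
    return availability * performance * quality

print(f"OEE = {oee(0.65, 0.55, 1.00):.2%}")  # 35.75%; ~85% is world class
```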

Relevance:

30.00%

Publisher:

Abstract:

Adjustment is an ongoing process by which factors of production are reallocated to equalize their returns in different uses. Adjustment occurs through market mechanisms or intrafirm reallocation of resources as a result of changes in terms of trade, government policies, resource availability, technological change, etc. These changes alter production opportunities and production, transaction, and information costs, and consequently modify production functions, organizational design, etc. In this paper we define adjustment (section 2); review empirical estimates of the extent of adjustment in Canada and abroad (section 3); review selected features of the trade policy and adjustment context of relevance for policy formulation, among which: slow growth, a shift to services, a shift to the Pacific Rim, the internationalization of production, investment, distribution and communications, the growing use of NTBs, changes in foreign direct investment patterns, intrafirm and intraindustry trade, interregional trade flows, and differences in microeconomic adjustment processes between subsidiaries and Canadian companies (section 4); examine methodologies and results of studies of the impact of trade liberalization on jobs (section 5); and review the R. Harris general equilibrium model (section 6). Our conclusion emphasizes the importance of harmonizing commercial and domestic policies dealing with adjustment (section 7). We close with a bibliography of relevant publications.

Relevance:

30.00%

Publisher:

Abstract:

We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin’s q and to a model of academic performance.
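
As a concrete reference point, the Anderson and Rubin (1949) procedure that the instrument substitution method generalizes can be sketched as follows: under H0: beta = beta0, regress y − X·beta0 on the instruments and F-test their joint significance, which is exact under Gaussian errors and strictly exogenous instruments. The helper and simulated data below are illustrative assumptions, not code from the paper.

```python
# Anderson-Rubin test sketch for one endogenous regressor: under H0 the
# restricted residual y - X*beta0 is unrelated to the instruments, so an
# F-test of the instruments in that regression is exact under Gaussian errors.
# Data are simulated placeholders.

import numpy as np
from scipy import stats

def anderson_rubin_pvalue(y, X, Z, beta0):
    """p-value of the AR test of H0: beta = beta0."""
    n, k = Z.shape
    u = y - X * beta0                      # restricted residual
    Zc = np.column_stack([np.ones(n), Z])  # instruments plus intercept
    coef, res, *_ = np.linalg.lstsq(Zc, u, rcond=None)
    rss1 = float(res[0]) if res.size else float(np.sum((u - Zc @ coef) ** 2))
    rss0 = float(np.sum((u - u.mean()) ** 2))  # intercept-only fit
    F = ((rss0 - rss1) / k) / (rss1 / (n - k - 1))
    return float(1.0 - stats.f.cdf(F, k, n - k - 1))

rng = np.random.default_rng(0)
n = 200
Z = rng.normal(size=(n, 2))                        # two instruments
X = Z @ np.array([1.0, 0.5]) + rng.normal(size=n)
y = 2.0 * X + rng.normal(size=n)                   # true beta = 2.0
print(f"p(beta0 = 2.0) = {anderson_rubin_pvalue(y, X, Z, 2.0):.3f}")  # large
print(f"p(beta0 = 0.0) = {anderson_rubin_pvalue(y, X, Z, 0.0):.3f}")  # ~0
```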

Relevance:

30.00%

Publisher:

Abstract:

We survey recent axiomatic results in the theory of cost-sharing. In this literature, a method computes the individual cost shares assigned to the users of a facility for any profile of demands and any monotonic cost function. We discuss two theories taking radically different views of the asymmetries of the cost function. In the full responsibility theory, each agent is accountable for the part of the costs that can be unambiguously separated and attributed to her own demand. In the partial responsibility theory, the asymmetries of the cost function have no bearing on individual cost shares; only the differences in demand levels matter. We describe several invariance and monotonicity properties that reflect both normative and strategic concerns. We uncover a number of logical trade-offs between our axioms, and derive axiomatic characterizations of a handful of intuitive methods: in the full responsibility approach, the Shapley-Shubik, Aumann-Shapley, and subsidy-free serial methods, and in the partial responsibility approach, the cross-subsidizing serial method and the family of quasi-proportional methods.
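
For concreteness, a minimal sketch of the Shapley-Shubik method follows: each agent's share is her average marginal cost over all arrival orders, where a coalition is charged the cost function evaluated at the demand profile with non-members' demands set to zero. The demand profile and cost function are illustrative assumptions.

```python
# Shapley-Shubik cost sharing: average each agent's marginal cost over all
# arrival orders; a coalition pays the cost of serving its members' demands
# with everyone else's demand set to zero. Demands and cost are illustrative.

from itertools import permutations

def shapley_shubik(demands, cost):
    """Cost shares for a demand profile {agent: demand} and cost function."""
    agents = list(demands)
    shares = dict.fromkeys(agents, 0.0)
    orders = list(permutations(agents))
    for order in orders:
        served = dict.fromkeys(agents, 0.0)
        prev = cost(served)
        for i in order:
            served[i] = demands[i]
            shares[i] += cost(served) - prev  # marginal cost of adding i
            prev = cost(served)
    return {i: s / len(orders) for i, s in shares.items()}

demands = {"a": 1.0, "b": 2.0, "c": 3.0}
cost = lambda x: sum(x.values()) ** 0.5  # concave: shared savings to split
print(shapley_shubik(demands, cost))     # shares sum to the cost of total demand
```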