906 results for machine-tool industry
Abstract:
This thesis has two themes: biofouling and antifouling in the paper industry. Biofouling is the unwanted accumulation of microbes on surfaces, causing e.g. disturbances in industrial processes and contamination of medical devices or water distribution networks. Antifouling focuses on preventing the accumulation of biofilms in undesired places. Deinococcus geothermalis is a pink-pigmented, thermophilic bacterium that is extremely resistant to radiation, UV light and desiccation, and is known as a biofouler of paper machines, forming firm, biocide-resistant biofilms on stainless steel surfaces. The compact structure of the biofilm microcolonies of D. geothermalis E50051 and its adhesion to abiotic surfaces were investigated by confocal laser scanning microscopy combined with carbohydrate-specific, fluorescently labelled lectins. The extracellular polymeric substance in D. geothermalis microcolonies was found to be a composite of at least five different glycoconjugates that contribute to adhesion, function as structural elements, serve as putative water storage, enable gliding motility and likely also provide protection. The adhesion threads that D. geothermalis appears to use to adhere to abiotic surfaces and to anchor itself to neighbouring cells were shown to be proteinaceous; four protein components of type IV pilin were identified. In addition, lectin staining showed that the adhesion threads were covered with galactose-containing glycoconjugates. The threads were not exposed on planktonic cells, indicating their primary role in adhesion and biofilm formation. I investigated by quantitative real-time PCR the presence of D. geothermalis in biofilms, deposits, process waters and paper end products from 24 paper and board mills, using primers targeted to the 16S rRNA gene of D. geothermalis. We found D. geothermalis DNA in 16 of the 120 mill samples searched, originating from 9 machines.
The total bacterial content in those samples varied between 10^7 and 3 × 10^10 16S rRNA gene copies g^-1. The proportion of D. geothermalis in the same samples was minor, 0.03–1.3% of the total bacterial content. Nevertheless, D. geothermalis may endanger paper quality, as its DNA was detected in an end product. As an antifouling method against biofilms, we studied electrochemical polarization, and two novel instruments were designed for this work. The double biofilm analyzer was designed to search for a polarization program that would eradicate or detach D. geothermalis biofilm from stainless steel under conditions simulating the paper mill environment. The Radbox instrument was designed to study the generation of reactive oxygen species (ROS) during the polarization that was effective in antifouling of D. geothermalis. We found that a cathodic character and a pulsed mode of polarization were required to detach D. geothermalis biofilm from stainless steel. We also found that the efficiency of polarization was good on submerged biofilms but poor on splash-area biofilms. Adding oxidative biocides, bromochloro-5,5-dimethylhydantoin, 2,2-dibromo-2-cyanoacetamide or peracetic acid, gave additive value with polarization, being active on splash-area biofilms. We showed that the cathodically weighted pulsed polarization that was active in removing D. geothermalis was also effective in generating reactive oxygen species, so it is possible that the antifouling effect relied on the generation of ROS on the polarized steel surfaces. An antifouling method successful against D. geothermalis, a tenacious biofouler with high tolerance to oxidative stressors, could be functional also against other biofoulers and applicable in wet industrial processes elsewhere.
Abstract:
New materials have been widely used in concrete construction to improve various properties such as impact resistance, strength and durability. Polymer-modified concrete is one such new material, developed for potential application in the construction industry. This paper describes the use of polymer latex in foundation blocks subjected to dynamic loads. Experiments were conducted using ordinary concrete and latex-modified concrete footings of three different thicknesses, for three static loads at four excitation levels. The experimental results reveal that the amplitude at resonance is reduced considerably in the latex-modified concrete footings.
Abstract:
We consider the problem of minimizing the total completion time on a single batch processing machine. The set of jobs to be scheduled can be partitioned into a number of families, where all jobs in the same family have the same processing time. The machine can process at most B jobs simultaneously as a batch, and the processing time of a batch is equal to the processing time of the longest job in the batch. We analyze the properties of an optimal schedule and develop a dynamic programming algorithm of polynomial time complexity when the number of job families is fixed. The research is motivated by the problem of scheduling burn-in ovens in the semiconductor industry.
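The batching rule described above (at most B jobs per batch, with the batch taking as long as its longest job, and every job in a batch completing when the batch does) can be illustrated with a short sketch; the job durations and batch sizes below are hypothetical, and this computes the objective for a given batching rather than implementing the paper's dynamic program.

```python
# Total completion time under the batching rule described above:
# a batch holds at most B jobs, its processing time is the longest
# job in it, and all jobs in a batch complete when the batch does.

def total_completion_time(batches, B):
    """batches: list of batches, each a list of job processing times."""
    t = 0.0          # current time on the single machine
    total = 0.0      # running sum of job completion times
    for batch in batches:
        assert len(batch) <= B, "batch exceeds machine capacity"
        t += max(batch)            # the batch takes as long as its longest job
        total += t * len(batch)    # every job in the batch finishes at time t
    return total

# Two hypothetical families with equal processing times within each family.
print(total_completion_time([[2, 2, 2], [5, 5]], B=3))  # -> 20.0
```

A dynamic program over such family-structured instances would search over how many jobs of each family to place in each batch; this sketch only evaluates one candidate schedule.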
Abstract:
The use of shear wave velocity data as a field index for evaluating the liquefaction potential of sands is receiving increased attention because both shear wave velocity and liquefaction resistance are similarly influenced by many of the same factors, such as void ratio, state of stress, stress history and geologic age. In this paper, a support vector machine (SVM) based classification approach has been used to assess liquefaction potential from actual shear wave velocity data. This approach implements an approximation of the structural risk minimization (SRM) induction principle, which aims at minimizing a bound on the generalization error of a model rather than minimizing only the mean square error over the data set. Here, SVM has been used as a classification tool to predict the liquefaction potential of a soil based on shear wave velocity. The dataset consists of information on soil characteristics such as effective vertical stress (σ′_v0), soil type and shear wave velocity (V_s), and earthquake parameters such as peak horizontal acceleration (a_max) and earthquake magnitude (M). Of the 186 available datasets, 130 are used for training and the remaining 56 for testing the model. The study indicates that SVM can successfully model the complex relationship between seismic parameters, soil parameters and liquefaction potential. The model based on soil characteristics uses σ′_v0, soil type, V_s, a_max and M as input parameters, while the model based on shear wave velocity alone uses V_s, a_max and M. It is demonstrated that V_s alone can be used to predict the liquefaction potential of a soil using a support vector machine model. (C) 2010 Elsevier B.V. All rights reserved.
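The second model's setup, predicting a binary liquefaction label from (V_s, a_max, M), can be sketched as follows. A simple perceptron stands in for the paper's SVM, and the four training records are invented for illustration; they are not from the 186 case histories used in the study.

```python
# Minimal linear-classifier sketch of a liquefaction model that predicts
# liquefies (1) vs. does not liquefy (0) from shear wave velocity V_s,
# peak horizontal acceleration a_max and earthquake magnitude M.
# A perceptron stands in for the SVM; the data below is hypothetical.

def train_perceptron(X, y, epochs=100, lr=0.1):
    w = [0.0] * len(X[0])   # one weight per input feature
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
            err = yi - pred
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b > 0 else 0

# (V_s [m/s, scaled /100], a_max [g], M): low V_s plus strong shaking
# is labelled as liquefying in this invented toy data.
X = [(1.2, 0.40, 7.5), (1.5, 0.35, 7.0), (2.8, 0.10, 6.0), (3.0, 0.05, 5.5)]
y = [1, 1, 0, 0]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])  # -> [1, 1, 0, 0]
```

An SVM would instead maximize the classification margin (the SRM idea mentioned above); the perceptron is used here only to keep the sketch dependency-free.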
Abstract:
Low-pressure MOCVD, with tris(2,4-pentanedionato)aluminium(III) as the precursor, was used in the present investigation to coat alumina onto cemented carbide cutting tools. To evaluate the MOCVD process, the efficiency in cutting operations of MOCVD-coated tools was compared with that of tools coated using the industry-standard CVD process. Three multilayer cemented carbide cutting tool inserts, viz., TiN/TiC/WC, CVD-coated Al2O3 on TiN/TiC/WC, and MOCVD-coated Al2O3 on TiN/TiC/WC, were compared in the dry turning of mild steel. Turning tests were conducted for cutting speeds ranging from 14 to 47 m/min and depths of cut from 0.25 to 1 mm, at a constant feed rate of 0.2 mm/min. The axial, tangential, and radial forces were measured using a lathe tool dynamometer for the different cutting parameters, and the machined workpieces were tested for surface roughness. The results indicate that, in most of the cases examined, the MOCVD-coated inserts produced a smoother surface finish while requiring lower cutting forces, indicating that MOCVD produces the best-performing insert, followed by the CVD-coated one. The superior performance of MOCVD-alumina is attributed to the co-deposition of carbon with the oxide, due to the very nature of the precursor used, leading to enhanced mechanical properties for cutting applications in harsh environments.
Abstract:
Sport hunting is often proposed as a tool to support the conservation of large carnivores. However, it is challenging to provide tangible economic benefits from this activity as an incentive for local people to conserve carnivores. We assessed economic gains from sport hunting and poaching of leopards (Panthera pardus), costs of leopard depredation of livestock, and attitudes of people toward leopards in Niassa National Reserve, Mozambique. We sent questionnaires to hunting concessionaires (n = 8) to investigate the economic value and relative importance of leopards compared with other key trophy-hunted species. We asked villagers (n = 158) about the number of and prices for leopards poached in the reserve and the number of goats depredated by leopards. Leopards were the mainstay of the hunting industry; a single animal was worth approximately U.S.$24,000. Most safari revenues are retained at national and international levels, but poached leopards are illegally traded locally for small amounts ($83). Leopards depredated 11 goats over 2 years in 2 of 4 surveyed villages, resulting in losses of $440 to 6 households. People in these households had negative attitudes toward leopards. Although leopard sport hunting generates larger gross revenues than poaching, illegal hunting provides higher economic benefits for the households involved in the activity. Sport-hunting revenues did not compensate for the economic losses of livestock at the household level. On the basis of our results, we propose that poaching be reduced by increasing the costs of apprehension and that the economic benefits from leopard sport hunting be used to improve community livelihoods and provide incentives not to poach.
Abstract:
In the first part of the thesis we explore three fundamental questions that arise naturally when we conceive a machine learning scenario where the training and test distributions can differ. Contrary to conventional wisdom, we show that mismatched training and test distributions can in fact yield better out-of-sample performance. This optimal performance can be obtained by training with the dual distribution, an optimal training distribution that depends on the test distribution set by the problem, but not on the target function that we want to learn. We show how to obtain this distribution in both discrete and continuous input spaces, as well as how to approximate it in a practical scenario. Benefits of using this distribution are exemplified in both synthetic and real data sets.
In order to apply the dual distribution in the supervised learning scenario where the training data set is fixed, it is necessary to use weights to make the sample appear as if it came from the dual distribution. We explore the negative effect that weighting a sample can have. The theoretical decomposition of the effect of weights on the out-of-sample error is easy to understand but not actionable in practice, as the quantities involved cannot be computed. Hence, we propose the Targeted Weighting algorithm, which determines whether, for a given set of weights, out-of-sample performance will improve in a practical setting. This is necessary because the setting assumes there are no labeled points distributed according to the test distribution, only unlabeled samples.
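The idea of weighting a fixed sample so that it behaves as if drawn from another distribution can be sketched with standard importance weighting: each point drawn from the training density p receives weight q(x)/p(x), where q is the target (e.g. dual or test) distribution. The Gaussian densities below are hypothetical and chosen only for illustration; this is not the thesis's Targeted Weighting algorithm.

```python
# Importance-weighting sketch: reweight a fixed sample drawn from p(x)
# so that weighted averages approximate expectations under a target
# distribution q(x).  Both densities here are illustrative Gaussians.
import math
import random

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

random.seed(0)
# Training sample from p = N(0, 1); target distribution q = N(1, 1).
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
weights = [gauss_pdf(x, 1.0, 1.0) / gauss_pdf(x, 0.0, 1.0) for x in xs]

# The self-normalized weighted mean estimates E_q[x] (which equals 1)
# even though every point was drawn from p.
est = sum(w * x for w, x in zip(weights, xs)) / sum(weights)
print(est)  # close to 1.0
```

The "negative effect" discussed above shows up here as variance: when p and q differ strongly, a few points carry huge weights and the effective sample size collapses, which is exactly why deciding whether to weight at all is worthwhile.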
Finally, we propose a new class of matching algorithms that can be used to match the training set to a desired distribution, such as the dual distribution (or the test distribution). These algorithms can be applied to very large datasets, and we show how they lead to improved performance on a large real dataset such as the Netflix dataset. Their computational complexity is the main reason for their advantage over previous algorithms proposed in the covariate shift literature.
In the second part of the thesis we apply Machine Learning to the problem of behavior recognition. We develop a specific behavior classifier to study fly aggression, and we develop a system that allows behavior in videos of animals to be analyzed with minimal supervision. The system, which we call CUBA (Caltech Unsupervised Behavior Analysis), detects movemes, actions, and stories from time series describing the positions of animals in videos. The method summarizes the data and provides biologists with a mathematical tool to test new hypotheses. Other benefits of CUBA include finding classifiers for specific behaviors without the need for annotation, as well as providing means to discriminate groups of animals, for example according to their genetic line.
Abstract:
Optical Coherence Tomography (OCT) is a popular, rapidly growing imaging technique with an increasing number of biomedical applications due to its noninvasive nature. However, there are three major challenges in understanding and improving an OCT system: (1) Obtaining an OCT image is not easy. It either takes a real medical experiment or requires days of computer simulation. Without much data, it is difficult to study the physical processes underlying OCT imaging of different objects simply because there aren't many imaged objects. (2) Interpretation of an OCT image is also hard. This challenge is more profound than it appears. For instance, it would require a trained expert to tell from an OCT image of human skin whether there is a lesion or not. This is expensive in its own right, but even the expert cannot be sure about the exact size of the lesion or the width of the various skin layers. The take-away message is that analyzing an OCT image even at a high level usually requires a trained expert, and pixel-level interpretation is simply unrealistic. The reason is simple: we have OCT images but not their underlying ground-truth structure, so there is nothing to learn from. (3) The imaging depth of OCT is very limited (millimeter or sub-millimeter on human tissues). While OCT utilizes infrared light for illumination to stay noninvasive, the downside is that photons at such long wavelengths can only penetrate a limited depth into the tissue before being back-scattered. To image a particular region of a tissue, photons first need to reach that region. As a result, OCT signals from deeper regions of the tissue are both weak (since few photons reach there) and distorted (due to multiple scattering of the contributing photons). This fact alone makes OCT images very hard to interpret.
This thesis addresses the above challenges by developing an advanced Monte Carlo simulation platform which is 10,000 times faster than the state-of-the-art simulator in the literature, bringing the simulation time down from 360 hours to a single minute. This powerful simulation tool not only enables us to efficiently generate as many OCT images of objects with arbitrary structure and shape as we want on a common desktop computer, but it also provides us with the underlying ground truth of the simulated images, because we dictate it at the beginning of the simulation. This is one of the key contributions of this thesis. What allows us to build such a powerful simulation tool includes a thorough understanding of the signal formation process, a careful implementation of the importance sampling/photon splitting procedure, efficient use of a voxel-based mesh system in determining photon-mesh interception, and parallel computation of the different A-scans that constitute a full OCT image, among other programming and mathematical tricks, which will be explained in detail later in the thesis.
Next we aim at the inverse problem: given an OCT image, predict/reconstruct its ground-truth structure at the pixel level. By solving this problem we would be able to interpret an OCT image completely and precisely without the help of a trained expert. It turns out that we can do much better. For simple structures we are able to reconstruct the ground truth of an OCT image more than 98% correctly, and for more complicated structures (e.g., a multi-layered brain structure) we achieve 93%. We accomplished this through extensive use of Machine Learning. The success of the Monte Carlo simulation already puts us in a strong position by providing a great deal of data (effectively unlimited), in the form of (image, truth) pairs. Through a transformation of the high-dimensional response variable, we convert the learning task into a multi-output multi-class classification problem and a multi-output regression problem. We then build a hierarchical architecture of machine learning models (a committee of experts) and train different parts of the architecture with specifically designed data sets. In prediction, an unseen OCT image first goes through a classification model to determine its structure (e.g., the number and types of layers present in the image); then the image is handed to a regression model trained specifically for that particular structure to predict the thickness of the different layers, and by doing so reconstruct the ground truth of the image. We also demonstrate that ideas from Deep Learning can be useful to further improve the performance.
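The classify-then-regress pipeline described above can be sketched as a simple dispatch: one model routes the image to a structure class, and a per-structure expert predicts the layer geometry. The models below are trivial hypothetical stand-ins, not the thesis's trained committee of experts.

```python
# Sketch of a committee-of-experts pipeline: stage 1 classifies the
# structure of an unseen image, stage 2 hands it to a regressor trained
# for that structure.  Both models here are illustrative placeholders.

def classify_structure(image):
    # Stand-in classifier: pick a layer count from a crude image property.
    return "two_layer" if len(image) < 6 else "three_layer"

regressors = {
    # One expert per structure class, each predicting layer thicknesses.
    "two_layer":   lambda img: [sum(img) / 2.0] * 2,
    "three_layer": lambda img: [sum(img) / 3.0] * 3,
}

def reconstruct(image):
    structure = classify_structure(image)            # stage 1: which structure?
    return structure, regressors[structure](image)   # stage 2: its layer thicknesses

print(reconstruct([1.0, 2.0, 3.0]))  # routed to the two-layer expert
```

The design point is that each regressor only ever sees images of one structure class, so its output dimensionality and training set can be specialized, which is what makes the multi-output regression tractable.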
It is worth pointing out that solving the inverse problem automatically improves the imaging depth, since the lower half of an OCT image (i.e., greater depth), which previously could hardly be seen, now becomes fully resolved. Interestingly, although the OCT signals constituting the lower half of the image are weak, messy, and uninterpretable to human eyes, they still carry enough information that, when fed into a well-trained machine learning model, yields precisely the true structure of the object being imaged. This is another case where Artificial Intelligence (AI) outperforms humans. To the best of the author's knowledge, this thesis is not only successful but also the first attempt to reconstruct an OCT image at the pixel level. Even attempting this kind of task would require fully annotated OCT images, and a lot of them (hundreds or even thousands). This is clearly impossible without a powerful simulation tool like the one developed in this thesis.
Abstract:
Technological innovation has made it possible to grow marine finfish in the coastal and open ocean. Along with this opportunity comes environmental risk. As a federal agency charged with stewardship of the nation's marine resources, the National Oceanic and Atmospheric Administration (NOAA) requires tools to evaluate the benefits and risks that aquaculture poses in the marine environment, to implement policies and regulations which safeguard our marine and coastal ecosystems, and to inform production designs and operational procedures compatible with marine stewardship. There is an opportunity to apply the best available science and globally proven best management practices to regulate and guide a sustainable United States (U.S.) marine finfish farming aquaculture industry. There are strong economic incentives to develop this industry, and doing so in an environmentally responsible way is possible if stakeholders, the public and regulatory agencies have a clear understanding of the relative risks to the environment and the feasible solutions to minimize, manage or eliminate those risks. This report spans many of the environmental challenges that marine finfish aquaculture faces. We believe that it will serve as a useful tool to those interested in and responsible for the industry and for safeguarding the health, productivity and resilience of our marine ecosystems. This report aims to provide a comprehensive review of some predominant environmental risks that marine finfish cage culture, as currently conducted, poses in the marine environment, and of the designs and practices now in use to address these risks in the U.S. and elsewhere. Today's finfish aquaculture industry has learned, adapted and improved to lessen or eliminate impacts on the marine habitats in which it operates. What progress has been made? What has been learned?
How have practices changed and what are the results in terms of water quality, benthic, and other environmental effects? To answer these questions we conducted a critical review of the large body of scientific work published since 2000 on the environmental impacts of marine finfish aquaculture around the world. Our report includes results, findings and recommendations from over 420 papers, primarily from peer-reviewed professional journals. This report provides a broad overview of the twenty-first century marine finfish aquaculture industry, with a targeted focus on potential impacts to water quality, sediment chemistry, benthic communities, marine life and sensitive habitats. Other environmental issues including fish health, genetic issues, and feed formulation were beyond the scope of this report and are being addressed in other initiatives and reports. Also absent is detailed information about complex computer simulations that are used to model discharge, assimilation and accumulation of nutrient waste from farms. These tools are instrumental for siting and managing farms, and a comparative analysis of these models is underway by NOAA.
Abstract:
A survey on technology planning and its implications for a useful tool catalogue for technology management was conducted. The survey provided a picture of technology planning across a broad range of company sizes, manufacturing types and sectors. It was concluded from the findings that technology planning is an important business activity across industry sectors and company types, driven by increasing competition, market requirements, regulation and technology change. The process of technology roadmapping was used to support technology strategy and planning, and could be a useful way of structuring both the use of tools in a company and a tool catalogue.
Abstract:
This paper presents an investigation into the losses in a three-phase induction motor under different pulse width modulation (PWM) excitation conditions. The impacts of Sinusoidal PWM, Space Vector PWM and Discontinuous PWM on machine losses are compared and studied. Finite element analysis simulations are employed to predict the machine losses, with a loss breakdown analysis under the different PWM schemes. Direct calorimetric measurements are utilized to verify the finite element modeling and provide direct quantification of machine losses under modern PWM techniques. © 2008 IEEE.
Abstract:
Innovation policies play an important role throughout the development process of emerging industries. However, existing policy studies view the process as a black box and fail to capture the policy-industry interactions through the process. This paper aims to develop an integrated technology roadmapping tool in order to facilitate a better understanding of policy heterogeneity at the different stages of new energy industries in China. Through a case study of the Chinese wind energy equipment manufacturing industry, this paper elaborates the dynamics between policy and the growth process of the industry. Further, this paper generalizes some Chinese specifics of the policy-industry interactions. As a practical output, this study proposes a policy-technology roadmapping framework that maps policy-market-product-technology interactions in response to the requirement for analyzing and planning the development of new industries in emerging economies (e.g. China). This paper will be of interest to policy makers, strategists, investors, and industrial experts. © 2011 IEEE.
Abstract:
Innovation policies play an important role throughout the development process of emerging industries in China. Existing policy and industry studies view the emergence process as a black box and fail to capture how the impact of policy varies along that process. This paper aims to develop a multi-dimensional roadmapping tool to better analyse the dynamics between policy and industrial growth for new industries in China. Through reviewing the emergence process of the Chinese wind turbine industry, this paper elaborates how policy and other factors influenced the emergence of this industry along its path. Further, this paper generalises some Chinese specifics of the policy-industry dynamics. As a practical output, this study proposes a roadmapping framework that generalises some patterns of policy-industry interactions for the emergence process of new industries in China. This paper will be of interest to policy makers, strategists, investors and industrial experts. Copyright © 2013 Inderscience Enterprises Ltd.