992 results for Machine theory.
Abstract:
The objective of this Master's thesis was to survey modern machine-based cutting methods for sheet metal and to study their suitability for the company's needs. In the target company, the need for investment was divided between a productivity investment, a replacement investment and a strategic investment. The goal was to create an investment path by which methods unsuited to the company's production could be ruled out. The literature part of the thesis reviews theory related to modern sheet-metal cutting methods in general and to the planning and execution of an investment project. In addition, the basic principles of profitability and cost accounting related to investments are discussed. The empirical part examined the principles of the company's sheet-metal part manufacturing by means of a current-state analysis, on the basis of which the most suitable of the modern methods on the market was determined for the company. Based on the study, laser cutting was the most suitable method. In terms of initial investment the laser was the most expensive of the alternatives, but on the basis of its usability, efficiency, flexibility and other characteristics it best suited the needs of production. The most significant results showed that the profitability of the investment depended on the utilization rate achieved for the machine: the efficiency of the new machines would shorten production lead times, but without a sufficient capacity utilization rate the unit cost of the parts would rise. The final results and recommendations are presented at the end of the thesis.
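The profitability mechanism described above (unit cost rising when machine utilization falls) can be sketched with a small calculation; all figures below are invented for illustration and are not from the thesis.

```python
# Hypothetical illustration of the utilization effect: fixed costs are
# spread over the parts actually produced, so low utilization inflates
# the per-part cost. All numbers are invented, not from the thesis.

def unit_cost(fixed_cost_per_year, variable_cost_per_part,
              capacity_parts_per_year, utilization):
    """Own cost per part at a given capacity utilization rate (0..1)."""
    parts_produced = capacity_parts_per_year * utilization
    return fixed_cost_per_year / parts_produced + variable_cost_per_part

# Halving utilization doubles the fixed-cost share carried by each part.
low  = unit_cost(200_000.0, 3.0, 50_000, 0.4)   # 40 % utilization
high = unit_cost(200_000.0, 3.0, 50_000, 0.8)   # 80 % utilization
print(low, high)   # 13.0 vs 8.0 EUR per part
```

The sketch shows why a fast but expensive machine such as a laser cutter only pays off with a sufficient duty cycle.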
Abstract:
The topic of this thesis is the simulation of a combination of several control and data assimilation methods meant for controlling the quality of paper in a paper machine. Papermaking is a very complex process and the information obtained from the web is sparse: a paper web scanner can only measure a zigzag path on the web. An assimilation method is needed to produce estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web, and quality control is based on these estimates. There is an increasing need for intelligent methods to assist in data assimilation. The target of this thesis is to study how such intelligent assimilation methods affect paper web quality. The work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project. The simulator is a valuable tool for comparing different assimilation methods. The thesis compares four of them: a first-order Bayesian model estimator, a higher-order Bayesian estimator based on an ARMA model, a Fourier-transform-based Kalman filter estimator, and a simple block estimator. The last can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one, and the Kalman and ARMA estimators seem to be best in overall performance.
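The Kalman-filter idea behind one of the compared estimators can be illustrated with a minimal scalar sketch; the thesis simulator and its MD/CD models are far richer, and the state, noise levels and units below are invented for illustration only.

```python
import random

# Minimal one-dimensional Kalman filter: estimate a slowly varying
# machine-direction (MD) level from noisy scanner samples. This is a
# sketch of the principle, not the thesis implementation.

def kalman_1d(measurements, q=1e-4, r=0.25):
    """Return the final state estimate after filtering all samples.

    q: process-noise variance (how fast the level may drift)
    r: measurement-noise variance of a single scanner sample
    """
    x, p = 0.0, 1.0            # state estimate and its variance
    for z in measurements:
        p += q                 # predict: process noise inflates variance
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update with the innovation
        p *= (1.0 - k)
    return x

random.seed(0)
true_level = 80.0              # e.g. basis weight in g/m^2 (invented)
samples = [true_level + random.gauss(0, 0.5) for _ in range(200)]
print(kalman_1d(samples))      # close to 80
```

With constant noise statistics the gain settles to a steady state, so the filter behaves like an exponentially weighted average of the scanner samples.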
Abstract:
This Master's thesis develops, for ABB Oy Drives, a method for predicting the price of sheet-metal parts and of assemblies composed of them without detailed manufacturing geometry data. The work is part of the Tekes-funded Piirre 2.0 project. The theory part briefly defines sheet-metal products and their manufacturing methods. A broader theoretical review covers various methods for predicting the manufacturing costs of sheet-metal products, with an emphasis on the use of regression analysis. In the practical part, the performance of a Finn-Power LP6 sheet-metal working center is determined and a machining-time calculator is built on the collected data. In addition, regression analyses are formed on the basis of sheet-metal products manufactured by two different subcontractors. Regression techniques are used to find the parameters that strongly affect costs, and a formula for predicting manufacturing costs is derived. Finally, the agreement between the theoretical and practical parts is assessed and reasons for the observed differences are sought. Along with possibilities for exploiting the research results, some suggestions for further development are also presented.
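The regression approach described, fitting cost against easily available part parameters, can be sketched as an ordinary least-squares fit; the predictor variables (cutting length, hole count) and all data below are invented for illustration, not taken from the thesis.

```python
import numpy as np

# Toy cost model: cost = b0 + b1 * cutting_length + b2 * hole_count.
# The design matrix columns are [1, cutting_length_mm, hole_count];
# data are constructed for the sketch so the true coefficients are
# [2.0, 0.02, 0.5].
X = np.array([[1,  500,  4],
              [1,  900, 10],
              [1,  300,  2],
              [1, 1200, 16],
              [1,  700,  6]], dtype=float)
y = np.array([14.0, 25.0, 9.0, 34.0, 19.0])   # cost in EUR

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares

def predict(cutting_length, hole_count):
    """Predicted cost for a part from its two parameters."""
    return float(beta @ np.array([1.0, cutting_length, hole_count]))

print(np.round(beta, 3))       # fitted coefficients
print(predict(800, 8))         # prediction for an unseen part
```

In the thesis the significant parameters are found from real subcontractor data; the sketch only shows the mechanics of deriving a cost formula by regression.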
Abstract:
This Master's thesis presents the theory of robotization and the peripheral devices, alternatives, operation and safety of a robot cell. A flexible manufacturing cell is an automated manufacturing system in which several interconnected machines share a common control system. The components of a cell are the production devices and machines, control systems, monitoring devices and sensors, actuators and regulating devices, and a programming system. A flexible manufacturing cell can be developed by improving employee training, programming, setup work and the layout. The development task of this thesis concerns improving the production efficiency of Metalliset Oy, a subcontracting machine shop located in Heinävesi, by means of robotization. The development options chosen for Metalliset Oy's flexible press-brake cell were improving the efficiency of the current system, creating a new production cell using existing equipment, and developing a production cell of the future. In the comparison, creating a new production cell using existing equipment proved to be the best option.
Abstract:
The aim of this thesis is to describe the design problems of a hybrid drive and the advantages and difficulties related to it. Possible hybrid constructions are reviewed, along with the benefits of parallel, series and series-parallel hybrids. In the thesis, analytical and finite element calculations of permanent magnet synchronous machines with embedded magnets were carried out, the finite element calculations using Cedrat's Flux 2D software. The machine is planned to be used as a motor-generator in a low-power parallel hybrid vehicle. The boundary conditions for the design were obtained from Lucas-TVS Ltd., India. The design requirements, briefly:
• The system DC voltage level is 120 V, which implies Uphase = 49 V (RMS) in a three-phase system.
• A power output of 10 kW at a base speed of 1500 rpm (a torque of 65 Nm) is desired.
• The maximum outer diameter should not exceed 250 mm, and the maximum core length should not exceed 40 mm.
The main difficulty the author met was the dimensional restrictions. After several possible constructions had been designed and analyzed, they were compared and the final design was selected. A dimensioned, detailed design was then performed. The effects of different parameters, such as the number of poles, the number of turns and the magnetic geometry, are discussed. The best modification offers a considerable reduction of volume.
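The stated requirement figures can be checked with a back-of-the-envelope calculation; this is a sketch assuming space-vector modulation for the voltage figure, whereas the thesis relies on full analytical and FEM models.

```python
from math import pi, sqrt

# Back-of-the-envelope check of the stated design figures.
V_DC = 120.0        # DC link voltage [V]
P_OUT = 10e3        # rated power output [W]
N_BASE = 1500.0     # base speed [rpm]

# Maximum phase RMS voltage from a DC link, assuming space-vector
# modulation: U_phase = V_DC / sqrt(6).
u_phase = V_DC / sqrt(6)

# Rated torque at base speed: T = P / omega.
omega = 2 * pi * N_BASE / 60.0    # mechanical angular speed [rad/s]
torque = P_OUT / omega

print(round(u_phase, 1))   # 49.0
print(round(torque, 1))    # 63.7
```

The computed 49.0 V matches the abstract's Uphase exactly, and 63.7 Nm is close to the stated 65 Nm (the thesis figure presumably includes some rounding or margin).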
Abstract:
The theory part of this Master's thesis introduces fibres with high tensile strength and elongation used in the production of paper or board. Strong speciality papers are made of bleached softwood long-fibre pulp. The aim of the thesis is to find new fibres suitable for papermaking that increase tensile strength, elongation, or both. The study describes how fibres bond and what kinds of fibres give the strongest bonds in the fibre matrix. The fibres used in the manufacturing of non-wovens are long and elastic, longer than softwood cellulose fibres. The end applications of non-wovens and speciality papers are often the same, for instance wet napkins or filter media. The study examines which fibres are used in non-wovens and whether the same fibres could be added to cellulose pulp as reinforcement fibres; what blending these fibres into cellulose would require; how they would bond with cellulose; and whether binding agents or thermal bonding, such as hot calendering, would be necessary. The following fibres are presented: viscose, polyester, nylon, polyethylene, polypropylene and bicomponent fibres. In the empirical part of the study, the most suitable new fibres are selected for making handsheets in the laboratory. The test fibres, viscose (Tencel), polypropylene and polyethylene, are blended with long-fibre cellulose. Based on the technical values measured from the sheets, the study proposes how to continue trials on a paper machine with viscose, polyester, bicomponent and polypropylene fibres.
Abstract:
Quantum chemical calculations for the group 14 elements of the Periodic Table (C, Si, Ge, Sn, Pb) and their functional groups have been carried out using Density Functional Theory (DFT) based reactivity descriptors such as group electronegativities, hardness and softness. DFT calculations were performed for a large series of tetracoordinated Sn compounds of the CH3SnRR'X type, where X is a halogen and R and R' are alkyl, halogenated alkyl, alkoxy, or alkylthio groups. The results were interpreted in terms of the calculated electronegativity and hardness of the SnRR'X groups, applying a methodology previously developed by Geerlings and coworkers (J. Phys. Chem. 1993, 97, 1826). These calculations made it possible to observe regularities in how the nature of the organic groups RR' and the inorganic group X influences the electronegativities and hardness of the SnRR'X groups; a very good correlation was found between the electronegativity of the fragment and experimental 119Sn chemical shifts, a property that sensitively reflects changes in the valence electronic structure of molecules. This work was complemented with a study of some compounds of the EX and ER types, where E = C, Si, Ge, Sn and R = CH3, H, performed to determine the influence of the central atom on the electronegativity and hardness of the molecules, and whether these properties are mainly affected by the type of ligand bound to the central atom. All calculations used the B3PW91 functional together with the 6-311++G** basis set for the H, C, Si, Ge, F, Cl and Br atoms and the 3-21G basis set for the Sn and I atoms.
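The descriptors named above have simple finite-difference definitions in conceptual DFT, which can be sketched directly; the numerical example uses approximate experimental atomic values for chlorine, not results from the paper.

```python
# Conceptual-DFT reactivity descriptors in the finite-difference
# approximation, from the ionization energy I and electron affinity A:
#   electronegativity (Mulliken):  chi = (I + A) / 2
#   chemical hardness:             eta = (I - A) / 2
#   softness:                      S   = 1 / (2 * eta)
# (Note: some conventions define eta and S without the factor of 2.)

def descriptors(ionization_energy, electron_affinity):
    """Return (chi, eta) in the same units as the inputs (e.g. eV)."""
    chi = (ionization_energy + electron_affinity) / 2.0
    eta = (ionization_energy - electron_affinity) / 2.0
    return chi, eta

# Example: chlorine atom, approximate experimental values in eV
# (I ~ 12.97 eV, A ~ 3.62 eV) - illustrative only.
chi, eta = descriptors(12.97, 3.62)
print(chi, eta)
```

For the SnRR'X fragments, the paper computes analogous group values from DFT energies of the charged and neutral fragments rather than from atomic experimental data.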
Abstract:
Leibniz's conception of bodies seems to be a puzzling theory: bodies are seen both as aggregates of monads and as well-founded phenomena, which has initiated controversy and unending discussion. The paper attempts to resolve the apparent inconsistencies by a new and formally spirited reconstruction of Leibniz's theory of monads and perception, on the one hand, and a (re-)formulation and sharpening of his concept of preestablished harmony, on the other. Preestablished harmony is modelled basically as a covariation between the monadic and the ideal realm.
Abstract:
According to the theory of language of the young Benjamin, the primary task of language is not the communication of contents, but to express itself as a "spiritual essence" in which human beings also take part. The conception according to which language is a medium for signifying something outside itself leads to a necessary decrease of its original strength, and is therefore called bürgerlich (bourgeois) by Benjamin. The names of human language are remainders of an archaic state in which things were not yet mute and had a language of their own. Benjamin also suggests that all the arts recall the original language of things, as they make objects "speak" in the form of sounds, colors, shapes, etc. This relationship between the arts as reminders of the "language of things" and the possible reconciliation of mankind with itself and with nature was developed by Theodor Adorno in several of his writings, especially the Aesthetic Theory, where the artwork is ultimately conceived as a construct pervaded by "language" in the widest sense - not in the "bourgeois" one.
Abstract:
As a discipline, logic is arguably constituted of two main sub-projects: formal theories of argument validity on the basis of a small number of patterns, and theories of how to reduce the multiplicity of arguments in non-logical, informal contexts to the small number of patterns whose validity is systematically studied (i.e. theories of formalization). Regrettably, we now tend to view logic 'proper' exclusively as what falls under the first sub-project, to the neglect of the second, equally important sub-project. In this paper, I discuss two historical theories of argument formalization: Aristotle's syllogistic theory as presented in the "Prior Analytics", and medieval theories of supposition. They both illustrate this two-fold nature of logic, containing in particular illuminating reflections on how to formalize arguments (i.e. the second sub-project). In both cases, the formal methods employed differ from the usual modern technique of translating an argument in ordinary language into a specially designed symbolism, a formal language. The upshot is thus a plea for a broader conceptualization of what it means to formalize.
Abstract:
The thesis analyses Norman Fairclough's theory of critical discourse analysis and the criticism directed at it. The aim is to reconcile these differing views and to offer solutions to one central problem of critical discourse analysis, namely the inadequacy of its emancipatory dimension (the identification and resolution of social grievances). The possibilities that emerge from the theoretical part are applied to text analysis. The object of study is the text Rebuilding America's Defenses: Strategy, Forces and Resources For a New Century and, to some extent, the organization that produced it, the Project for the New American Century. These are examined above all as social phenomena and in relation to each other. The greatest problems of Fairclough's model turn out to be a traditional conception of language, according to which the abstract, internal relations of the language system are the most important, and an ideological confrontation as the starting point of critique. The former leads to an unsatisfactory capacity of linguistic findings to explain social observations, and the latter to a political or worldview debate that does not allow new insights. The conclusion of the thesis is that by focusing on subject matter instead of linguistic structure, and by understanding the producer of a text as an individual, bounded social actor, analysis can gain openness and precision. Critical discourse analysis needs such a view to support its linguistic analyses and to find a new kind of relevance.
Abstract:
The switched reluctance technology is probably best suited to industrial low-speed or zero-speed applications where the power can be small but the torque, or the force in linear-movement cases, may be relatively high. Because of its simple structure, the SR motor is an interesting alternative for low-power applications where pneumatic or hydraulic linear drives are to be avoided. This study analyses the basic parts of an LSR motor, the two mover poles and one stator pole, which form the "basic pole pair" in linear-movement transversal-flux switched-reluctance motors. The static properties of the basic pole pair are modelled and the basic design rules are derived. The models developed are validated with experiments: a one-sided one-pole-pair transversal-flux switched-reluctance linear-motor prototype is demonstrated and its static properties are measured. The modelling of the static properties is performed with FEM calculations. Two-dimensional models are accurate enough to capture the static key features for the basic dimensioning of LSR motors; three-dimensional models must be used to obtain the most accurate calculations of the static traction force production. The developed dimensioning and modelling methods, which were systematically validated by laboratory measurements, are the most significant contributions of this thesis.
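The static traction force that the thesis models with FEM has a simple first-cut estimate in the magnetically linear (unsaturated) case, which can be sketched as follows; the numbers are invented for illustration, and real designs require the 2D/3D FEM models described above because of saturation and fringing.

```python
# First-cut static traction force of a switched-reluctance linear
# motor: for a magnetically linear circuit, the force equals the
# coenergy change per unit displacement, F = 1/2 * i^2 * dL/dx.
# Saturation and fringing, which FEM captures, are ignored here.

def reluctance_force(current, inductance_slope):
    """Traction force [N] for phase current [A] and dL/dx [H/m]."""
    return 0.5 * current**2 * inductance_slope

# Example: 10 A excitation and an inductance slope of 0.8 H/m
# (both values invented for the sketch).
print(reluctance_force(10.0, 0.8))   # 40.0 N
```

The quadratic dependence on current explains why SR machines can deliver high force at low speed from a simple structure.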
Abstract:
This paper is a historical companion to a previous one, in which the so-called abstract Galois theory, as formulated by the Portuguese mathematician José Sebastião e Silva, was studied (see da Costa, Rodrigues (2007)). Our purpose is to present some applications of abstract Galois theory to higher-order model theory, to discuss Silva's notion of expressibility, and to outline a classical Galois theory that can be obtained inside the two versions of the abstract theory, those of Mark Krasner and of Silva. Some comments are made on the universal theory of (set-theoretic) structures.
Abstract:
When Hume, in the Treatise of Human Nature, began his examination of the relation of cause and effect, in particular of the idea of necessary connection which is its essential constituent, he identified two preliminary questions that should guide his research: (1) for what reason do we pronounce it necessary that every thing whose existence has a beginning should also have a cause, and (2) why do we conclude that such particular causes must necessarily have such particular effects? (1.3.2, 14-15) Hume observes that our belief in these principles can result neither from an intuitive grasp of their truth nor from a reasoning that could establish them by demonstrative means. In particular, with respect to the first, Hume examines and rejects some arguments with which Locke, Hobbes and Clarke tried to demonstrate it, and suggests, by exclusion, that the belief we place in it can only come from experience. Somewhat surprisingly, however, Hume does not proceed to show how that derivation from experience could be made, but proposes instead to move directly to an examination of the second principle, saying that it will "perhaps, be found in the end, that the same answer will serve for both questions" (1.3.3, 9). Hume's answer to the second question is well known, but the first question is never answered in the rest of the Treatise, and it is even doubtful that it could be, which would explain why Hume simply chose to remove any mention of it when he recompiled his theses on causation in the Enquiry concerning Human Understanding. Given this situation, an interesting question that naturally arises is to investigate the relations of logical or conceptual implication between these two principles. Hume seems to have thought that an answer to (2) would also be sufficient to provide an answer to (1). Henry Allison, in turn, argued (in Custom and Reason in Hume, p. 94-97) that the two questions are logically independent.
My proposal here is to try to show that there is indeed a logical dependency between them, but that the implication runs, rather, from (1) to (2). If accepted, this result may be particularly interesting for an interpretation of the scope of the so-called "Kant's reply to Hume" in the Second Analogy of Experience, which is structured as a proof of the a priori character of (1), but whose implications for (2) remain controversial.