985 results for Machine-tool
Abstract:
Background: Analysing the observed differences in the incidence of or mortality from a particular disease between two situations (such as time points, geographical areas, gender or other social characteristics) can be useful for both scientific and administrative purposes. From an epidemiological and public health point of view, it is of great interest to assess the effect of demographic factors on these observed differences in order to elucidate the effect of the risk of developing a disease or dying from it. The method proposed by Bashir and Estève, which splits the observed variation into three components (risk, population structure and population size), is a common choice in practice. Results: A web-based application called RiskDiff has been implemented (available at http://rht.iconcologia.net/riskdiff.htm) to perform this kind of statistical analysis, providing text and graphical summaries. Code for the implemented R functions is also provided. An application to cancer mortality data from Catalonia is used for illustration. Conclusions: Combining epidemiological and demographic factors is crucial for analysing incidence of or mortality from a disease, especially if the population pyramids show substantial differences. The implemented tool may serve to promote and disseminate the use of this method and to inform epidemiological interpretation and decision making in public health.
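As an illustration of the kind of decomposition involved, the sketch below splits the difference in case counts between two populations sequentially into population-size, population-structure and risk components. It is a minimal sketch with hypothetical age bands, rates and totals; the exact formulas used by Bashir and Estève (and hence by RiskDiff) may differ.

import numpy as np

N1, N2 = 6.0e6, 6.5e6                              # total population sizes (hypothetical)
w1 = np.array([0.25, 0.40, 0.25, 0.10])            # age-structure proportions, situation 1
w2 = np.array([0.22, 0.38, 0.27, 0.13])            # age-structure proportions, situation 2
r1 = np.array([1.0e-5, 5.0e-5, 4.0e-4, 2.0e-3])    # age-specific rates, situation 1
r2 = np.array([1.0e-5, 4.0e-5, 3.5e-4, 1.8e-3])    # age-specific rates, situation 2

size_part      = (N2 - N1) * np.sum(w1 * r1)       # change attributable to population size
structure_part = N2 * np.sum((w2 - w1) * r1)       # change attributable to age structure
risk_part      = N2 * np.sum(w2 * (r2 - r1))       # change attributable to risk
total_change   = N2 * np.sum(w2 * r2) - N1 * np.sum(w1 * r1)

# The three components sum exactly to the observed difference in case counts.
assert np.isclose(size_part + structure_part + risk_part, total_change)
print(size_part, structure_part, risk_part, total_change)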
Abstract:
This paper presents an ePortfolio project conducted over two years in a multilingual and interdisciplinary Master's program in public discourse and communication analysis offered by the Faculty of Arts of the University of Lausanne (Switzerland). Broadly, the project, named Learn to communicate skills, offers a reflection on academic skills and their transferability to the professional world. More precisely, the aim of the project is to make students aware of the importance of reflexive learning in making their skills transferable to other contexts.
Abstract:
The aim of this work was to develop, for the needs of a metal industry company, a method by which subcontracting is developed to meet cost-efficiency targets. Clear and open operating principles aim to create the conditions for the development of the companies in the subcontracting network and for effective purchasing and production decision-making. The need to develop the part manufacturing that the machine-shop company has made under subcontract has arisen from growth and profit targets. The commissioner of the work, a manufacturer of end products in machine building for the forest industry, aims to clarify its internal production and to increase its capacity for finished products. An essential part of these goals is a focus on core competences and a relative increase in subcontracting. Consequently, the subcontracting processes must be developed to be functional and efficient. In this work, an operating model suited to the company's needs has been drawn up for developing subcontracting in line with the set targets. Concentrating subcontracting by product group is, in the long term, a choice that benefits both parties. With the measures of the model, concentrating subcontracting yields advantages in cost efficiency, in flexibility and in a clearer subcontracting field.
Abstract:
This work analyses the effect of the process on the stability of a paper machine. Two modern newsprint machines were analysed, and on that basis physics-based simulation models of both processes were built with the APROS Paper simulation software. The aim of the work is to determine how the processes of these machines differ from each other and to assess how the observed differences affect process stability. The work examines the attenuation of periodic disturbances in the process. In the simulations, pure white noise was used as the excitation, and with it the attenuation of periodic disturbances of different frequencies was analysed. The disturbance responses of the processes are presented in the frequency domain. The largest differences between the processes were found in the wire pit and its mixing dynamics. The conventional wire pit was found to behave like ideal mixers connected in series, whereas the smaller-volume flume was found to behave almost like a pure transport delay. Although the differences in process volume and in the mixing dynamics of the wire pit were very clear, only a marginal difference between the processes was observed in the attenuation of periodic disturbances, because differences in wire retention levels had the greatest effect on the simulation results. The paper machine operating at a lower wire retention level was found to attenuate process disturbances more effectively. At the same retention level, the smaller-volume process was found to attenuate slow process disturbances marginally better. For one of the studied paper machines, the effect of a change in wire-section dewatering on wire retention and on paper composition was simulated. In addition, the performance of the wire retention control was assessed. The dewatering at the blade shoe of the wire section was found to cause significant consistency and retention disturbances if the flow of solids removed through it were to double. The wire retention control was found to prevent disturbances from circulating in the process, but to transfer them directly into the web. However, the retention control was not found to be a direct source of disturbance.
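The contrast between the two wire-pit behaviours can be sketched with textbook transfer functions: a chain of ideal mixers attenuates high-frequency periodic disturbances, while a pure transport delay passes their amplitude unchanged. This is only an illustrative model, not the APROS Paper simulation; the residence time and frequencies below are hypothetical.

import numpy as np

def mixer_chain_gain(omega, tau_total, n):
    # Amplitude ratio of n ideal mixers in series with total residence time tau_total.
    s = 1j * omega
    return np.abs(1.0 / (1.0 + s * tau_total / n) ** n)

def transport_delay_gain(omega):
    # A pure transport delay only shifts phase; disturbance amplitude passes unattenuated.
    return np.ones_like(omega)

omega = np.logspace(-3, 0, 5)            # disturbance frequencies, rad/s (hypothetical)
tau = 30.0                               # assumed total residence time of the wire pit, s
print(mixer_chain_gain(omega, tau, 3))   # falls towards zero at higher frequencies
print(transport_delay_gain(omega))       # stays at one across all frequencies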
Abstract:
The use of process simulation software has become more common in mapping paper industry processes, and such software has long been among the tools of Pöyry Engineering Oy in process design. The objective of this work was to investigate the use of process simulation software in Finnish paper mills and to assess the future prospects of process simulation in forest industry engineering services in order to develop the business. The theoretical part of the work covers, among other things, what process simulation is, why simulation is done, and what the benefits and challenges of simulation are. The theoretical part presents the most commonly used process simulation software, the progress of a simulation project, and the requirements for productising process simulation. In the experimental part, the use of process simulation software in Finnish paper mills was investigated by means of a survey. The survey was sent to all of the most important paper mills in Finland. It addressed, among other things, which programs are used, what has been simulated, what still needs to be simulated, and how necessary process simulation is considered to be. The results show that process simulation has been used at all the Finnish paper mills that responded to the survey. Most of the simulations have been made of machine lines and of stock and water systems. Simulation of energy flows is considered the most important future target. There is room for improvement in the long-term utilisation and maintenance of simulation models, for which acquiring simulation as a purchased service is the most likely option for the mills. The conclusion is that the mills have a need for process simulation. According to the survey results, the climate is favourable and simulation is seen as a necessary tool. The marketing of process simulation, in addition to a separate service product, should be developed so that maintenance of the simulation model continues after the project as a local service. Marketing should take place at the start of the project or during it. From the range of simulation programs, an engineering office should select the simulation software that suits it best. In special cases, the acquisition of other programs should be considered according to the customer's wishes.
Abstract:
Although severe patient-ventilator asynchrony is frequent during invasive and non-invasive mechanical ventilation, diagnosing such asynchronies usually requires the presence at the bedside of an experienced clinician to assess the tracings displayed on the ventilator screen, thus explaining why evaluating patient-ventilator interaction remains a challenge in daily clinical practice. In the previous issue of Critical Care, Sinderby and colleagues present a new automated method to detect, quantify, and display patient-ventilator interaction. In this validation study, the automatic method is as efficient as experts in mechanical ventilation. This promising system could help clinicians extend their knowledge about patient-ventilator interaction and further improve assisted mechanical ventilation.
Abstract:
Bibliographic reference: Rol, 58923
Abstract:
Bibliographic reference: Rol, 58924
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash-flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
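A minimal sketch of the general approach, not the authors' code, is given below: flattened EEMs are clustered with a SOM (the third-party MiniSom package is assumed to be available), and the component planes of the trained map are correlated feature by feature so that excitation-emission pairs that co-vary can be grouped into candidate fluorescence components. The grid sizes and the random data are placeholders.

import numpy as np
from minisom import MiniSom   # third-party SOM implementation (assumed available)

n_ex, n_em = 50, 40                               # hypothetical excitation/emission grid
rng = np.random.default_rng(0)
eems = rng.random((200, n_ex * n_em))             # placeholder for flattened, normalised EEMs

som = MiniSom(10, 10, n_ex * n_em, sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(eems, 5000)                      # unsupervised training of the 10x10 map

# Each excitation-emission pair has a 10x10 "component plane" of codebook weights;
# strongly correlated planes point to a common underlying fluorescence component.
planes = som.get_weights().reshape(10 * 10, n_ex * n_em)
plane_corr = np.corrcoef(planes.T)                # pairwise correlations between planes
print(plane_corr.shape)                           # (n_ex * n_em, n_ex * n_em)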
Abstract:
BACKGROUND: The Spiritual Distress Assessment Tool (SDAT) is a 5-item instrument developed to assess unmet spiritual needs in hospitalized elderly patients and to determine the presence of spiritual distress. The objective of this study was to investigate the SDAT's psychometric properties. METHODS: This cross-sectional study was performed in a Geriatric Rehabilitation Unit. Patients (N = 203), aged 65 years and over with a Mini Mental State Exam score ≥ 20, were consecutively enrolled over a 6-month period. Data on health, functional, cognitive, affective and spiritual status were collected upon admission. Interviews using the SDAT (score from 0 to 15, higher scores indicating higher distress) were conducted by a trained chaplain. Factor analysis, measures of internal consistency (inter-item and item-to-total correlations, Cronbach's α), and reliability (intra-rater and inter-rater) were performed. Criterion-related validity was assessed using the Functional Assessment of Chronic Illness Therapy-Spiritual well-being (FACIT-Sp) and the question "Are you at peace?" as criterion standards. Concurrent and predictive validity were assessed using the Geriatric Depression Scale (GDS), occurrence of a family meeting, hospital length of stay (LOS) and destination at discharge. RESULTS: SDAT scores ranged from 1 to 11 (mean 5.6 ± 2.4). Overall, 65.0% (132/203) of the patients reported some spiritual distress on the SDAT total score and 22.2% (45/203) reported at least one severe unmet spiritual need. A two-factor solution explained 60% of the variance. Inter-item correlations ranged from 0.11 to 0.41 (eight out of ten with P < 0.05). Item-to-total correlations ranged from 0.57 to 0.66 (all P < 0.001). Cronbach's α was acceptable (0.60). Intra-rater and inter-rater reliabilities were high (intraclass correlation coefficients ranging from 0.87 to 0.96). The SDAT correlated significantly with the FACIT-Sp, "Are you at peace?", GDS (Rho -0.45, -0.33, and 0.43, respectively, all P < .001), and LOS (Rho 0.15, P = .03). Compared with patients showing no severely unmet spiritual need, patients with at least one severe unmet spiritual need had higher odds of occurrence of a family meeting (adjusted OR 4.7, 95% CI 1.4-16.3, P = .02) and were more often discharged to a nursing home (13.3% vs 3.8%; P = .027). CONCLUSIONS: The SDAT has acceptable psychometric properties and appears to be a valid and reliable instrument to assess spiritual distress in elderly hospitalized patients.
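For readers unfamiliar with the internal-consistency statistics reported above, the sketch below computes Cronbach's α and item-to-total correlations for a 5-item instrument. The scores are randomly generated placeholders, not SDAT data, and the study's own analysis may have been carried out differently.

import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(0, 4, size=(203, 5)).astype(float)   # 203 patients x 5 items (placeholder)

def cronbach_alpha(items):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def item_total_correlations(items):
    # Correlation of each item with the summed total score.
    total = items.sum(axis=1)
    return [float(np.corrcoef(items[:, j], total)[0, 1]) for j in range(items.shape[1])]

print(cronbach_alpha(scores))
print(item_total_correlations(scores))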
Abstract:
The aim of this study is to present an Activity-Based Costing spreadsheet tool for analysing logistics costs. The tool can be used both by customer companies and by logistics service providers. The study discusses the influence of different activity models on costs. Additionally, the paper discusses logistical performance across the total supply chain. The study is carried out using an analytical research approach, supplemented by literature material. The cost structure analysis was based on the theory of activity-based management. The study was limited to spare-part logistics in the machine-shop industry. The outlines of logistics services and logistical performance discussed in this report are based on the new logistics business concept (LMS-concept), which has been presented earlier in the Valssi project. One of the aims of this study is to increase awareness of the effect of different activity models on logistics costs. The report paints an overall picture of the business environment and the requirements for the new logistics concept.
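A minimal sketch of the activity-based costing logic such a spreadsheet tool implements is shown below: costs pooled per activity are converted into cost-driver rates and then traced to a cost object by its driver consumption. The activities, drivers and figures are hypothetical, not taken from the study.

# Cost pools per activity and total cost-driver volumes (hypothetical figures)
activity_costs = {"order handling": 12000.0, "picking": 8000.0, "transport": 20000.0}
driver_volumes = {"order handling": 600, "picking": 4000, "transport": 250}   # orders, order lines, shipments

# Cost-driver rate = activity cost pool / total driver volume
rates = {a: activity_costs[a] / driver_volumes[a] for a in activity_costs}

# Costs are traced to a cost object (e.g. one spare-part customer) by its driver consumption
consumption = {"order handling": 45, "picking": 310, "transport": 18}
customer_cost = sum(rates[a] * consumption[a] for a in consumption)
print(rates)
print(round(customer_cost, 2))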
Abstract:
In this thesis, membrane filtration of paper machine clear filtrate was studied. The aim of the study was to find membrane processes able to produce, economically, water of sufficient purity from paper machine white water or its save-all clarified fractions for reuse in the paper machine short circulation. Factors affecting membrane fouling in this application were also studied. The thesis gives an overview of experiments done on a laboratory and a pilot scale with several different membranes and membrane modules. The results were judged by the obtained flux, the fouling tendency and the permeate quality assessed with various chemical analyses. It was shown that membrane modules which used a turbulence promoter of some kind gave the highest fluxes. However, the results showed that the greater the reduction in the concentration polarisation layer caused by increased turbulence in the module, the smaller the reductions in measured substances. Of the micro-, ultra- and nanofiltration membranes tested, only nanofiltration membranes produced permeate whose quality was very close to that of the chemically treated raw water used as fresh water in most paper mills today, and which should thus be well suited for reuse as shower water in both the wire and press sections. It was also shown that a one-stage nanofiltration process was more effective than processes in which micro- or ultrafiltration was used as pretreatment for nanofiltration. It was generally observed that acidic pH, high organic matter content, the presence of multivalent ions, hydrophobic membrane material and a high membrane cut-off increased the fouling tendency of the membranes.
Abstract:
This work deals with the cooling of high-speed electric machines, such as motors and generators, through an air gap. It consists of numerical and experimental modelling of gas flow and heat transfer in an annular channel. Velocity and temperature profiles are modelled in the air gap of a high-speed test machine. Local and mean heat transfer coefficients and total friction coefficients are obtained for a smooth rotor-stator combination over a large velocity range. The aim is to solve the heat transfer numerically and experimentally. The FINFLO software, developed at Helsinki University of Technology, has been used for the flow solution, and the commercial IGG and Field view programs for grid generation and post-processing. The annular channel is discretised as a sector mesh. Calculation is performed with a constant mass flow rate at six rotational speeds. The effect of turbulence is calculated using three turbulence models. The friction coefficient and velocity factor are obtained via the total friction power. The first part of the experimental section consists of finding the proper sensors and calibrating them in a straight pipe. After preliminary tests, an RdF sensor was glued onto the stator and rotor surfaces. Telemetry is needed to measure the heat transfer coefficients at the rotor. The mean heat transfer coefficients were measured in a test machine at four cooling-air mass flow rates over a wide Couette Reynolds number range. The calculated friction and heat transfer coefficients are compared with measured and semi-empirical data. Heat is transferred from the hotter stator and rotor surfaces to the cooler air flow in the air gap, not from the rotor to the stator via the air gap, although the stator temperature is lower than the rotor temperature. The calculated friction coefficients fit well with the semi-empirical equations and preceding measurements. At a constant mass flow rate the rotor heat transfer coefficient reaches a saturation point at a higher rotational speed, while the heat transfer coefficient of the stator grows uniformly. The magnitudes of the heat transfer coefficients are almost constant across the different turbulence models. The calibration of sensors in a straight pipe is only an advisory step in the selection process. Telemetry was tested in the pipe conditions and compared with the same measurements made with a plain sensor. Over the velocity range considered, the measured heat transfer coefficients and those from the semi-empirical equation are higher than the numerical results. Friction and heat transfer coefficients are presented for a large velocity range in the report. The goals are reached acceptably using numerical and experimental research. The next challenge is to achieve results for grooved stator-rotor combinations. The work also contains results for an air gap with a grooved stator with 36 slots. The velocity field given by the numerical method does not match the estimated flow mode in every respect. The absence of secondary Taylor vortices is evident when using time-averaged numerical simulation.
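For orientation, the sketch below applies assumed textbook definitions, not the thesis' FINFLO model, to recover the Couette Reynolds number of the gap flow from the rotor tip speed and a heat transfer coefficient from a given Nusselt number; the geometry, rotational speed and Nusselt number are hypothetical.

import math

r = 0.05          # rotor radius, m (hypothetical)
delta = 0.002     # radial air-gap length, m (hypothetical)
rpm = 30000.0     # rotational speed (hypothetical)
nu = 1.6e-5       # kinematic viscosity of air, m^2/s
k_air = 0.026     # thermal conductivity of air, W/(m K)

omega = 2.0 * math.pi * rpm / 60.0       # angular speed, rad/s
u_tip = omega * r                        # rotor surface speed, m/s
re_couette = u_tip * delta / nu          # Couette Reynolds number of the gap flow

Nu = 40.0                                # example Nusselt number (from a correlation or CFD)
h = Nu * k_air / (2.0 * delta)           # heat transfer coefficient, W/(m^2 K), taking D_h = 2*delta
print(re_couette, h)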