872 results for Online and off-line diagnosis and monitoring methods
Abstract:
This thesis covers the comparison and selection of control hardware for a parallel-structure robot, as well as the formulation of the principles of a condition monitoring system for that robot. The control hardware comprises an industrial computer and a fieldbus. Both the computer and the bus are covered by a theory part and a more detailed selection part. The theory part explains the operating principles of the devices in more detail. The selection part explains why a particular device was chosen for use in the control of the robot. Condition monitoring theory and the condition monitoring methods for the parallel-structure robot form the second part of the thesis. The theory part contains a general account of failure and monitoring. The condition monitoring methods for this special robot are presented in a specific order: first, the possible fault situations are presented; then, the detection of these faults is illustrated.
Abstract:
The purpose of this thesis was to design a condition monitoring system for two glass wool production lines. In addition to the design process, the thesis presents various condition monitoring methods. The thesis begins by describing different condition monitoring methods that can be used to follow the operating condition of various devices and machines. Particular attention is paid to vibration measurements, which are becoming increasingly common in industrial condition monitoring. The condition monitoring system designed in this thesis is based on five different methods: vibration measurement, temperature measurement with a thermal camera, temperature measurement with a portable meter, listening with an electronic stethoscope, and monitoring the condition of rotating parts with a stroboscope. The design of the condition monitoring system was carried out in several stages. First, the devices most critical to production and their possible failure modes were surveyed. Next, suitable condition monitoring methods were selected and a measurement plan was drawn up, specifying the measurements to be performed on each device and the measurement intervals. Finally, the thesis presents a few example cases of the use of the condition monitoring methods and discusses possible future development opportunities.
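In practice, the vibration-based part of such a system typically reduces to trending a broadband level per measurement point and comparing it against an alarm limit. The following minimal Python sketch illustrates that idea only; the sample rate, the 4.5 mm/s alarm limit and the synthetic signal are assumptions for illustration, not values from the thesis.

```python
# Minimal sketch: trend one broadband vibration level (velocity RMS) against
# an assumed alarm limit. Sample rate, limit and signal are illustrative.
import numpy as np
from scipy import signal

FS = 10_000       # accelerometer sample rate [Hz] (assumed)
ALARM_MM_S = 4.5  # alarm limit for velocity RMS [mm/s] (assumed)

def velocity_rms(acc_m_s2: np.ndarray, fs: int = FS) -> float:
    """Integrate acceleration to velocity and return the broadband RMS in mm/s."""
    sos = signal.butter(4, 10, btype="highpass", fs=fs, output="sos")
    acc = signal.sosfilt(sos, acc_m_s2 - acc_m_s2.mean())  # remove DC and drift
    vel = np.cumsum(acc) / fs          # rectangular integration -> velocity [m/s]
    vel = signal.sosfilt(sos, vel)     # suppress drift introduced by integration
    return float(np.sqrt(np.mean(vel ** 2)) * 1000.0)

# Example: a synthetic 25 Hz imbalance component plus measurement noise.
t = np.arange(0, 2.0, 1.0 / FS)
acc = 1.5 * np.sin(2 * np.pi * 25 * t) + 0.05 * np.random.randn(t.size)
rms = velocity_rms(acc)
print(f"velocity RMS = {rms:.2f} mm/s -> {'ALARM' if rms > ALARM_MM_S else 'OK'}")
```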
Abstract:
Forensic intelligence is a distinct dimension of forensic science. Forensic intelligence processes have mostly been developed to address either a specific type of trace or a specific problem. Even though these empirical developments have led to successes, they are trace-specific in nature and contribute to the generation of silos which hamper the establishment of a more general and transversal model. Forensic intelligence has opened up important perspectives, but more general developments are required to address persistent challenges. This will ensure the progress of the discipline as well as its widespread implementation in the future. This paper demonstrates that the description of forensic intelligence processes, their architectures, and the methods for building them can, at a certain level, be abstracted from the type of traces considered. A comparative analysis is made between two forensic intelligence approaches developed independently in Australia and in Europe regarding the monitoring of apparently very different kinds of problems: illicit drugs and false identity documents. An inductive effort is pursued to identify similarities and to outline a general model. Besides breaking barriers between apparently separate fields of study in forensic science and intelligence, this transversal model would assist in defining forensic intelligence, its role and place in policing, and in identifying its contributions and limitations. The model will facilitate the paradigm shift from the current case-by-case reactive attitude towards a proactive approach by serving as a guideline for the use of forensic case data in an intelligence-led perspective. A follow-up article will specifically address issues related to comparison processes, decision points and organisational issues regarding forensic intelligence (part II).
Abstract:
The direct torque control (DTC) has become an accepted vector control method beside the current vector control. The DTC was first applied to asynchronous machines, and has later been applied also to synchronous machines. This thesis analyses the application of the DTC to permanent magnet synchronous machines (PMSM). In order to take full advantage of the DTC, the PMSM has to be properly dimensioned. Therefore the effect of the motor parameters is analysed taking the control principle into account. Based on the analysis, a parameter selection procedure is presented. The analysis and the selection procedure utilize nonlinear optimization methods. The key element of a direct torque controlled drive is the estimation of the stator flux linkage. Different estimation methods - a combination of current and voltage models and improved integration methods - are analysed. The effect of an incorrectly measured rotor angle in the current model is analysed, and an error detection and compensation method is presented. The dynamic performance of a previously presented sensorless flux estimation method is improved by enhancing the dynamic performance of the low-pass filter used and by adapting the correction of the flux linkage to torque changes. A method for the estimation of the initial angle of the rotor is presented. The method is based on measuring the inductance of the machine in several directions and fitting the measurements into a model. The model is nonlinear with respect to the rotor angle and therefore a nonlinear least squares optimization method is needed in the procedure. A commonly used current vector control scheme is the minimum current control. In the DTC the stator flux linkage reference is usually kept constant. Achieving the minimum current requires the control of the reference. An on-line method to perform the minimization of the current by controlling the stator flux linkage reference is presented. Also, the control of the reference above the base speed is considered. A new flux linkage estimate is introduced for the estimation of the parameters of the machine model. In order to utilize the flux linkage estimates in off-line parameter estimation, the integration methods are improved. An adaptive correction is used in the same way as in the estimation of the controller stator flux linkage. The presented parameter estimation methods are then used in a self-commissioning scheme. The proposed methods are tested with a laboratory drive, which consists of commercial inverter hardware with modified software and several prototype PMSMs.
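For orientation only, the sketch below shows the generic voltage-model form of a stator flux linkage estimator in stator (alpha/beta) coordinates, with a first-order "leaky" integrator standing in for the improved integration methods discussed above and the electromagnetic torque computed from the flux and current estimates. It is not the estimator of the thesis; all parameter values are assumptions.

```python
# Generic voltage-model stator flux estimator sketch (illustrative values only).
import numpy as np

R_S = 0.05             # stator resistance [ohm] (assumed)
P_P = 3                # number of pole pairs (assumed)
TS = 100e-6            # sampling period [s] (assumed)
W_C = 2 * np.pi * 2.0  # leaky-integrator cut-off frequency [rad/s] (assumed)

def flux_step(psi, u, i):
    """One estimation step; psi, u, i are complex alpha + j*beta quantities."""
    # Backward-Euler form of d(psi)/dt = u - R_S*i - W_C*psi: the leakage term
    # W_C*psi suppresses the drift of a pure integrator caused by offsets.
    return (psi + TS * (u - R_S * i)) / (1.0 + TS * W_C)

def torque(psi, i):
    """Electromagnetic torque T = 1.5*p*(psi_a*i_b - psi_b*i_a)."""
    return 1.5 * P_P * (psi.real * i.imag - psi.imag * i.real)

# Example: a few steps with constant voltage and current vectors.
psi_s = 0.0 + 0.0j
u_s, i_s = 10.0 + 5.0j, 2.0 + 1.0j
for _ in range(5):
    psi_s = flux_step(psi_s, u_s, i_s)
print(psi_s, torque(psi_s, i_s))
```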
Abstract:
Eosinophilic fasciitis is a rare condition. It is generally limited to the distal parts of the arms and legs. MRI is the ideal imaging modality for diagnosing and monitoring this condition. MRI findings typically show only fascial involvement, but signal abnormalities may occasionally also be observed in the neighboring muscle tissue and hypodermic fat. Differential diagnosis of eosinophilic fasciitis by MRI requires the exclusion of several other superficial and deep soft tissue disorders.
Abstract:
BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a University hospital. INCLUSION CRITERIA: to receive intravenous micronutrient supplements and/or to have a TE sampling during the ICU stay. The TE samplings were triggered by a computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods, 636 patients out of 2406 consecutive admissions met the inclusion criteria, representing 29.7% and 24.9%, respectively, of the periods' admissions. The 2009 patients had higher SAPS2 scores (p = 0.02) and lower BMI compared to 2006 (p = 0.007). The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in a 55% cost reduction. CONCLUSIONS: The monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This approach leads to a cost reduction compared to an automated sampling prescription.
Abstract:
OBJECTIVE: To develop disease-specific recommendations for the diagnosis and management of eosinophilic granulomatosis with polyangiitis (Churg-Strauss syndrome) (EGPA). METHODS: The EGPA Consensus Task Force experts comprised 8 pulmonologists, 6 internists, 4 rheumatologists, 3 nephrologists, 1 pathologist and 1 allergist from 5 European countries and the USA. Using a modified Delphi process, a list of 40 questions was drawn up by 2 members and sent to all participants prior to the meeting. Concurrently, an extensive literature search was undertaken, with publications assigned a level of evidence according to accepted criteria. Drafts of the recommendations were circulated for review to all members until final consensus was reached. RESULTS: Twenty-two recommendations concerning the diagnosis, initial evaluation, treatment and monitoring of EGPA patients were established. The relevant published information on EGPA, antineutrophil-cytoplasm antibody-associated vasculitides, hypereosinophilic syndromes and eosinophilic asthma supporting these recommendations was also reviewed. DISCUSSION: These recommendations aim to give physicians tools for effective and individualised management of EGPA patients, and to provide guidance for further targeted research.
Abstract:
Background: The DNA repair protein O6-Methylguanine-DNA methyltransferase (MGMT) confers resistance to alkylating agents. Several methods have been applied to its analysis, with methylation-specific polymerase chain reaction (MSP) the most commonly used for promoter methylation study, while immunohistochemistry (IHC) has become the most frequently used for the detection of MGMT protein expression. Agreement on the best and most reliable technique for evaluating MGMT status remains unsettled. The aim of this study was to perform a systematic review and meta-analysis of the correlation between IHC and MSP. Methods: A computer-aided search of MEDLINE (1950-October 2009), EBSCO (1966-October 2009) and EMBASE (1974-October 2009) was performed for relevant publications. Studies meeting the inclusion criteria were those comparing MGMT protein expression by IHC with MGMT promoter methylation by MSP in the same cohort of patients. Methodological quality was assessed by using the QUADAS and STARD instruments. Previously published guidelines were followed for meta-analysis performance. Results: Of 254 studies identified as eligible for full-text review, 52 (20.5%) met the inclusion criteria. The review showed that results of MGMT protein expression by IHC are not in close agreement with those obtained with MSP. Moreover, type of tumour (primary brain tumour vs others) was an independent covariate of accuracy estimates in the meta-regression analysis beyond the cut-off value. Conclusions: Protein expression assessed by IHC alone fails to reflect the promoter methylation status of MGMT. Thus, in attempts at clinical diagnosis the two methods seem to select different groups of patients and should not be used interchangeably.
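For readers unfamiliar with how agreement between two such dichotomised assays is quantified, the sketch below computes Cohen's kappa between MSP (promoter methylated / unmethylated) and IHC (protein lost / retained) calls on the same cases. The call vectors are invented for illustration and are not data from this review.

```python
# Cohen's kappa between two binary raters: kappa = (p_o - p_e) / (1 - p_e).
def cohens_kappa(a, b):
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    p_a1, p_b1 = sum(a) / n, sum(b) / n           # rate of "positive" calls
    p_e = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)   # agreement expected by chance
    return (p_o - p_e) / (1 - p_e)

msp = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]   # 1 = promoter methylated (hypothetical)
ihc = [1, 0, 0, 1, 1, 0, 0, 0, 1, 1]   # 1 = MGMT protein lost   (hypothetical)
print(f"kappa = {cohens_kappa(msp, ihc):.2f}")   # values near 0: poor agreement
```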
Abstract:
Rosin is a natural product from pine forests and it is used as a raw material in resinate syntheses. Resinates are polyvalent metal salts of rosin acids, and especially Ca- and Ca/Mg-resinates find wide application in the printing ink industry. In this thesis, analytical methods were applied to increase general knowledge of resinate chemistry, and the reaction kinetics was studied in order to model the non-linear solution viscosity increase during resinate syntheses by the fusion method. Solution viscosity in toluene is an important quality factor for resinates to be used in printing inks. The concept of critical resinate concentration, c_crit, was introduced to define an abrupt change in the dependence of viscosity on resinate concentration in the solution. The concept was then used to explain the non-linear solution viscosity increase during resinate syntheses. A semi-empirical model with two estimated parameters was derived for the viscosity increase on the basis of apparent reaction kinetics. The model was used to control the viscosity and to predict the total reaction time of the resinate process. The kinetic data from the complex reaction media were obtained by acid value titration and by FTIR spectroscopic analyses using a conventional calibration method to measure the resinate concentration and the concentration of free rosin acids. A multivariate calibration method was successfully applied to build partial least squares (PLS) models for monitoring acid value and solution viscosity in both the mid-infrared (MIR) and near-infrared (NIR) regions during the syntheses. The calibration models can be used for on-line resinate process monitoring. In the kinetic studies, two main reaction steps were observed during the syntheses. First, a fast irreversible resination reaction occurs at 235 °C, and then a slow thermal decarboxylation of rosin acids starts to take place at 265 °C. Rosin oil is formed during the decarboxylation reaction step, causing significant mass loss as the rosin oil evaporates from the system while the viscosity increases to the target level. The mass balance of the syntheses was determined based on the resinate concentration increase during the decarboxylation reaction step. A mechanistic study of the decarboxylation reaction was based on the observation that resinate molecules are partly solvated by rosin acids during the syntheses. Different decarboxylation mechanisms were proposed for the free and solvating rosin acids. The deduced kinetic model supported the analytical data of the syntheses over a wide resinate concentration region, a wide range of viscosity values and different reaction temperatures. In addition, the application of the kinetic model to the modified resinate syntheses gave a good fit. A novel synthesis method with the addition of decarboxylated rosin (i.e. rosin oil) to the reaction mixture was introduced. The conversion of rosin acid to resinate was increased to the level necessary to obtain the target viscosity for the product at 235 °C. Due to a lower reaction temperature than in the traditional fusion synthesis at 265 °C, thermal decarboxylation is avoided. As a consequence, the mass yield of the resinate syntheses can be increased from ca. 70% to almost 100% by recycling the added rosin oil.
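As an illustration of the multivariate calibration step, the following Python sketch fits a PLS regression model that predicts a quality variable such as acid value from spectra. The spectra and reference values here are synthetic stand-ins, not the MIR/NIR data of the thesis.

```python
# Hedged sketch of PLS calibration of spectra against a reference quantity.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 60, 400

# Synthetic "spectra": a band whose intensity tracks the acid value, plus noise.
acid_value = rng.uniform(20, 160, n_samples)             # reference titrations
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 12) ** 2)
spectra = np.outer(acid_value, band) + rng.normal(0, 0.5, (n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(spectra, acid_value,
                                                    random_state=0)
pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
print(f"R^2 on held-out spectra: {pls.score(X_test, y_test):.3f}")
```

In an on-line setting, the fitted model would simply be applied to each newly acquired spectrum to track the acid value (and, analogously, the solution viscosity) during the synthesis.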
Abstract:
Aims: This study was carried out to evaluate the feasibility of two different methods to determine free flap perfusion in cancer patients undergoing major reconstructive surgery. The hypothesis was that low perfusion in the flap is associated with flap complications. Patients and methods: Between August 2002 and June 2008 at the Department of Otorhinolaryngology – Head and Neck Surgery, Department of Surgery, and at the PET Centre, Turku, 30 consecutive patients with 32 free flaps were included in this study. The perfusion of the free microvascular flaps was assessed with positron emission tomography (PET) and radioactive water ([15O]H2O) in 40 radiowater injections in 33 PET studies. Furthermore, 24 free flaps were monitored with continuous tissue oxygen measurement using flexible polarographic catheters for an average of three postoperative days. Results: Of the 17 patients operated on for head and neck (HN) cancer and reconstructed with 18 free flaps, three re-operations were carried out due to poor tissue oxygenation as indicated by ptiO2 monitoring results, and three other patients were reoperated on for postoperative hematomas in the operated area. Blood perfusion assessed with PET (BFPET) was above 2.0 mL/min/100 g in all flaps, and a low flap-to-muscle BFPET ratio appeared to correlate with poor survival of the flap. Survival in this group of HN cancer patients was 9.0 months (median, range 2.4-34.2) after a median follow-up of 11.9 months (range 1.0-61.0 months). Seven HN patients of this group are alive without any sign of recurrence and one patient has died of other causes. All of the 13 breast reconstruction patients included in the study are alive and free of disease at a median follow-up time of 27.4 months (range 13.9-35.7 months). Re-explorations were carried out in three patients due to data provided by ptiO2 monitoring, and one re-exploration was avoided on the basis of adequate blood perfusion assessed with PET. Two patients had donor-site morbidity and 3 patients had partial flap necrosis or fat necrosis. There were no total flap losses. Conclusions: PtiO2 monitoring is a feasible method of free flap monitoring when flap temperature is monitored and maintained close to the core temperature. When other monitoring methods give conflicting results or are unavailable, the [15O]H2O PET technique is feasible for evaluating the perfusion of newly reconstructed free flaps.
Abstract:
Fluid handling systems such as pump and fan systems are found to have a significant potential for energy efficiency improvements. To realise this energy saving potential, easily implementable methods are needed to monitor the system output, because this information is needed both to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable speed drives have proven able to give information on the system output without additional metering; however, the current model-based methods may not be usable or sufficiently accurate in the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
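As background, one of the known operating point estimation principles that such combined methods build on is the QP-curve method: the drive's shaft power and speed estimates are mapped to flow through the pump's published power curve, scaled to the current speed with the affinity laws. The sketch below illustrates only this idea, with assumed curve coefficients that are not from the paper.

```python
# Minimal sketch of the QP-curve estimation principle (illustrative values).
import numpy as np

N_NOM = 1450.0               # nominal rotational speed [rpm] (assumed)
A, B, C = -0.002, 0.35, 5.0  # power curve P(Q) = A*Q^2 + B*Q + C [kW], Q in l/s (assumed)

def flow_from_power(p_shaft_kw: float, n_rpm: float) -> float:
    """Estimate flow rate [l/s] from shaft power and rotational speed."""
    k = n_rpm / N_NOM
    p_nom = p_shaft_kw / k**3            # refer measured power to nominal speed
    roots = np.roots([A, B, C - p_nom])  # solve P(Q_nom) = p_nom
    q_vertex = -B / (2 * A)              # top of the power parabola
    # Keep the root on the rising part of the curve (typical for radial pumps).
    q_nom = min(r.real for r in roots
                if abs(r.imag) < 1e-9 and 0.0 <= r.real <= q_vertex)
    return k * q_nom                     # scale the flow back to the actual speed

print(f"Estimated flow: {flow_from_power(12.0, 1350.0):.1f} l/s")
```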
Abstract:
BK polyomavirus (BKPyV) is a causal agent of nephropathy, ureteral stenosis and hemorrhagic cystitis in kidney transplant recipients, and is considered an important emerging disease in transplantation. Regular screening for BKPyV reactivation mainly during the first 2 years posttransplant, with subsequent pre-emptive reduction of immunosuppression is considered the best option to avoid disease progression, since successful clearance or reduction of viremia is achieved in the vast majority of patients within 6 months. The use of drugs with antiviral properties for patients with persistent viremia has been attempted despite unclear benefits. Clinical manifestations of BKPyV nephropathy, current strategies for diagnosis and monitoring of BKPyV infection, management of immunosuppressive regimen after detection of BKPyV reactivation and the use of antiviral drugs are discussed in this review.
Abstract:
The general aim of the thesis was to study university students’ learning from the perspective of regulation of learning and text processing. The data were collected from the two academic disciplines of medical and teacher education, which share the features of highly scheduled study, a multidisciplinary character, a complex relationship between theory and practice and a professional nature. Contemporary information society poses new challenges for learning, as it is not possible to learn all the information needed in a profession during a study programme. Therefore, it is increasingly important to learn how to think and learn independently, how to recognise gaps in and update one’s knowledge and how to deal with the huge amount of constantly changing information. In other words, it is critical to regulate one’s learning and to process text effectively. The thesis comprises five sub-studies that employed cross-sectional, longitudinal and experimental designs and multiple methods, from surveys to eye tracking. Study I examined the connections between students’ study orientations and the ways they regulate their learning. In total, 410 second-, fourth- and sixth-year medical students from two Finnish medical schools participated in the study by completing a questionnaire measuring both general study orientations and regulation strategies. The students were generally deeply oriented towards their studies. However, they regulated their studying externally. Several interesting and theoretically reasonable connections between the variables were found. For instance, self-regulation was positively correlated with deep orientation and achievement orientation and was negatively correlated with non-commitment. However, external regulation was likewise positively correlated with deep orientation and achievement orientation but also with surface orientation and systematic orientation. It is argued that external regulation might function as an effective coping strategy in the cognitively loaded medical curriculum. Study II focused on medical students’ regulation of learning and their conceptions of the learning environment in an innovative medical course where traditional lectures were combined with problem-based learning (PBL) group work. First-year medical and dental students (N = 153) completed a questionnaire assessing their regulation strategies of learning and views about the PBL group work. The results indicated that external regulation and self-regulation of the learning content were the most typical regulation strategies among the participants. In line with previous studies, self-regulation was connected with study success. Strictly organised PBL sessions were not considered as useful as lectures, although the students’ views of the teacher/tutor and the group were mainly positive. Therefore, developers of teaching methods are challenged to think of new solutions that facilitate reflection on one’s learning and that improve the development of self-regulation. In Study III, a person-centred approach to studying regulation strategies was employed, in contrast to the traditional variable-centred approach used in Study I and Study II. The aim of Study III was to identify different regulation strategy profiles among medical students (N = 162) across time and to examine to what extent these profiles predict study success in preclinical studies. Four regulation strategy profiles were identified, and connections with study success were found.
Students with the lowest self-regulation and with an increasing lack of regulation performed worse than the other groups. As the person-centred approach makes it possible to identify individual students with diverse regulation patterns, it could be used in supporting student learning and in facilitating the early diagnosis of learning difficulties. In Study IV, 91 student teachers participated in a pre-test/post-test design where they answered open-ended questions about a complex science concept both before and after reading either a traditional, expository science text or a refutational text that prompted the reader to change his/her beliefs according to scientific beliefs about the phenomenon. The student teachers completed a questionnaire concerning their regulation and processing strategies. The results showed that the students’ understanding improved after the text-reading intervention and that the refutational text promoted understanding better than the traditional text. Additionally, regulation and processing strategies were found to be connected with understanding the science phenomenon. A weak trend showed that weaker learners would benefit more from the refutational text. It seems that learners with effective learning strategies are able to pick out the relevant content regardless of the text type, whereas weaker learners might benefit from refutational parts that contrast the most typical misconceptions with scientific views. The purpose of Study V was to use eye tracking to determine how third-year medical students (n = 39) and internal medicine residents (n = 13) read and solve patient case texts. The results revealed differences between medical students and residents in processing patient case texts; compared to the students, the residents were more accurate in their diagnoses and processed the texts significantly faster and with a lower number of fixations. Different reading patterns were also found. The observed differences between medical students and residents in processing patient case texts could be used in medical education to model expert reasoning and to teach how a good medical text should be constructed. The main findings of the thesis indicate that even among very selected student populations, such as high-achieving medical students or student teachers, there seems to be a lot of variation in regulation strategies of learning and text processing. As these learning strategies are related to successful studying, students enter educational programmes with rather different chances of managing and achieving success. Further, the ways of engaging in learning seldom centre on a single strategy or approach; rather, students seem to combine several strategies to a certain degree. Sometimes, it can be a matter of perspective which way of learning is considered best; therefore, the reality of studying in higher education is often more complicated than the simplistic view of self-regulation as a good quality and external regulation as a harmful quality. The beginning of university studies may be stressful for many, as the gap between high school and university studies is huge and those strategies that were adequate during high school might not work as well in higher education. Therefore, it is important to map students’ learning strategies and to encourage them to engage in using high-quality learning strategies from the beginning. Instead of separate courses on learning skills, the integration of these skills into course contents should be considered.
Furthermore, learning complex scientific phenomena could be facilitated by paying attention to high-quality learning materials and texts and to other support from the learning environment, also at the university level. Eye tracking seems to have great potential in evaluating performance and growing diagnostic expertise in text processing, although more research using texts as stimuli is needed. Both medical and teacher education programmes, and the professions themselves, are challenging in terms of their multidisciplinary nature and the increasing amount of information, and therefore require good lifelong learning skills during the study period and later in working life.
Abstract:
Currently, laser scribing is a growing material processing method in industry. The benefits of laser scribing technology are being studied, for example, for improving the efficiency of solar cells. Due to the high quality requirements of the fast scribing process, it is important to monitor the process in real time to detect possible defects during processing. However, there is a lack of studies on real-time monitoring of laser scribing. Commonly used monitoring methods developed for other laser processes, such as laser welding, are comparatively slow, and existing applications cannot be implemented in fast laser scribing monitoring. The aim of this thesis is to find a method for laser scribing monitoring with a high-speed camera and to evaluate the reliability and performance of the developed monitoring system with experiments. The laser used in the experiments is an IPG ytterbium pulsed fiber laser with a 20 W maximum average power, and the scan head optics used with the laser is Scanlab's Hurryscan 14 II with an f100 telecentric lens. The camera was connected to the laser scanner using a camera adapter to follow the laser process. A powerful, fully programmable industrial computer was chosen for executing image processing and analysis. Algorithms for defect analysis, which are based on particle analysis, were developed using LabVIEW system design software. The performance of the algorithms was analyzed on a non-moving image of the scribing line with a resolution of 960x20 pixels. As a result, the maximum analysis speed was 560 frames per second. The reliability of the algorithm was evaluated by imaging a scribing path with a variable number of defects at 2000 mm/s while the laser was turned off; the image analysis speed was 430 frames per second. The experiment was successful, and as a result the algorithms detected all defects on the scribing path. The final monitoring experiment was performed during a laser process. However, it was challenging to make the active laser illumination work with the laser scanner due to the physical dimensions of the laser lens and the scanner. For reliable defect detection, the illumination system needs to be replaced.
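The thesis algorithms were implemented in LabVIEW; purely as an illustration of the particle-analysis idea, the following Python/OpenCV sketch thresholds one 960x20 frame of the scribe line and reports gaps between the detected line segments as defects. The threshold choice, the minimum gap size and the synthetic frame are assumptions, not the thesis parameters.

```python
# Illustrative particle-analysis defect check on one camera frame of the line.
import cv2
import numpy as np

MIN_GAP_PX = 5  # smallest gap (in pixels along the line) reported as a defect

def find_gaps(frame: np.ndarray) -> list:
    """Return (start column, width) of gaps between bright scribe segments."""
    # Otsu threshold separates the bright scribe from the dark background.
    _, scribe = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, _, stats, _ = cv2.connectedComponentsWithStats(scribe, connectivity=8)
    # Sort scribe segments from left to right (label 0 is the background).
    segments = sorted((int(stats[i][0]), int(stats[i][0] + stats[i][2]))
                      for i in range(1, n))
    gaps = []
    for (_, right_edge), (next_left, _) in zip(segments, segments[1:]):
        if next_left - right_edge >= MIN_GAP_PX:
            gaps.append((right_edge, next_left - right_edge))
    return gaps

# Example: a synthetic 20x960 frame with one 30-pixel break in the scribe line.
frame = np.zeros((20, 960), np.uint8)
frame[8:13, :] = 220          # the scribed line
frame[8:13, 400:430] = 0      # an artificial defect (gap)
print(find_gaps(frame))       # -> [(400, 30)]
```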
Abstract:
Image processing has been a challenging and multidisciplinary research area for decades, with continuing improvements in its various branches, especially Medical Imaging. The healthcare industry has benefited greatly from advances in Image Processing techniques for the efficient management of large volumes of clinical data. The popularity and growth of the Image Processing field attract researchers from many disciplines, including Computer Science and Medical Science, due to its applicability to the real world. In the meantime, Computer Science is becoming an important driving force for the further development of the Medical Sciences. The objective of this study is to make use of the basic concepts in Medical Image Processing and to develop methods and tools for clinicians’ assistance. This work is motivated by clinical applications of digital mammograms and placental sonograms, and uses real medical images to propose a method intended to assist radiologists in the diagnostic process. The study covers two domains of pattern recognition: classification and content-based retrieval. Mammogram images of breast cancer patients and placental images are used for this study. Cancer is a disaster to the human race. Accuracy in characterizing images using simplified, user-friendly Computer Aided Diagnosis techniques helps radiologists detect cancers at an early stage. Breast cancer, which is the major cause of cancer death in women, can be fully cured if detected at an early stage. Studies relating to placental characteristics and abnormalities are important in foetal monitoring. The diagnostic variability in sonographic examination of the placenta can be overcome by detailed placental texture analysis focusing on placental grading. The work aims at early breast cancer detection and placental maturity analysis. This dissertation is a stepping stone in combining various application domains of healthcare and technology.
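One common building block in such mammogram classification and placental texture-grading pipelines is a grey-level co-occurrence matrix (GLCM) feature extractor whose descriptors feed a classifier or a content-based retrieval index. The sketch below is a generic illustration on a synthetic patch; it is not the specific method developed in the dissertation.

```python
# Hedged sketch: GLCM texture descriptors for an 8-bit image patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch: np.ndarray) -> dict:
    """Return a small set of GLCM texture descriptors for an 8-bit patch."""
    glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    # Average each property over the distance/angle combinations.
    return {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

# Example on a synthetic textured patch; real patches would be cropped from
# the mammogram or placental sonogram regions of interest.
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, (64, 64)).astype(np.uint8)
print(glcm_features(patch))
```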