948 results for: parallel robots, cable driven, underactuated, calibration, sensitivity, accuracy


Relevance: 30.00%

Abstract:

Contemporary coronary magnetic resonance angiography techniques suffer from signal-to-noise ratio (SNR) constraints. We propose a method to enhance SNR in gradient echo coronary magnetic resonance angiography by using sensitivity encoding (SENSE). While the use of sensitivity encoding to improve SNR seems counterintuitive, it can be exploited by reducing the number of radiofrequency excitations during the acquisition window while lowering the signal readout bandwidth, thereby improving the radiofrequency receive-to-transmit duty cycle. Under certain conditions, this leads to improved SNR. The use of sensitivity encoding for improved SNR in three-dimensional coronary magnetic resonance angiography is investigated using numerical simulations and an in vitro and an in vivo study. A maximum 55% SNR enhancement for coronary magnetic resonance angiography was found both in vitro and in vivo, in good agreement with the numerical simulations. This method is most suitable for spoiled gradient echo coronary magnetic resonance angiography in which high temporal and spatial resolution are required.

Relevance: 30.00%

Abstract:

The variety of fuels used in power boilers is widening, and new boiler constructions and operating models have to be developed. This research and development is done in small pilot plants, where a faster analysis of the boiler mass and heat balance is needed so that the right decisions can be made already during the test run. The obstacle to determining the boiler balance during test runs is the long process of chemical analysis of the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig to obtain the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä, Tampere. The calculation was implemented on the data management computer of the pilot plant's automation system. The calculation is made in the Microsoft Excel environment, which provides a good basis and built-in functions for handling large databases and calculations without any delicate programming. The automation system of the pilot plant was rebuilt and updated by Metso Automation Oy during 2001, and the new MetsoDNA system has good data management properties, which are necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found: either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were implemented in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system. A sensitivity analysis showed that the most essential values for accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output, and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, on measurement accuracy, and on boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the work environment.
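As an illustration only (not the thesis's actual Excel calculation): the heat-balance route mentioned above can be sketched as a simple energy balance, where the fuel mass flow follows from the measured heat output, an assumed overall efficiency lumping unburned carbon and other losses, and the fuel's lower heating value. All numbers below are hypothetical.

```python
def fuel_mass_flow(heat_output_kw, lhv_kj_per_kg, efficiency):
    """Estimate fuel mass flow (kg/s) from a simple boiler heat balance.

    heat_output_kw : measured useful heat output of the boiler [kW]
    lhv_kj_per_kg  : lower heating value of the fuel [kJ/kg]
    efficiency     : fraction of fuel energy converted to useful heat (0-1),
                     lumping unburned carbon loss and other losses together
    """
    fuel_energy_input_kw = heat_output_kw / efficiency   # fuel energy needed [kW]
    return fuel_energy_input_kw / lhv_kj_per_kg          # mass flow [kg/s]

# Hypothetical example for a 4 MW pilot boiler burning a biomass fuel
m_fuel = fuel_mass_flow(heat_output_kw=4000, lhv_kj_per_kg=18000, efficiency=0.88)
print(f"Estimated fuel flow: {m_fuel:.3f} kg/s")  # ~0.25 kg/s
```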

Relevance: 30.00%

Abstract:

Objective: To assess the diagnostic accuracy of different anthropometric markers in defining low aerobic fitness among adolescents. Methods: cross-sectional study on 2,331 boys and 2,366 girls aged 10 - 18 years. Body mass index (BMI) was measured using standardized methods; body fat (BF) was assessed by bioelectrical impedance. Low aerobic fitness was assessed by the 20-meter shuttle run using the FITNESSGRAM® criteria. Waist was measured in a subsample of 1,933 boys and 1,897 girls. Overweight, obesity and excess fat were defined according to the International Obesity Task Force (IOTF) or FITNESSGRAM® criteria. Results: 38.5% of boys and 46.5% of girls were considered as unfit according to the FITNESSGRAM® criteria. In boys, the area under the ROC curve (AUC) and 95% confidence interval were 66.7 (64.1 - 69.3), 67.1 (64.5 - 69.6) and 64.6 (61.9 - 67.2) for BMI, BF and waist, respectively (P<0.02). In girls, the values were 68.3 (65.9 - 70.8), 63.8 (61.3 - 66.3) and 65.9 (63.4 - 68.4), respectively (P<0.001). In boys, the sensitivity and specificity to diagnose low fitness were 13% and 99% for obesity (IOTF); 38% and 86% for overweight + obesity (IOTF); 28% and 94% for obesity (FITNESSGRAM®) and 42% and 81% for excess fat (FITNESSGRAM®). For girls, the values were 9% and 99% for obesity (IOTF); 33% and 82% for overweight + obesity (IOTF); 22% and 94% for obesity (FITNESSGRAM®) and 26% and 90% for excess fat (FITNESSGRAM®). Conclusions: BMI, not body fat or waist, should be used to define low aerobic fitness. The IOTF BMI cut-points to define obesity have a very low screening capacity and should not be used.
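As an illustration only (the data below are randomly generated placeholders, not the study's measurements): the area under the ROC curve used to compare markers here can be computed as a Mann-Whitney probability, i.e. the chance that a randomly chosen "unfit" subject has a higher marker value than a randomly chosen fit one.

```python
import numpy as np

def auc_mann_whitney(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the probability
    that a randomly chosen positive case scores higher than a random negative
    case (ties counted as 0.5)."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Placeholder data: BMI values for 'low fitness' (1) vs 'fit' (0) adolescents
rng = np.random.default_rng(0)
bmi = np.concatenate([rng.normal(23, 4, 200), rng.normal(20, 3, 300)])
unfit = np.concatenate([np.ones(200), np.zeros(300)])
print(f"AUC for BMI as a screen for low fitness: {auc_mann_whitney(bmi, unfit):.2f}")
```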

Relevance: 30.00%

Abstract:

Software engineering is criticized as not being engineering or a 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes. In this thesis only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method, and it is relatively easy to use and learn. Effort estimation accuracy has improved significantly after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement, for which the author has developed a three-level solution. All currently used size metrics are static in nature, but this new proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design work proceeds, and thus 'grows up' along with the software project. Estimation model development is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed, and it is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity; an example is shown to shed more light on the calibration and the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
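As a generic illustration only (this is not the thesis's hierarchical model): calibrating a size-to-effort relationship from project history is often done with a simple power-law model fitted in log space. The project data below are hypothetical.

```python
import numpy as np

# Hypothetical project history: functional size (e.g. function points) vs effort (person-hours)
size   = np.array([120, 250, 310, 480, 700, 950])
effort = np.array([900, 1700, 2300, 3400, 5200, 6800])

# Calibrate a power-law model  effort = a * size**b  by linear regression in log space
b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
a = np.exp(log_a)

def estimate_effort(new_size):
    """Predict effort (person-hours) for a project of the given functional size."""
    return a * new_size ** b

print(f"Calibrated model: effort = {a:.1f} * size^{b:.2f}")
print(f"Estimate for a 400-FP project: {estimate_effort(400):.0f} person-hours")
```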

Relevance: 30.00%

Abstract:

Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second-tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC TRA, the process category (PROC) is the most important factor, and a failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in a single modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment, since it accounts for ∼75% of the total exposure range, corresponding to 20-22 orders of magnitude in the exposure estimate. Our results indicate that there is a trade-off between the accuracy and precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
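As a generic illustration of local (one-at-a-time) sensitivity analysis only: the toy multiplicative exposure model and factor ranges below are hypothetical and do not represent TRA, Stoffenmanager or ART; the sketch simply shows how the span of the output attributable to each input decision can be quantified.

```python
import numpy as np

def exposure(factors):
    """Toy multiplicative exposure model: the estimate is the product of
    modifying factors (hypothetical, not an actual REACH tool)."""
    return np.prod(list(factors.values()))

baseline = {"emission_potential": 0.3, "local_controls": 0.1,
            "dilution": 2.0, "duration": 0.5}

# Plausible decision range for each factor (hypothetical values)
ranges = {"emission_potential": (0.1, 1.0), "local_controls": (0.03, 0.3),
          "dilution": (0.7, 4.0), "duration": (0.25, 1.0)}

# One-at-a-time local sensitivity: perturb each factor over its range and
# record the span of the exposure estimate it induces.
for name, (lo, hi) in ranges.items():
    low = exposure({**baseline, name: lo})
    high = exposure({**baseline, name: hi})
    span = np.log10(high / low)  # orders of magnitude attributable to this factor
    print(f"{name:20s} contributes {span:.2f} orders of magnitude")
```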

Relevance: 30.00%

Abstract:

The objective of this work was to develop a levelling system for the launch platform of the ITO 2005 Asrad-R missile system. Lifting legs were designed for each corner of the system container; the legs can be turned as well as raised and lowered hydraulically. The system also included a hydraulically operated weapon platform with separate locking for its upper and lower positions. The most important required function was levelling the system container with respect to gravity on an uneven base, both in the terrain and on the bed of a truck. In addition, a lifting function was needed by which the container is raised and lowered so that it retains its initial angle. It was also possible to operate each lifting-leg cylinder individually. The system had demanding environmental requirements, because it was designed as military equipment for operation in both peacetime and crisis conditions. Sufficient degrees of freedom for operating on the vehicle's subframe were obtained from the compliance of the rotational movement of the lifting legs and from the feet sliding laterally in the subframe mountings. In the terrain the lifting legs were completely rigid and the sliding took place between the feet and the ground plates. On/off solenoid valves were used in the hydraulic system, because the movements were slow and the required accuracy and speed were easily achieved. Key design criteria were lightness of the structures, simplicity of the system, cost, and reliability. A well-functioning solution was achieved with a simple and inexpensive system.

Relevance: 30.00%

Abstract:

OBJECTIVES: Immunohistochemistry (IHC) has become a promising method for pre-screening ALK rearrangements in non-small cell lung carcinomas (NSCLC). Various ALK antibodies, detection systems and automated immunostainers are available. We therefore aimed to compare the performance of the monoclonal 5A4 (Novocastra, Leica) and D5F3 (Cell Signaling, Ventana) antibodies using two different immunostainers. Additionally, we analyzed the accuracy of prospective ALK IHC testing in routine diagnostics. MATERIALS AND METHODS: Seventy-two NSCLC with available ALK FISH results and enriched for FISH-positive carcinomas were retrospectively analyzed. IHC was performed on the BenchMark XT (Ventana) using 5A4 and D5F3, respectively, and additionally with 5A4 on the BOND-MAX (Leica). Data from our routine diagnostics on prospective ALK testing with parallel IHC (using 5A4) and FISH were available for 303 NSCLC. RESULTS: All three IHC protocols showed congruent results. Only 1/25 FISH-positive NSCLC (4%) was false negative by IHC. For all three IHC protocols the sensitivity, specificity, positive (PPV) and negative predictive values (NPV) compared with FISH were 96%, 100%, 100% and 97.8%, respectively. In the prospective cohort 3/32 FISH-positive (9.4%) and 2/271 FISH-negative (0.7%) NSCLC were false negative and false positive by IHC, respectively. In routine diagnostics the sensitivity, specificity, PPV and NPV of IHC compared with FISH were 90.6%, 99.3%, 93.5% and 98.9%, respectively. CONCLUSIONS: 5A4 and D5F3 are equally well suited for detecting ALK-rearranged NSCLC. The BenchMark XT and BOND-MAX immunostainers can both be used for IHC with 5A4. True discrepancies between IHC and FISH results do exist and need to be addressed when implementing IHC in an ALK testing algorithm.
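The prospective-cohort figures above can be reproduced from the reported counts with the standard 2x2 calculations (FISH taken as the reference standard); the short sketch below only restates that arithmetic.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 contingency table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "NPV": tn / (tn + fn),
    }

# Prospective cohort from the abstract: 32 FISH-positive cases (3 missed by IHC)
# and 271 FISH-negative cases (2 called positive by IHC)
metrics = diagnostic_metrics(tp=29, fp=2, fn=3, tn=269)
for name, value in metrics.items():
    print(f"{name}: {100 * value:.1f}%")  # 90.6%, 99.3%, 93.5%, 98.9%
```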

Relevance: 30.00%

Abstract:

The objective of this thesis was to examine the factors affecting the forecasting accuracy of innovation diffusion models. A logistic model was used to forecast the diffusion of mobile phone subscriptions in three European countries: Finland, France and Greece. The theoretical part focused on forecasting the diffusion of innovations with diffusion models, with particular emphasis on the predictive power of the models and their usability in different situations. The empirical part concentrated on forecasting with a logistic diffusion model calibrated with time series aggregated in different ways; the resulting forecasts were examined to determine the effects of the level of data aggregation. The research design was empirical and involved studying the forecasting accuracy of the logistic diffusion model while varying the aggregation level of the sample data. The data fed into the diffusion model can be collected monthly and per operator without affecting the forecasting accuracy, but the data must include the inflection point of the diffusion curve, i.e. the point of long-term peak demand.
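As an illustration only (synthetic data, not the thesis's subscription series): fitting a logistic diffusion curve to a cumulative-adoption time series can be sketched as below; forecasts typically degrade if the calibration series ends before the inflection point is reached.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, m, p, t0):
    """Cumulative adopters at time t: saturation level m, growth rate p,
    inflection point t0 (the point of peak demand growth)."""
    return m / (1.0 + np.exp(-p * (t - t0)))

# Synthetic monthly subscription data
t = np.arange(0, 60)
true = logistic(t, m=5.0e6, p=0.15, t0=30)
observed = true + np.random.default_rng(1).normal(0, 5e4, t.size)

# Calibrate the logistic diffusion model on the observed series
(m_hat, p_hat, t0_hat), _ = curve_fit(
    logistic, t, observed, p0=[observed[-1] * 2, 0.1, t.mean()])
print(f"Estimated saturation {m_hat:.3e}, inflection at month {t0_hat:.1f}")
```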

Relevance: 30.00%

Abstract:

PURPOSE: The aim of this study was to systematically compare a comprehensive array of magnetic resonance (MR) imaging features in terms of their sensitivity and specificity for diagnosing cervical lymph node metastases in patients with thyroid cancer. MATERIALS AND METHODS: The study included 41 patients with thyroid malignancy who underwent surgical excision of cervical lymph nodes and had preoperative MR imaging ≤ 4 weeks prior to surgery. Three head and neck neuroradiologists independently evaluated all the MR images. Using the pathology results as the reference, the sensitivity, specificity and interobserver agreement of each MR imaging characteristic were calculated. RESULTS: On multivariate analysis, no single imaging feature was significantly correlated with metastasis. In general, imaging features demonstrated high specificity, but poor sensitivity and at best moderate interobserver agreement. CONCLUSIONS: Commonly used MR imaging features have limited sensitivity for correctly identifying cervical lymph node metastases in patients with thyroid cancer. A negative neck MR scan should not dissuade a surgeon from performing a neck dissection in patients with thyroid carcinomas.

Relevance: 30.00%

Abstract:

(Matrix-assisted) laser desorption/ionization ((MA)LDI) mass spectrometry imaging (MSI) has been driven by remarkable technological developments in the last couple of years. Although molecular information on a wide range of molecules, including peptides, lipids, metabolites, and xenobiotics, can be mapped, (MA)LDI MSI only detects the most abundant soluble molecules in the cells and, consequently, does not provide access to the least expressed species, which can be very informative in disease research. Within a short period of time, numerous protocols and concepts have been developed and introduced to increase MSI sensitivity, including in situ tissue chemistry and solvent-free matrix depositions. In this chapter, we discuss some of the latest developments in the field of high-sensitivity MSI using solvent-free matrix depositions and detail protocols for two methods capable of enriching the molecular MSI signal, as demonstrated within our laboratory.

Relevance: 30.00%

Abstract:

This thesis briefly presents the basic operation and use of centrifugal pumps and of parallel pumping applications. The characteristics of parallel pumping applications are compared with electric circuits in order to find an analogy between these technical fields. The purpose of studying circuit theory is to find out whether common software tools for solving circuit performance could be used to observe parallel pumping applications. The empirical part of the thesis introduces a simulation environment for parallel pumping systems based on the circuit components of Matlab Simulink software. The created simulation environment enables the observation of variable-speed-controlled parallel pumping systems under different control methods. The simulation environment was evaluated by building a simulation model of an actual parallel pumping system at Lappeenranta University of Technology, and the simulated performance of the parallel pumps was compared with measured values from the actual system. The gathered information shows that, if the initial data on the system and pump performance are adequate, the circuit-based simulation environment can be exploited to observe parallel pumping systems. The introduced simulation environment can represent the actual operation of parallel pumps with reasonable accuracy, so the circuit-based simulation can be used as a research tool for developing new control methods for parallel pumps.
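As a minimal numerical sketch of the underlying idea only (not the thesis's Simulink model): two identical centrifugal pumps in parallel add their flows at equal head, and the operating point is where the combined pump curve meets the system curve. The pump and system coefficients below are hypothetical.

```python
from scipy.optimize import brentq

# Hypothetical quadratic pump curve H(Q) = H0 - k*Q**2 (head in m, flow in m^3/s)
H0, k = 40.0, 900.0
def pump_head(q):                # head of one pump at flow q
    return H0 - k * q**2

# Hypothetical system curve H_sys(Q) = H_static + r*Q**2
H_static, r = 10.0, 400.0
def system_head(q_total):
    return H_static + r * q_total**2

def operating_point(n_pumps):
    """Total flow where n identical parallel pumps (flows add at equal head)
    balance the system curve."""
    balance = lambda q_total: pump_head(q_total / n_pumps) - system_head(q_total)
    return brentq(balance, 1e-6, 1.0)

for n in (1, 2):
    q = operating_point(n)
    print(f"{n} pump(s): Q = {q:.3f} m^3/s, H = {system_head(q):.1f} m")
```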

Relevance: 30.00%

Abstract:

Objective: The authors sought to study the calibration of a clinical PKA meter (Diamentor E2) and a calibrator for clinical meters (PDC) at the Laboratory of Ionizing Radiation Metrology at Instituto de Energia e Ambiente - Universidade de São Paulo. Materials and Methods: Different qualities of both incident and transmitted beams were used in conditions similar to a clinical setting, analyzing the influence of the reference dosimeter, the distance between meters, the filtration and the average beam energy. Calibrations were performed directly against a standard 30 cm³ cylindrical chamber or a parallel-plate monitor chamber, and indirectly against the PDC meter. Results: The lowest energy dependence was observed for transmitted beams. The cross-calibration between the Diamentor E2 and the PDC, and the calibration of the PDC itself, presented the greatest propagation of uncertainties. Conclusion: The calibration coefficient of the PDC meter proved to be more stable with voltage, while the Diamentor E2 calibration coefficient was more variable. On the other hand, the PDC meter presented greater uncertainty in readings (5.0%) than the monitor chamber (3.5%) used as a reference.
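For illustration only (the values below are not the study's data): the basic calculation behind such a calibration is the ratio of the reference-instrument value to the reading of the device under test, with the relative uncertainties of the two inputs combined in quadrature.

```python
import math

def calibration_coefficient(ref_value, ref_u_rel, device_reading, device_u_rel):
    """Calibration coefficient N = reference value / device reading, with the
    combined relative standard uncertainty of the two inputs (in quadrature)."""
    n = ref_value / device_reading
    u_rel = math.sqrt(ref_u_rel**2 + device_u_rel**2)
    return n, u_rel

# Illustrative numbers: a reference PKA value in Gy*cm^2 against the reading
# of the meter under calibration, each with a relative standard uncertainty.
n, u = calibration_coefficient(ref_value=1.05, ref_u_rel=0.035,
                               device_reading=1.00, device_u_rel=0.050)
print(f"N = {n:.3f} +/- {100 * u:.1f}% (k=1)")
```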

Relevance: 30.00%

Abstract:

Objective: To evaluate the accuracy of computed tomography for local and lymph node staging of Wilms' tumor. Materials and Methods: Each case of Wilms' tumor was evaluated by a radiologist for the presence of abdominal lymph nodes. Signs of capsular and adjacent organ invasion were analyzed. Surgical and histopathological results were taken as the gold standard. Results: Sensitivity was 100% for the detection of both mesenteric and retroperitoneal lymph nodes, and specificity was, respectively, 12% and 33%, with positive predictive values of 8% and 11% and negative predictive values of 100%. Signs of capsular invasion presented sensitivity of 87%, specificity of 77%, positive predictive value of 63% and negative predictive value of 93%. Signs of adjacent organ invasion presented sensitivity of 100%, specificity of 78%, positive predictive value of 37% and negative predictive value of 100%. Conclusion: Computed tomography showed low specificity and low positive predictive value in the detection of lymph node dissemination. The absence of detectable lymph nodes makes nodal involvement unlikely, and the same applies to the evaluation of the local extent of the tumor.

Relevance: 30.00%

Abstract:

Objective: To compare the accuracy of computer-aided ultrasound (US) and magnetic resonance imaging (MRI), by means of hepatorenal gradient analysis, in the evaluation of nonalcoholic fatty liver disease (NAFLD) in adolescents. Materials and Methods: This prospective, cross-sectional study evaluated 50 adolescents (aged 11–17 years), including 24 obese and 26 eutrophic individuals. All adolescents underwent computer-aided US, MRI, laboratory tests, and anthropometric evaluation. Sensitivity, specificity, positive and negative predictive values and accuracy were evaluated for both imaging methods, with subsequent generation of the receiver operating characteristic (ROC) curve and calculation of the area under the ROC curve to determine the most appropriate cutoff point for the hepatorenal gradient in order to predict the degree of steatosis, using the MRI results as the gold standard. Results: The obese group included 29.2% girls and 70.8% boys, and the eutrophic group 69.2% girls and 30.8% boys. The prevalence of NAFLD was 19.2% in the eutrophic group and 83% in the obese group. The ROC curve generated for the hepatorenal gradient with a cutoff point of 13 presented 100% sensitivity and 100% specificity. When the same cutoff point was applied to the eutrophic group, false-positive results were observed in 9.5% of cases (90.5% specificity) and false-negative results in 0% (100% sensitivity). Conclusion: Computer-aided US with hepatorenal gradient calculation is a simple and noninvasive technique for the semiquantitative evaluation of hepatic echogenicity and could be useful in the follow-up of adolescents with NAFLD, in population screening for this disease, and in clinical studies.

Relevance: 30.00%

Abstract:

Nowadays robotics is one of the most important pillars of industry, and a piece of great news for engineers concerns robot sales: in 2013 about 179,000 industrial robots were sold worldwide, again an all-time high and 12% more than in 2012, according to data from the IFR (International Federation of Robotics). Alongside this, collaborative robotics comes into play when robots and human beings must share the workplace without humans being displaced by the machines; the aim is therefore for robots to improve the quality of work by taking over the dangerous, tedious and dirty jobs that are not feasible or safe for humans. Another very important and directly related concept, very much in vogue and heard of only relatively recently, is the 'Factory of the Future', which seeks to have operators and robots work in harmony in the working environment and to have robots regarded as collaborative rather than substitutive machinery; it is considered one of the great production niches in full expansion. Leaving aside these technical concepts, which should never be forgotten if one's professional career is focused on this industrial field, the central topic of this project is, as could not be otherwise, robotics, which, combined with computer vision, has produced a robotic manipulator endowed with a certain 'intelligence'. A simple but realistic production process has been devised that is able to store parts of different shapes and colours autonomously, guided only by the image captured with a webcam integrated into the equipment. The system consists of a support structure bounding a workspace in which purpose-designed parts are placed and must be stored in their corresponding locations by the robotic manipulator. This parallel-kinematics manipulator is based on cable technology and is driven by four motors that give it three degrees of freedom (±X, ±Y, ±Z); the end-effector is suspended over the workspace and moves so that it can identify the position, colour and shape of the parts in order to store them in an orderly fashion according to a set of initial premises.
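As an illustration only (the frame dimensions and anchor coordinates below are hypothetical, not taken from the project): the basic geometric relation behind such a cable-suspended parallel manipulator is its inverse kinematics, where each required cable length is simply the distance from that cable's anchor point to the end-effector position.

```python
import numpy as np

# Hypothetical anchor (pulley) positions of a 4-cable suspended manipulator,
# at the top corners of a 1 m x 1 m x 1 m frame (coordinates in metres).
anchors = np.array([[0.0, 0.0, 1.0],
                    [1.0, 0.0, 1.0],
                    [1.0, 1.0, 1.0],
                    [0.0, 1.0, 1.0]])

def cable_lengths(effector_pos):
    """Inverse kinematics of a cable-suspended robot: the required length of
    each cable is the Euclidean distance from its anchor to the end-effector
    (modelled as a point, i.e. neglecting pulley/effector geometry and cable sag)."""
    p = np.asarray(effector_pos, float)
    return np.linalg.norm(anchors - p, axis=1)

# Example: move the effector to the centre of the workspace, 0.4 m above the floor
print(cable_lengths([0.5, 0.5, 0.4]))  # four cable lengths in metres
```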