Abstract:
The 6-minute walk test (6MWT) is a simple field test widely used in clinical settings to assess functional exercise capacity. However, studies with healthy subjects are scarce. We hypothesized that the 6MWT might be useful for assessing exercise capacity in healthy subjects. The purpose of this study was to evaluate 6MWT intensity in middle-aged and older adults, as well as to develop a simple equation to predict oxygen uptake (V̇O2) from the 6-min walk distance (6MWD). Eighty-six participants, 40 men and 46 women, 40-74 years of age and with a mean body mass index of 28±6 kg/m², performed the 6MWT according to American Thoracic Society guidelines. Physiological responses were evaluated during the 6MWT using a K4b2 Cosmed telemetry gas analyzer. On a different occasion, the subjects performed ramp-protocol cardiopulmonary exercise testing (CPET) on a treadmill. Peak V̇O2 in the 6MWT corresponded to 78±13% of the peak V̇O2 during CPET, and the maximum heart rate corresponded to 80±23% of that obtained in CPET. Peak V̇O2 in CPET was adequately predicted from the 6MWD by a linear regression equation: V̇O2 (mL·min⁻¹·kg⁻¹) = -2.863 + (0.0563 × 6MWD in m) (R²=0.76). The 6MWT represents a moderate-to-high-intensity activity in middle-aged and older adults and proved useful for predicting cardiorespiratory fitness in the present study. Our results suggest that the 6MWT may also be useful in asymptomatic individuals, and its use in walk-based conditioning programs should be encouraged.
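The reported regression is a simple linear function of walk distance, so it is easy to apply directly. The sketch below uses the coefficients given in the abstract; the function name and the example distance of 500 m are illustrative choices, not from the study.

```python
def predict_vo2_peak(six_mwd_m: float) -> float:
    """Predict peak VO2 (mL·min^-1·kg^-1) from the 6-min walk distance in
    metres, using the regression from the abstract (R^2 = 0.76)."""
    return -2.863 + 0.0563 * six_mwd_m

# A hypothetical subject walking 500 m in 6 minutes:
print(round(predict_vo2_peak(500), 2))  # → 25.29
```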
Abstract:
We investigated the diagnostic value of the apparent diffusion coefficient (ADC) and fractional anisotropy (FA) of magnetic resonance diffusion tensor imaging (DTI) in patients with spinal cord compression (SCC) using a meta-analysis framework. Multiple scientific literature databases were exhaustively searched to identify articles relevant to this study. Mean values and standardized mean differences (SMDs) were calculated for the ADC and FA in normal and diseased tissues. STATA version 12.0 software was used for statistical analysis. Of the 41 articles initially retrieved through database searches, 11 case-control studies were eligible for the meta-analysis and contained a combined total of 645 human subjects (394 patients with SCC and 251 healthy controls). All 11 studies reported data on FA, and 9 contained data related to the ADC. The combined SMDs of the ADC and FA showed that the ADC was significantly higher and the FA was lower in patients with SCC than in healthy controls. Subgroup analysis based on the b value showed higher ADCs in patients with SCC than in healthy controls at b values of both ≤500 and >500 s/mm². In summary, the main findings of this meta-analysis revealed an increased ADC and decreased FA in patients with SCC, indicating that DTI is an important diagnostic imaging tool to assess patients suspected to have SCC.
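The standardized mean difference pooled in this kind of meta-analysis is, in its simplest form, the difference of group means divided by the pooled standard deviation. A minimal sketch of that calculation follows; the function name and the example numbers are illustrative, not values from the included studies.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) between two groups,
    using the pooled standard deviation of the two samples."""
    pooled_sd = math.sqrt(
        ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    )
    return (mean1 - mean2) / pooled_sd

# Hypothetical ADC means for an SCC group vs. healthy controls:
print(round(cohens_d(1.2, 0.2, 30, 1.0, 0.2, 30), 2))  # → 1.0
```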
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault-tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace. This leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to only demonstrate that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability, rather than the output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or absent tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
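The idea of performance testing described above, running many test sequences concurrently and judging the system by its responsiveness rather than only its output, can be sketched as follows. The system-under-test stub, thread count, and latency bound are illustrative assumptions, not details from the thesis.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def system_under_test(x):
    # Stand-in for a real system call; sleeps briefly to simulate work.
    time.sleep(0.01)
    return x * 2

def timed_call(x):
    start = time.perf_counter()
    result = system_under_test(x)
    return result, time.perf_counter() - start

# Execute many test sequences at once, recording per-call latency.
with ThreadPoolExecutor(max_workers=8) as pool:
    outcomes = list(pool.map(timed_call, range(40)))

results = [r for r, _ in outcomes]
latencies = [t for _, t in outcomes]

assert results == [x * 2 for x in range(40)]   # functional check: outputs correct
assert max(latencies) < 1.0                    # responsiveness check under load
```

The point of the sketch is the dual check at the end: the same run verifies both the output (functional) and the latency under concurrent load (non-functional).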
Abstract:
Point-of-care (POC) diagnostics is a field with a rapidly growing market share. As these applications become more widely used, there is increasing pressure to improve their performance to match that of central laboratory tests. Lanthanide luminescence has been widely utilized in diagnostics because of the numerous advantages gained by the use of time-resolved or anti-Stokes detection. So far, the use of lanthanide labels in POC has been scarce due to limitations set by the instrumentation required for their detection and the shortcomings, e.g. low brightness, of these labels. Along with advances in the research of lanthanide luminescence and in the field of semiconductors, these materials are becoming a feasible alternative for signal generation in future POC assays. The aim of this thesis was to explore ways of utilizing time-resolved or anti-Stokes detection in POC applications. The long-lived fluorescence for time-resolved measurement can be produced with lanthanide chelates. The ultraviolet (UV) excitation required by these chelates is cumbersome to produce with POC-compatible fluorescence readers. In this thesis the use of a novel light-harvesting ligand was studied. This molecule can be used to excite Eu(III) ions at wavelengths extending up to the visible part of the spectrum. An enhancement solution based on this ligand showed good performance in a proof-of-concept bioaffinity assay and produced a bright signal upon 365 nm excitation thanks to the high molar absorptivity of the chelate. These features are crucial when developing miniaturized readers for the time-resolved detection of fluorescence. Upconverting phosphors (UCPs) were studied as an internal light source in glucose-sensing dry chemistry test strips, and ways of utilizing their various emission wavelengths and near-infrared excitation were explored.
The use of nanosized NaYF4:Yb3+,Tm3+ particles enabled the replacement of an external UV light source with a NIR laser and gave an additional degree of freedom in the optical setup of the detector instrument. The new method enabled a blood glucose measurement with results comparable to the current standard method of measuring reflectance. Microsized visible-emitting UCPs were used in a similar manner, but with a broadly absorbing indicator compound filtering the excitation and emission wavelengths of the UCP. This approach resulted in a novel way of benefitting from the non-linear relationship between the excitation power and emission intensity of the UCPs, and enabled the amplification of the signal response from the indicator dye.
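The amplification mentioned above follows from the non-linear power dependence of upconversion: emission scales roughly as I ∝ Pⁿ with n ≈ 2 for a two-photon process. A minimal numeric sketch of that effect is shown below; the power-law model, exponent, and the 30% absorption figure are illustrative assumptions, not values from the thesis.

```python
def ucp_emission(power, n=2.0, k=1.0):
    """Upconversion emission intensity modelled as I = k * P**n;
    n ≈ 2 is assumed for a two-photon upconverting phosphor."""
    return k * power ** n

# If an indicator dye absorbs 30% of the excitation (transmittance 0.7),
# a quadratic UCP emission drops by 1 - 0.7**2 = 51%, i.e. the 30%
# absorption change is amplified into a 51% signal change.
print(round(1 - ucp_emission(0.7) / ucp_emission(1.0), 2))  # → 0.51
```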
Abstract:
The subject of this thesis is the automatic compression of Finnish sentences with machine learning, such that the compressed sentences remain grammatical and retain their essential meaning. There are multiple possible uses for the compression of natural-language sentences; this thesis focuses on the generation of television programme subtitles, which are often compressed versions of the original script of the programme. The main part of the thesis consists of machine learning experiments on automatic sentence compression using different approaches to the problem. The machine learning methods used are linear-chain conditional random fields (CRFs) and support vector machines. We also examine which automatic text analysis methods provide useful features for the task. The data used for machine learning, supplied by Lingsoft Inc., consists of subtitles in both compressed and uncompressed form. The models are compared to a baseline system drawn from the literature, and comparisons are made both automatically and through human evaluation, because of the potentially subjective nature of the output. The best result is achieved using a CRF sequence classifier with a rich feature set. All of the text analysis methods examined help classification, and the most useful is morphological analysis.
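Sentence compression with a linear-chain CRF is typically framed as token-level keep/drop labelling driven by per-token feature dictionaries. The sketch below shows what such a feature function might look like; the feature set is illustrative and not the thesis's actual (morphology-rich, Finnish-specific) features.

```python
def token_features(tokens, i):
    """Per-token features of the kind fed to a linear-chain CRF for
    keep/drop sentence compression (illustrative feature set only)."""
    word = tokens[i]
    return {
        "word.lower": word.lower(),
        "is_title": word.istitle(),
        "prev": tokens[i - 1].lower() if i > 0 else "<s>",
        "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
    }

feats = token_features(["The", "result", "was", "very", "good"], 3)
print(feats["prev"], feats["next"])  # → was good
```

In practice each token in a subtitle line would get such a dictionary, and the CRF would learn which feature patterns predict that a token can be dropped without losing meaning.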
Effect of wheat flour protein variations on sensory attributes, texture and staling of Taftoon bread
Abstract:
The quality of flat breads depends in part on their textural properties during storage. These properties are largely affected by flour protein quality and quantity. The present study aimed to examine differences in the sensory properties, texture, and staling of Taftoon breads made from flours of different protein quality and quantity. This was implemented by using three flours with protein contents of 9.4, 11.5, and 13.5% and different protein qualities, indicated by Zeleny sedimentation volumes of 16.25, 22.75, and 23.25 mL, respectively. Bread strips were submitted to uniaxial compression between two parallel plates on an Instron universal testing machine, and the firmness of the breads was determined. Results indicated differences in the sensory attributes of breads produced from flours of different protein content and quality, demonstrating that high-protein, high-quality flours are not able to sheet and expand under the high-temperature, short-time conditions employed in Taftoon bread production and are therefore not suitable for this kind of bread. Results showed that the flour with 11.5% protein content produced bread with better sensory characteristics and an acceptable storage time.
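In a uniaxial compression test of this kind, firmness is commonly read off the force-displacement curve, for example as the force at a fixed compression depth. The sketch below interpolates that value from recorded data; the function, target depth, and numbers are hypothetical, since the abstract does not give the study's exact firmness definition.

```python
def firmness(forces_n, displacements_mm, target_mm=5.0):
    """Firmness as the force (N) at a target compression depth, read from
    a force-displacement curve by linear interpolation (hypothetical
    analysis; not necessarily the study's definition)."""
    points = list(zip(displacements_mm, forces_n))
    for (d0, f0), (d1, f1) in zip(points, points[1:]):
        if d0 <= target_mm <= d1:
            return f0 + (f1 - f0) * (target_mm - d0) / (d1 - d0)
    raise ValueError("target depth outside measured range")

# Hypothetical force readings (N) at 0, 3, and 9 mm compression:
print(firmness([0.0, 2.0, 8.0], [0, 3, 9]))  # → 4.0
```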
Abstract:
Today, user experience and usability in software applications are becoming a major design issue due to the adaptation of many processes to new technologies. Therefore, the study of user experience and usability should be included in every software development project, and both should be tested to obtain traceable results. Given the variety of testing methods available to evaluate these concepts, a non-expert on the topic may have doubts about which option to choose and how to interpret the outcomes of the process. This work aims to create a process that eases the whole testing methodology, based on the process created by Seffah et al., together with a supporting software tool to follow the procedure of these testing methods for user experience and usability.