84 results for acceptance testing


Relevance:

20.00%

Publisher:

Abstract:

Alzheimer’s disease (AD) is the most common form of dementia. Characteristic changes in an AD brain are the formation of β-amyloid protein (Aβ) plaques and neurofibrillary tangles, though other alterations in the brain have also been connected to AD. No cure is available for AD, and it is one of the leading causes of death among the elderly in developed countries. Liposomes are biocompatible and biodegradable spherical phospholipid bilayer vesicles that can enclose various compounds. Several functional groups can be attached to the surface of liposomes in order to achieve long-circulating, target-specific liposomes, and liposomes can be utilized as drug carriers and as vehicles for imaging agents. Positron emission tomography (PET) is a non-invasive imaging method for studying biological processes in living organisms. In this study, various nucleophilic 18F-labeling synthesis approaches and leaving groups were developed for novel PET imaging tracers targeting AD pathology in the brain. The tracers were the thioflavin derivative [18F]flutemetamol, the curcumin derivative [18F]treg-curcumin, and functionalized [18F]nanoliposomes, all of which target Aβ in the AD brain. These tracers were evaluated using transgenic AD mouse models. In addition, an 18F-labeling synthesis was developed for a tracer targeting the S1P3 receptor. The chosen 18F-fluorination strategy affected the radiochemical yield and specific activity of the tracers. [18F]Treg-curcumin and the functionalized [18F]nanoliposomes had low uptake in AD mouse brain, whereas [18F]flutemetamol exhibited the appropriate properties for preclinical Aβ imaging. All of these tracers can be utilized in studies of the pathology and treatment of AD and related diseases.

Relevance:

20.00%

Publisher:

Abstract:

Life cycle assessment (LCA) is one of the most established quantitative tools for the environmental impact assessment of products. To support environmentally aware decision makers regarding the environmental impacts of biomass value chains, the scope of the LCA methodology needs to be augmented to cover land-use-related environmental impacts. This dissertation analyses and discusses potential impact assessment methods, conceptual models and environmental indicators that have been proposed for implementation into the LCA framework for the impacts of land use. The applicability of the proposed indicators and impact assessment frameworks is tested from the practitioner's perspective, focusing especially on forest biomass value chains. The impacts of land use on biodiversity, resource depletion, climate change and other ecosystem services are analysed and discussed, and the interplay between value choices in LCA modelling and the decision-making situations to be supported is critically examined. It was found that land use impact indicators are necessary in LCA for highlighting differences in impacts between distinct land use classes. However, many open questions remain about how certainly such indicators capture the actual impacts of land use, especially regarding the impacts of managed forest land use on biodiversity and on ecosystem services such as water regulation and purification. The climate impact of the energy use of boreal stemwood was found to be higher in the short term and lower in the long term than that of fossil fuels emitting an identical amount of CO2 in combustion, owing to the induced changes in forest carbon stocks. The climate impacts of the energy use of boreal stemwood were also found to be higher than previous estimates for forest residues and stumps suggest. Product lifetime was found to have a much greater influence on the climate impacts of wood-based value chains than whether the stemwood originates from thinnings or final fellings. Climate neutrality seems likely only when almost all the carbon of the harvested wood is stored in long-lived wooden products. In their current form, land use impacts cannot be modelled with a high degree of certainty, nor communicated with an adequate level of clarity to decision makers. Academia needs to keep improving the modelling framework and, more importantly, clearly communicate to decision makers the limited certainty as to whether land-use-intensive activities can help in meeting the strict mitigation targets we are facing globally.

Relevance:

20.00%

Publisher:

Abstract:

Cardiac troponins (cTn) I and T are the current gold standard biochemical markers in the diagnosis and risk stratification of patients with suspected acute coronary syndrome. During the past few years, novel assays capable of detecting cTn concentrations in more than 50% of apparently healthy individuals have become readily available. With the emergence of these high-sensitivity cTn assays, reductions in assay specificity have caused elevations in the measured cTn levels that do not correlate with the patient's clinical picture, and the increased assay sensitivity may reveal various analytical interference mechanisms. This doctoral thesis focused on developing nanoparticle-assisted immunometric assays that could be applied to an automated point-of-care system. The main objective was to develop minimally interference-prone assays for cTnI by employing recombinant antibody fragments. Fast 5- and 15-minute assays for cTnI and D-dimer, a degradation product of fibrin, based on intrinsically fluorescent nanoparticles were introduced, highlighting the versatility of nanoparticles as universally applicable labels. The utilization of antibody fragments in different versions of the developed cTnI assay made it possible to decrease the amounts of antibody used without sacrificing assay sensitivity. In addition, the utilization of recombinant antibody fragments was shown to significantly decrease the measured cTnI concentrations in an apparently healthy population, as well as in samples containing known amounts of potentially interfering factors: triglycerides, bilirubin, rheumatoid factors, or human anti-mouse antibodies. When the specificity of four commercially available antibodies for cTnI was determined, two of the four cross-reacted with skeletal troponin I but caused cross-reactivity issues in patient samples only when paired together. In conclusion, the results of this thesis emphasize the importance of careful antibody selection when developing cTnI assays. The results with different recombinant antibody fragments suggest that the utilization of antibody fragments should be strongly encouraged in the immunoassay field, especially with analytes such as cTnI that require highly sensitive assay approaches.

Relevance:

20.00%

Publisher:

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault-tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers today are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires a great deal of manual work: testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects such as performance or security apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or non-existent tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools: we offer standalone tools, tools integrated with other industry-leading tools, and complete tool-chains where necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. To demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
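To make the idea of generating tests from a behavioral model concrete, here is a minimal sketch. The toy state machine, event names, requirement IDs, and transition-coverage criterion are all illustrative assumptions for this example; the thesis itself works from UML models with industrial tool support, which this sketch does not reproduce.

```python
# Minimal sketch of model-based test generation from a behavioral model.
# The model, events, and requirement IDs below are hypothetical examples,
# not artifacts from the thesis.
from collections import deque

# Behavioral model: state -> list of (event, next_state, requirement_id).
MODEL = {
    "Idle":   [("insertCard", "CardIn", "REQ-1")],
    "CardIn": [("enterPin",   "Authed", "REQ-2"),
               ("eject",      "Idle",   "REQ-3")],
    "Authed": [("withdraw",   "Idle",   "REQ-4")],
}

def generate_tests(model, start="Idle", max_len=4):
    """Breadth-first traversal that emits an abstract test case whenever a
    new transition is covered, until every transition is covered once."""
    uncovered = {(s, e) for s, ts in model.items() for e, _, _ in ts}
    tests = []
    queue = deque([(start, [], [])])  # (state, events so far, traced reqs)
    while queue and uncovered:
        state, events, reqs = queue.popleft()
        if len(events) >= max_len:
            continue
        for event, nxt, req in model.get(state, []):
            path, traced = events + [event], reqs + [req]
            if (state, event) in uncovered:
                uncovered.discard((state, event))
                tests.append({"steps": path, "requirements": traced})
            queue.append((nxt, path, traced))
    return tests

for test in generate_tests(MODEL):
    print(test["steps"], "covers", test["requirements"])
```

Each generated event sequence is an abstract test case to be executed against the implementation, and carrying the requirement IDs along every path is one simple way of keeping requirements traceable from model to test, in the spirit the abstract describes.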

Relevance:

20.00%

Publisher:

Abstract:

Aim and design: To evaluate family-based health counseling for young children, and to study the significance of adding parental self-care or the training of professionals to the programs. The effectiveness and acceptability of the programs were evaluated by comparing two new programs with an earlier one. Subjects and methods: The study was carried out in Vantaa, which was divided into three study areas. The subjects consisted of children born in 2008, particularly firstborn children, while children born in 2006 formed the historical control. The first of the new programs emphasized oral hygiene and the use of fluoride, and the second program focused on a proper diet and the use of xylitol. The main outcome measure was mutans streptococci (MS) in the dental biofilm of two-year-olds, and the opinions of parents and dental professionals were evaluated using questionnaires. Results: The programs found wide acceptance among dental professionals. No group-related differences were found in the MS scores of the two-year-olds. However, with all groups combined, the father's advanced level of education and the child's proper use of xylitol were associated with negative MS scores. In the opinion of parents, the oral healthcare guidance at least somewhat met their expectations. Conclusions: The present findings suggest that providing training and support for professionals in health education is important. Adding parental self-care to supplement programs aimed at young children does not improve the program, although it may improve parental readiness to change their own health habits. Counseling for families might be best carried out through a routine patient-centered program.

Relevance:

20.00%

Publisher:

Abstract:

Point-of-care (POC) diagnostics is a field with a rapidly growing market share. As these applications become more widely used, there is increasing pressure to improve their performance to match that of central laboratory tests. Lanthanide luminescence has been widely utilized in diagnostics because of the numerous advantages gained by time-resolved or anti-Stokes detection. So far, the use of lanthanide labels in POC has been scarce due to the limitations set by the instrumentation required for their detection and the shortcomings, e.g. low brightness, of these labels. Along with advances in the research of lanthanide luminescence and in the field of semiconductors, these materials are becoming a feasible alternative for signal generation in future POC assays. The aim of this thesis was to explore ways of utilizing time-resolved or anti-Stokes detection in POC applications. The long-lived fluorescence for time-resolved measurement can be produced with lanthanide chelates. The ultraviolet (UV) excitation required by these chelates is cumbersome to produce with POC-compatible fluorescence readers. In this thesis, the use of a novel light-harvesting ligand was studied. This molecule can be used to excite Eu(III) ions at wavelengths extending up to the visible part of the spectrum. An enhancement solution based on this ligand showed good performance in a proof-of-concept bioaffinity assay and produced a bright signal upon 365 nm excitation thanks to the high molar absorptivity of the chelate. These features are crucial when developing miniaturized readers for the time-resolved detection of fluorescence. Upconverting phosphors (UCPs) were studied as an internal light source in glucose-sensing dry-chemistry test strips, and ways of utilizing their various emission wavelengths and near-infrared excitation were explored. The use of nanosized NaYF4:Yb3+,Tm3+ particles enabled the replacement of an external UV light source with a NIR laser and gave an additional degree of freedom in the optical setup of the detector instrument. The new method enabled a blood glucose measurement with results comparable to the current standard method of measuring reflectance. Microsized visible-emitting UCPs were used in a similar manner, but with a broadly absorbing indicator compound filtering the excitation and emission wavelengths of the UCP. This approach resulted in a novel way of benefiting from the non-linear relationship between the excitation power and emission intensity of the UCPs, and it enabled the amplification of the signal response from the indicator dye.
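The amplification mentioned in the last sentence follows from the non-linear power dependence of upconversion emission. As a rough illustration only: at low excitation power, an n-photon upconversion band scales approximately as I_em ∝ P^n, so any attenuation of the excitation by the indicator dye appears amplified in the emission. The exponent n = 2 in the sketch below is a typical value for two-photon upconversion, not a figure taken from the thesis, and the sketch covers only the excitation side, ignoring the additional filtering of the emission wavelength that the abstract also describes.

```python
# Back-of-the-envelope sketch of non-linear UCP signal amplification.
# Assumes the common low-power approximation I_em ∝ P**n with n ≈ 2 for a
# two-photon upconversion band; the exponent is an assumption, not a
# value reported in the thesis.

def emission(power: float, n: float = 2.0) -> float:
    """Relative upconversion emission for a given relative excitation power."""
    return power ** n

# An indicator dye transmitting 80 % of the excitation (T = 0.8) leaves
# T**n = 64 % of the emission: a 20 % change in transmission is read out
# as a 36 % change in signal.
T = 0.8
print(f"excitation transmitted: {T:.0%}, emission remaining: {emission(T):.0%}")
```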

Relevance:

20.00%

Publisher:

Abstract:

The importance of software testing has grown as software products have come to affect our everyday lives more and more, so the connection between companies' investments and quality assurance is evident. Organizations are investing increasingly in non-functional testing, such as security, performance, and usability testing. The purpose of this thesis is to study the current state of software testing in Finland, with the aim of renewing and improving the software testing course offering at the University of Turku to best meet the needs of companies. The thesis was carried out as a replication study. The main part of the survey consists of questions about software testing methods and tools used during the activities of the testing process, complemented by general questions about the companies and their software testing environments. The survey also covers the various testing levels and types used by the companies and the challenges encountered in testing. The thesis is grounded in testing process standards; software testing standards play a central role in this work, even though they have recently been the target of strong criticism, as doubts about the necessity of standards have arisen from changes in software development. This thesis presents the results on software testing practices, compared with the results of an earlier related survey (Lee, Kang, & Lee, 2011). Lack of time is found to be a major challenge in software testing. Agile software development has gained popularity in all of the respondents' companies. Testing methods and tools for test estimation, planning, and reporting are used very little, whereas the use of methods and tools for automated test execution and defect management has increased. System, acceptance, unit, and integration testing are in use in all of the companies the respondents represent. All respondents consider regression, exploratory, and non-functional testing to be important techniques.

Relevance:

20.00%

Publisher:

Abstract:

The study develops an approach for validating software functionality against work system needs in SMEs. The approach is constructed using a SaaS-based software product, a work collaboration service (WCS), and SMEs as the elements of study; the WCS's functionality is qualified against the collaboration needs that exist in operational and project work within SMEs. A constructivist approach and the case study method were selected because the nature of the study requires an in-depth study of the work collaboration service as well as a detailed study of the work systems within different enterprises. Four companies were selected, in which fourteen interviews were conducted to gather the data. The work systems method and framework are used as a central part of the approach to collect, analyze, and interpret each enterprise's work system model and the underlying collaboration needs in operational and project work. The functional model of the WCS and its functionality, in turn, were determined from functional model analysis, software testing, documentation, and meetings with the service vendor. The enterprise work system models and the WCS model are compared to reveal how work progression differs between the two and to make visible the unaddressed stages of work progression. The WCS functionality is compared to the work systems' collaboration needs to ascertain whether the service will satisfy the needs of the project and operational work under study; the unaddressed needs provide opportunities to improve the functionality of the service for better conformity to the needs of the enterprise and the work. The results revealed that the functional models differed in how operational and project work progressed within the stages: the WCS shared similar stages of work progression apart from the stages of identification and acceptance, and the progress and completion stages were only partially addressed. The conclusion is that the identified unaddressed needs, such as a single point of reference and SLA and OLA inclusion, should be implemented or improved within the WCS at the appropriate stages of work to achieve better compliance of the service with the needs of the enterprise and the work itself. The developed approach can hence be used to carry out similar analyses of the conformance of pre-built software functionality to work system needs within SMEs.

Relevance:

20.00%

Publisher:

Abstract:

Today, the user experience and usability of software applications are becoming a major design issue as many processes are adapted to new technologies. The study of user experience and usability should therefore be included in every software development project, and both should be tested to obtain traceable results. Faced with the different testing methods available to evaluate these concepts, a non-expert may be unsure which option to choose and how to interpret the outcomes of the process. This work aims to create a process that eases the whole testing methodology, based on the process created by Seffah et al., together with a supporting software tool for following the procedure of these testing methods for user experience and usability.