993 results for automated testing
Abstract:
Cloud computing is a practically relevant paradigm in computing today. Testing is one of the distinct areas where cloud computing can be applied. This study addressed the applicability of cloud computing for testing within organizational and strategic contexts. The study focused on issues related to the adoption, use and effects of cloud-based testing. The study applied empirical research methods. The data was collected through interviews with practitioners from 30 organizations and was analysed using the grounded theory method. The research process consisted of four phases. The first phase studied the definitions and perceptions related to cloud-based testing. The second phase observed cloud-based testing in real-life practice. The third phase analysed quality in the context of cloud application development. The fourth phase studied the applicability of cloud computing in the gaming industry. The results showed that cloud computing is relevant and applicable for testing and application development, as well as other areas, e.g., game development. The research identified the benefits, challenges, requirements and effects of cloud-based testing, and formulated a roadmap and strategy for adopting cloud-based testing. The study also explored quality issues in cloud application development. As a special case, the research included a study on the applicability of cloud computing in game development. The results can be used by companies to enhance their processes for managing cloud-based testing, to evaluate practical cloud-based testing work and to assess the appropriateness of cloud-based testing for specific testing needs.
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over a network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches that are capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used UML class diagrams and UML state machine diagrams with additional design constraints to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, so that they can be part of the Semantic Web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specification. Requirement traceability is also addressed in our validation approach, which lets us see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example gives a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
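As an illustration only (not the thesis tool's actual output), the following is a minimal sketch of what a generated skeleton for a stateful resource with method pre- and post-conditions might look like; the booking resource, its states and the condition checks are hypothetical.

```python
from enum import Enum, auto

# Illustrative skeleton of a stateful REST resource with method pre- and
# postconditions, in the spirit of the generated code described above.
# States, methods and conditions are hypothetical examples.

class BookingState(Enum):
    CREATED = auto()
    CONFIRMED = auto()
    CANCELLED = auto()

class BookingResource:
    def __init__(self):
        self.state = BookingState.CREATED

    def put_confirmation(self, payment_ok: bool):
        """PUT /booking/{id}/confirmation"""
        # Precondition: a booking can only be confirmed once, after payment.
        assert self.state is BookingState.CREATED and payment_ok, \
            "precondition violated: booking not confirmable"
        self._do_confirm()   # body to be filled in by the developer
        # Postcondition: the resource must now be in the CONFIRMED state.
        assert self.state is BookingState.CONFIRMED, \
            "postcondition violated: confirmation not applied"

    def _do_confirm(self):
        self.state = BookingState.CONFIRMED

booking = BookingResource()
booking.put_confirmation(payment_ok=True)
print(booking.state)   # BookingState.CONFIRMED
```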
Abstract:
Due to increasing construction and transportation costs and the difficulties associated with handling massive structural components or assemblies, there is growing financial pressure to reduce structural weight. Furthermore, advances in material technology coupled with continuing advances in design tools and techniques have encouraged engineers to vary and combine materials, offering new opportunities to reduce the weight of mechanical structures. These new lower-mass systems, however, are more susceptible to inherent imbalances, a weakness that can result in higher shock and harmonic resonances and thus in poor structural dynamic performance. The objective of this thesis is the modeling of layered sheet steel elements to accurately predict their dynamic performance. During the development of the layered sheet steel model, a numerical modeling approach, Finite Element Analysis and Experimental Modal Analysis are applied to build a modal model of the layered sheet steel elements. Furthermore, to gain a better understanding of the dynamic behavior of layered sheet steel, several binding methods have been studied to understand and demonstrate how the binding method affects the dynamic behavior of layered sheet steel elements compared to a single homogeneous steel plate. Based on the developed layered sheet steel model, the dynamic behavior of a lightweight wheel structure, to be used as the structure for the stator of an outer-rotor Direct-Drive Permanent Magnet Synchronous Generator designed for high-power wind turbines, is studied.
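As standard background (not stated in the abstract itself), the modal models obtained from Finite Element Analysis and Experimental Modal Analysis rest on the undamped free-vibration eigenvalue problem of the discretized structure:

```latex
% Undamped free-vibration eigenvalue problem of the FE-discretized structure:
% K is the stiffness matrix, M the mass matrix, omega_i and phi_i the i-th
% natural angular frequency and mode shape; f_i is the natural frequency in Hz.
\[
\left( \mathbf{K} - \omega_i^{2}\,\mathbf{M} \right)\boldsymbol{\phi}_i = \mathbf{0},
\qquad f_i = \frac{\omega_i}{2\pi}
\]
```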
Abstract:
Genetic counselling is a process in which the counsellee receives information and support concerning a genetic disease. This study examines the genetic counselling attached to genetic testing. Since genetic information is increasing alongside new testing technologies, and the situations faced at genetic clinics will therefore become more diverse, it is essential to assess what expectations are directed at genetic counselling. It is also important to examine how these expectations compare with current counselling practices. In this study, the expectations, frames and practices of genetic counselling in different contexts of genetic testing were examined from three perspectives. First, international guidelines covering genetic counselling were analysed to summarise what is expected from genetic counselling and to study how genetic information is framed. Second, national experts in European countries were asked about the regulations and practices of genetic counselling in their country. Finally, ten counsellees who had visited a genetic clinic were interviewed to analyse their expectations and experiences. The counsellees' perspective was also approached through a background review of previous studies on counsellees' experiences. On the basis of the study, there are basic elements that are expected to be covered in genetic counselling from all perspectives. However, the views concerning bioethics, genetic exceptionalism and psychosocial aspects vary depending on the perspective and on the individual situation. Since there are sometimes more differences than similarities between genetic tests, no universal recommendations for counselling can be applied. The practices of genetic counselling should be defined situationally, emphasising individual needs over the genes.
Abstract:
Cardiac troponin (cTn) I and T are the recommended biomarkers for the diagnosis and risk stratification of patients with suspected acute coronary syndrome (ACS), a major cause of cardiovascular death and disability worldwide. It has recently been demonstrated that cTn-specific autoantibodies (cTnAAb) can negatively interfere with cTnI detection by immunoassays to the extent that cTnAAb-positive patients may be falsely designated as cTnI-negative. The aim of this thesis was to develop and optimize immunoassays for the detection of both cTnI and cTnAAb, which would eventually enable exploring the clinical impact of these autoantibodies on cTnI testing and subsequent patient management. The extent of cTnAAb interference in different cTnI assay configurations and the molecular characteristics of cTnAAbs were investigated in publications I and II, respectively. The findings showed that the cTnI midfragment-targeting immunoassays used predominantly in clinical practice are affected by cTnAAb interference, which can be circumvented by using a novel 3+1-type assay design with three capture antibodies against the N-terminus, midfragment and C-terminus and one tracer antibody against the C-terminus. The use of this assay configuration was further supported by the epitope specificity study, which showed that although the midfragment is most commonly targeted by cTnAAbs, the interference essentially encompasses the whole molecule, and there may be remarkable individual variation at the affected sites. In publications III and IV, all the data obtained in the previous studies were utilized to develop an improved version of an existing cTnAAb assay and a sensitive cTnI assay free of this specific analytical interference. The results of the thesis showed that approximately one in ten patients with suspected ACS have detectable amounts of cTnAAbs in their circulation and that cTnAAbs can inhibit cTnI determination when targeted against the binding sites of the assay antibodies used in its immunological detection. In the light of these observations, the risk of clinical misclassification caused by the presence of cTnAAbs remains a valid and reasonable concern. Because the titers, affinities and epitope specificities of cTnAAbs and the concentration of endogenous cTnI determine the final effect of circulating cTnAAbs, appropriately sized studies on their clinical significance are warranted. The new cTnI and cTnAAb assays could serve as analytical tools for establishing the impact of cTnAAbs on cTnI testing and also for unraveling the etiology of cTn-related autoimmune responses.
Abstract:
Presentation at a seminar organized by the KDK usability working group: How do users' expectations challenge our metadata practices? 30 September 2014.
Abstract:
Seven selection indexes based on the phenotypic value of the individual and the mean performance of its family were assessed for their application in the breeding of self-pollinated plants. There is no clear superiority of one index over another, although some show one or more negative aspects, such as favoring the selection of a top-performing plant from an inferior family to the detriment of an excellent plant from a superior family.
Abstract:
In recent years, human activities have steadily increased greenhouse gas emissions in the atmosphere, which has a direct impact on global climate warming. Finland, as a European Union member, has developed a national structural plan to promote renewable energy generation, pursuing the aims of Directive 2009/28/EC, and has made the plan publicly available. Finland is on its way to enhancing national security of energy supply by increasing the diversity of its energy mix. There are several significant objectives to develop onshore and offshore wind energy generation in the country over the next few decades, alongside other renewable energy sources. To predict future changes, many scenario methods have been developed and adapted to the energy industry. This Master's thesis explored the fuzzy cognitive maps (FCM) approach to scenario development, which captures expert knowledge in a graphical form and uses these captures for testing and refining raw scenarios. The prospects of Finnish wind energy development up to the year 2030 were considered with the aid of the FCM technique. Five positive raw scenarios were developed, and three of them were tested against an integrated expert knowledge map using graphical simulation. As an outcome, the study identifies which of the preliminarily defined scenarios are robust, based on the simulation results. The thesis was conducted in such a way that the existing knowledge captured from the expert panel can be reused to test and deploy different sets of scenarios for Finnish wind energy development.
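For illustration only (the concepts and weights below are placeholders, not the expert map from the thesis), a minimal sketch of the standard FCM state update A(t+1) = f(A(t) + Wᵀ A(t)) with a sigmoid squashing function:

```python
import numpy as np

# Minimal fuzzy cognitive map (FCM) simulation sketch.
# Concepts and weights below are illustrative placeholders, not the
# expert map used in the thesis.
concepts = ["policy_support", "investment", "grid_capacity", "wind_capacity_2030"]

# W[i, j] = causal weight of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0, 0.7, 0.3, 0.0],
    [0.0, 0.0, 0.4, 0.6],
    [0.0, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0],
])

def sigmoid(x, lam=1.0):
    """Squashing function keeping activations in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * x))

def simulate(initial_state, steps=30, tol=1e-4):
    """Iterate A(t+1) = f(A(t) + W^T A(t)) until convergence."""
    a = np.asarray(initial_state, dtype=float)
    for _ in range(steps):
        a_next = sigmoid(a + W.T @ a)
        if np.max(np.abs(a_next - a)) < tol:
            return a_next
        a = a_next
    return a

# A raw scenario is encoded as an initial activation vector.
scenario = [0.8, 0.5, 0.4, 0.2]   # e.g., strong policy support
print(dict(zip(concepts, np.round(simulate(scenario), 3))))
```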
Virtual Testing of Active Magnetic Bearing Systems based on Design Guidelines given by the Standards
Abstract:
Active Magnetic Bearings offer many advantages that have brought new applications to industry. However, as with all new technology, active magnetic bearings also have downsides, one of which is the low level of standardization. This thesis mainly studies the ISO 14839 standard, and more specifically its system verification methods. These verification methods are applied in a practical test with an existing active magnetic bearing system. The system is simulated in Matlab using a rotor-bearing dynamics toolbox, but this study does not include the exact simulation code or direct algebraic calculations. However, this study demonstrates that standardized simulation methods can be applied to practical problems.
Abstract:
The aim of this study is to test the accrual-based model suggested by Dechow et al. (1995) in order to detect and compare earnings management practices in Finnish and French companies. The impact of the 2008 financial crisis on earnings management behavior in these countries is also tested by dividing the 2003-2012 time period into two sub-periods: pre-crisis (2003-2008) and post-crisis (2009-2012). The results support the idea that companies in both countries engage in significant earnings management practices. During the post-crisis period, companies in Finland show income-inflating practices, while in France the opposite tendency (income deflating) is observed during the same period. Results for the assumption that managers of companies with highly concentrated ownership engage in income-enhancing practices differ between the two countries: while in Finland managers try to show better performance for bonuses or other contractual compensation motivations, in France they avoid paying dividends or high taxes.
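For reference (not reproduced in the abstract), the accrual-based model of Dechow et al. (1995), commonly known as the modified Jones model, estimates nondiscretionary accruals and takes the residual as the discretionary (managed) component:

```latex
% Modified Jones model (Dechow et al., 1995).
% TA_t: total accruals; A_{t-1}: lagged total assets; \Delta REV_t: change in
% revenues; \Delta REC_t: change in receivables; PPE_t: gross property, plant
% and equipment; DA_t: discretionary accruals (the earnings management proxy).
\[
NDA_t = \alpha_1 \frac{1}{A_{t-1}}
      + \alpha_2 \frac{\Delta REV_t - \Delta REC_t}{A_{t-1}}
      + \alpha_3 \frac{PPE_t}{A_{t-1}},
\qquad
DA_t = \frac{TA_t}{A_{t-1}} - NDA_t
\]
```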
Abstract:
Rats infected with the helminth Capillaria hepatica regularly develop septal hepatic fibrosis that may progress to cirrhosis in a relatively short time. Because of these characteristics, this experimental model was selected for testing drugs with antifibrotic potential, such as pentoxifylline, gadolinium chloride and vitamin A. Hepatic fibrosis was qualitatively and quantitatively evaluated in liver samples obtained by partial hepatectomy and at autopsy. The material was subjected to histological, biochemical and morphometric methods. A statistically significant reduction of fibrosis was obtained with pentoxifylline when administered intraperitoneally rather than intravenously. Gadolinium chloride showed moderate activity when administered prophylactically (before fibrosis had started), but a poor effect when fibrosis was well advanced. No modification of fibrosis was seen after vitamin A administration. Hydroxyproline content correlated with the morphometric measurements. The model appears to be adequate, since few animals die of the infection, fibrosis develops regularly in all animals, and the effects of different antifibrotic drugs and administration protocols can be easily detected.
Abstract:
Alzheimer’s disease (AD) is the most common form of dementia. Characteristic changes in the AD brain are the formation of β-amyloid protein (Aβ) plaques and neurofibrillary tangles, though other alterations in the brain have also been connected to AD. No cure is available for AD, and it is one of the leading causes of death among the elderly in developed countries. Liposomes are biocompatible and biodegradable spherical phospholipid bilayer vesicles that can enclose various compounds. Several functional groups can be attached to the surface of liposomes in order to achieve long-circulating, target-specific liposomes. Liposomes can be utilized as drug carriers and as vehicles for imaging agents. Positron emission tomography (PET) is a non-invasive imaging method for studying biological processes in living organisms. In this study, various synthesis approaches and leaving groups for novel PET imaging tracers based on nucleophilic 18F-labeling synthesis were developed to target AD pathology in the brain. The tracers were the thioflavin derivative [18F]flutemetamol, the curcumin derivative [18F]treg-curcumin, and functionalized [18F]nanoliposomes, all of which target Aβ in the AD brain. These tracers were evaluated using transgenic AD mouse models. In addition, an 18F-labeling synthesis was developed for a tracer targeting the S1P3 receptor. The chosen 18F-fluorination strategy had an effect on the radiochemical yield and specific activity of the tracers. [18F]Treg-curcumin and the functionalized [18F]nanoliposomes had low uptake in AD mouse brain, whereas [18F]flutemetamol exhibited the appropriate properties for preclinical Aβ imaging. All of these tracers can be utilized in studies of the pathology and treatment of AD and related diseases.
Abstract:
Life cycle assessment (LCA) is one of the most established quantitative tools for the environmental impact assessment of products. To be able to support environmentally aware decision makers on the environmental impacts of biomass value chains, the scope of LCA methodology needs to be augmented to cover land-use-related environmental impacts. This dissertation focuses on analysing and discussing potential impact assessment methods, conceptual models and environmental indicators that have been proposed for implementation into the LCA framework for impacts of land use. The applicability of the proposed indicators and impact assessment frameworks is tested from the practitioners' perspective, focusing especially on forest biomass value chains. The impacts of land use on biodiversity, resource depletion, climate change and other ecosystem services are analysed and discussed, and the interplay between value choices in LCA modelling and the decision-making situations to be supported is critically discussed. It was found that land use impact indicators are necessary in LCA for highlighting differences in impacts between distinct land use classes. However, many open questions remain about the certainty of highlighting the actual impacts of land use, especially regarding the impacts of managed forest land use on biodiversity and on ecosystem services such as water regulation and purification. The climate impact of the energy use of boreal stemwood was found to be higher in the short term and lower in the long term in comparison with fossil fuels that emit an identical amount of CO2 in combustion, due to the changes implied for forest carbon stocks. The climate impacts of the energy use of boreal stemwood were found to be higher than previous estimates suggest for forest residues and stumps. The product lifetime was found to have a much higher influence on the climate impacts of wood-based value chains than the origin of the stemwood from either thinnings or final fellings. Climate neutrality seems likely only when almost all the carbon of harvested wood is stored in long-lived wooden products. In their current form, land use impacts cannot be modelled with a high degree of certainty nor communicated with an adequate level of clarity to decision makers. Academia needs to keep improving the modelling framework and, more importantly, clearly communicate to decision makers the limited certainty on whether land-use-intensive activities can help in meeting the strict mitigation targets we are globally facing.
Abstract:
The aim of this master’s thesis is to research and analyze how purchase invoice processing can be automated and streamlined in a system renewal project. The impacts of workflow automation on invoice handling are studied in terms of time, cost and quality. Purchase invoice processing has a lot of potential for automation because of its labor-intensive and repetitive nature. As a case study combining both qualitative and quantitative methods, the topic is approached from a business process management point of view. The current process was first explored through interviews and workshop meetings to create a holistic understanding of the process at hand. Requirements for process streamlining were then researched, focusing on specified vendors and their purchase invoices, which helped to identify the critical factors for successful invoice automation. To optimize the flow from invoice receipt to approval for payment, the invoice receiving process was outsourced and the automation functionalities of the new system were utilized in invoice handling. The quality of invoice data and the need for simple, structured purchase order (PO) invoices were emphasized in the system testing phase. Hence, consolidated invoices containing references to multiple PO or blanket release numbers should be simplified in order to use automated PO matching. With non-PO invoices, it is important to receive the buyer reference details in an applicable invoice data field so that automation rules can be created to route invoices to a review and approval flow. At the beginning of the project, invoice processing was seen as inefficient both time- and cost-wise, and it required a lot of manual labor to carry out all tasks. In accordance with the testing results, it was estimated that over half of the invoices could be automated within a year after system implementation. Processing times could be reduced remarkably, which would then result in savings of up to 40 % in annual processing costs. Due to several advancements in the purchase invoice process, business process quality could also be perceived as improved.
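As an illustration only (not from the thesis; all field names, return values and the tolerance are hypothetical), a minimal sketch of the kind of routing rules described above, in which PO invoices are matched against open purchase orders and non-PO invoices are routed to a review-and-approval flow based on the buyer reference:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of invoice routing logic of the kind described in the
# abstract. All field names and thresholds are hypothetical.

@dataclass
class Invoice:
    number: str
    amount: float
    po_number: Optional[str] = None        # purchase order reference, if any
    buyer_reference: Optional[str] = None  # used to route non-PO invoices

def route_invoice(invoice: Invoice, open_po_amounts: dict,
                  tolerance: float = 0.05) -> str:
    """Return a routing decision for a single invoice."""
    if invoice.po_number:
        po_amount = open_po_amounts.get(invoice.po_number)
        if po_amount is None:
            return "exception: unknown PO"
        # Automated PO matching within a relative tolerance.
        if abs(invoice.amount - po_amount) <= tolerance * po_amount:
            return "auto-approve"
        return "exception: amount mismatch"
    if invoice.buyer_reference:
        # Non-PO invoice: route to the reviewer identified by the reference.
        return f"review-and-approve: {invoice.buyer_reference}"
    return "exception: missing reference"

# Example usage
print(route_invoice(Invoice("INV-1", 1000.0, po_number="PO-42"),
                    {"PO-42": 1010.0}))          # -> auto-approve
```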
Abstract:
The present study describes an auxiliary tool for the diagnosis of left ventricular (LV) segmental wall motion (WM) abnormalities based on color-coded echocardiographic WM images. An artificial neural network (ANN) was developed and validated for grading LV segmental WM using data from color kinesis (CK) images, a technique developed to display the timing and magnitude of global and regional WM in real time. We evaluated 21 normal subjects and 20 patients with LVWM abnormalities revealed by two-dimensional echocardiography. CK images were obtained in two sets of viewing planes. A method was developed to analyze CK images, providing quantitation of fractional area change in each of the 16 LV segments. Two experienced observers analyzed LVWM from two-dimensional images and scored them as: 1) normal, 2) mild hypokinesia, 3) moderate hypokinesia, 4) severe hypokinesia, 5) akinesia, and 6) dyskinesia. Based on expert analysis of 10 normal subjects and 10 patients, we trained a multilayer perceptron ANN using a back-propagation algorithm to provide automated grading of LVWM, and this ANN was then tested on the remaining subjects. Excellent concordance between expert and ANN analysis was shown by ROC curve analysis, with a measured area under the curve of 0.975. An excellent correlation was also obtained for the global LV segmental WM index between expert and ANN analysis (R² = 0.99). In conclusion, the ANN showed high accuracy for automated semi-quantitative grading of WM based on CK images. This technique can be an important aid, improving diagnostic accuracy and reducing inter-observer variability in scoring segmental LVWM.
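For illustration, here is a minimal sketch of the kind of multilayer perceptron classifier described above, using scikit-learn; the input layout (16 fractional-area-change values per study) and the six grades follow the abstract, but the network size, solver and the synthetic data are assumptions, not the study's actual configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Sketch of a multilayer perceptron for grading LV segmental wall motion.
# Inputs: fractional area change for each of the 16 LV segments (as in the
# abstract); outputs: scores 1..6 (normal .. dyskinesia). Network size,
# solver and the random data below are illustrative assumptions.
rng = np.random.default_rng(0)
X_train = rng.random((20, 16))           # 20 training studies, 16 segments
y_train = rng.integers(1, 7, size=20)    # expert scores 1..6

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

X_test = rng.random((5, 16))             # remaining studies
print(model.predict(X_test))             # predicted wall motion grades
```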