965 results for Speaker verification


Relevance:

10.00%

Publisher:

Abstract:

Many different factors affect indoor air quality. Indoor air is usually of good quality when the users of the facilities are satisfied with it. To achieve good indoor air, the ventilation, the heating equipment, the cleaning and the building's structures must all be in order. This work was done for the City of Helsinki Premises Centre (Tilakeskus) and is part of a year-long project investigating the adequacy of cleaning and its effect on the indoor air of schools. The goal of the work is to create a quality-assurance method for cleaning, with which the realized level of cleaning can be verified. In addition, the project investigates whether the cleaning service contract can be properly implemented in practice. The methods developed in this work are subjective and objective dustiness assessments and a questionnaire for school teachers. The limit values of the quality levels in the developed method follow the values presented in the INSTA 800 standard. The developed method was used in the baseline and zero-level measurements that were carried out, and the results obtained corresponded to the visual observations made in the facilities.

Abstract:

The purpose of this work is to obtain a better understanding of how ultrasound can be applied to the mixing of fluid media. The research considers both Newtonian and non-Newtonian fluids. The application of ultrasound to liquids is modelled in the COMSOL Multiphysics software, with the influence of the ultrasound introduced through the wave equation. Turbulence in the Newtonian fluid is modelled with the k-ε model, while the modelling of ultrasound-assisted mixing in non-Newtonian fluids is based on the power law. Two experimental methods are used to verify the modelling results: Particle Image Velocimetry and mixing-time measurements. Particle Image Velocimetry captures the velocity flow field continuously and gives a detailed picture of the liquid dynamics. The second verification method compares the time needed to reach homogeneity; experimentally, mixing time is obtained by conductivity measurements, while in the modelling part it is obtained with a dedicated COMSOL Multiphysics module, the transport of diluted species. Both the experimental and the modelling parts show a similar radial flow mechanism under ultrasound: fluid moves from the horn tip to the bottom and returns along the walls. Velocity profiles from modelling and experiment agree for the Newtonian fluid but not for the non-Newtonian fluid. The thesis also outlines a development track for the modelling of ultrasound-assisted mixing.
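A minimal sketch of the power-law (Ostwald-de Waele) viscosity model that the non-Newtonian simulations are based on; the consistency index K and flow index n below are illustrative values, not parameters from the thesis:

```python
def apparent_viscosity(shear_rate, k, n):
    """Power-law (Ostwald-de Waele) model: mu = K * gamma_dot**(n - 1).

    k: consistency index (Pa*s**n), n: flow behaviour index (dimensionless).
    Both values used below are illustrative, not fitted to any real fluid.
    """
    return k * shear_rate ** (n - 1)

# A shear-thinning fluid (n < 1) becomes less viscous at higher shear rates.
mu_low = apparent_viscosity(1.0, k=0.5, n=0.6)     # 0.5 Pa*s at gamma_dot = 1 1/s
mu_high = apparent_viscosity(100.0, k=0.5, n=0.6)  # markedly lower
```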

Abstract:

Centrifugal pumps are among the major energy-consuming end-devices in developed countries, both in the industrial and in the services sector. According to recent studies, as much as 30 % of the energy used in pumping systems could be saved by choosing devices more carefully and by better system design. One of the most efficient and affordable ways to decrease the energy consumption of a pumping system is to substitute traditionally used flow control methods, such as valve control, with modern variable speed drive (VSD) control. In this thesis, a Microsoft Excel-based program, Savings Calculator for Centrifugal Pumps (SCCP), is designed. SCCP calculates the achievable energy and financial savings when throttle control is substituted by VSD control in a pumping system. Compared to similar existing programs, the goal is to make the SCCP calculations more accurate while requiring less input information. Some useful additional features are also added to make the program more user friendly. The reliability of the program's calculations seems to vary from case to case: the results correspond accurately to laboratory measurements, but in some cases high deviations occur when the results are compared to the pump information specified by the manufacturer. On the basis of the verification in this thesis, SCCP appears to be at least as accurate as similar existing programs, and it can support the investment decision on whether to adopt a VSD.
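The core of the savings argument is the pump affinity laws: shaft power falls with the cube of the speed ratio, whereas throttling keeps the pump running near full power. A sketch under that simplification (the figures are illustrative; SCCP's actual calculation is more detailed):

```python
def affinity_power(p_nominal, speed_ratio):
    """Pump affinity laws: shaft power scales with the cube of the speed ratio.

    This ignores motor/drive efficiencies and static head, which SCCP would
    account for; it only illustrates the dominant cubic term.
    """
    return p_nominal * speed_ratio ** 3

# Running at 80 % speed with a VSD instead of throttling at full speed:
p_full = 10.0                          # kW at nominal speed (illustrative)
p_vsd = affinity_power(p_full, 0.8)    # 10 * 0.8**3 = 5.12 kW
annual_savings_kwh = (p_full - p_vsd) * 8760  # assuming continuous operation
```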

Abstract:

Rolling element bearings are essential components of rotating machinery. The spherical roller bearing (SRB) is one variant seeing increasing use, because it is self-aligning and can support high loads. It is becoming increasingly important to understand how the SRB responds dynamically under a variety of conditions. This doctoral dissertation introduces a computationally efficient, three-degree-of-freedom SRB model that was developed to predict the transient dynamic behavior of a rotor-SRB system. In the model, bearing forces and deflections were calculated as a function of contact deformation and bearing geometry parameters according to nonlinear Hertzian contact theory. The results reveal how some of the more important parameters, such as diametral clearance, the number of rollers, and the osculation number, influence ultimate bearing performance. Distributed defects, such as waviness of the inner and outer ring, and localized defects, such as inner and outer ring surface defects, are taken into consideration in the proposed model. Simulation results were verified against results obtained by applying the analytical formula for spherical roller bearing radial deflection and against commercial bearing analysis software. Following model verification, a numerical simulation was carried out for a full rotor-bearing system to demonstrate the application of the newly developed SRB model in a typical real-world analysis. The accuracy of the model was further verified by comparing measured to predicted behavior for equivalent systems.
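The nonlinear Hertzian load-deflection relation for roller (line) contact is commonly written F = K_c * delta**(10/9); a sketch with an illustrative stiffness constant, not the dissertation's actual parameter values:

```python
def hertz_line_contact_force(deflection, k_c=1.0e9):
    """Nonlinear Hertzian contact force for line contact: F = K_c * delta**(10/9).

    k_c (N/m**(10/9)) depends on bearing geometry, osculation and materials;
    the default here is purely illustrative.
    """
    if deflection <= 0.0:
        # Rollers cannot pull: no force when contact is lost (e.g. clearance).
        return 0.0
    return k_c * deflection ** (10.0 / 9.0)
```

The `<= 0` branch is what makes the resulting rotor-bearing equations of motion piecewise and strongly nonlinear, which is why a transient simulation is needed.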

Abstract:

Active magnetic bearings offer many advantages that have brought new applications to industry. However, as with all new technology, active magnetic bearings also have downsides, one of which is the low level of standardization. This thesis studies mainly the ISO 14839 standard and, more specifically, its system verification methods. These verification methods are applied in a practical test with an existing active magnetic bearing system. The system is simulated in Matlab using a rotor-bearing dynamics toolbox, but this study does not include the exact simulation code or a direct algebraic derivation. It does, however, demonstrate that the standardized verification methods can be applied to practical problems.
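ISO 14839-3 judges an AMB system partly by the peak of the closed-loop sensitivity function, assigning it to zones A-D. A sketch using the commonly cited 9.5/12/14 dB zone limits; verify the exact limits against the standard before relying on them:

```python
import math

def sensitivity_zone(peak_abs):
    """Classify the peak sensitivity |S|max into ISO 14839-3 style zones.

    The zone boundaries used here (9.5, 12 and 14 dB, i.e. factors of roughly
    3, 4 and 5) are the commonly cited values; check them against the actual
    standard text before any real assessment.
    """
    peak_db = 20.0 * math.log10(peak_abs)
    if peak_db < 9.5:
        return "A"   # newly commissioned machines
    if peak_db < 12.0:
        return "B"   # acceptable for unrestricted long-term operation
    if peak_db < 14.0:
        return "C"   # restricted operation, remedy advised
    return "D"       # severe enough to cause damage
```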

Abstract:

This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the major difficulty lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 specialty, the hot leg loop seals, poses a general functional challenge to system codes but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can in part be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that safety analysis inevitably involves significant uncertainties that are not statistically quantifiable; these need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and on the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.

Abstract:

Extant research on exchange-listed firms has acknowledged that the concentration of ownership and the identity of owners make a difference. In addition, studies indicate that firms with a dominant owner outperform firms with dispersed ownership. During the last few years, scholars have identified one group of owners in particular whose ownership stake in a publicly listed firm is positively related to performance: the business family. While acknowledging that family firms represent a unique organizational form, scholars have drawn on various concepts and theories to understand how the family influences organizational processes and firm performance. Despite the multitude of research, scholars have not been able to present clear results on how firm performance is actually affected by the family. In other words, studies comparing the performance of listed family firms and other types of firms have remained descriptive in nature, since they lack empirical data and confirmation from family business representatives. What seems to be missing is a convincing theory that links family involvement to its behavioral consequences. Accordingly, scholars have not yet come to a mutual understanding of what precisely constitutes a family business; the variety of different definitions and theories has, for instance, made the comparison of different results difficult. These two issues have hampered the development of a rigorous theory of the family business. The overall objective of this study is to describe and understand how the family as a dominant owner can enhance firm performance and act as a source of sustainable success in listed companies. In more detail, in order to develop an understanding of the unique factors that can act as competitive advantages for listed family firms, this study is based on a qualitative approach and aims at theory development, not theory verification.
The data in this study consist of 16 thematic interviews with CEOs, board members, supervisory board chairs, and founders of Finnish listed family firms. The study consists of two parts. The first part introduces the research topic, research paradigm, methods, and publications, and discusses the overall outcomes and contributions of the publications. The second part consists of four publications that address the research questions from different viewpoints. The analyses indicate that family ownership in listed companies represents a structure that differs from the traditional agency and stewardship views as well as from the resource-based and stakeholder views. As opposed to these theories and to shareholder capitalism, which consider humans individualistic, opportunistic, and self-serving and assume that an investor's behavior is driven by incentives and the motivation to maximize private profits, the family owners form a collective social unit that is motivated to act together toward a mutual purpose or benefit. In addition, the socio-emotional and psychological elements of ownership, rather than its legal and financial dimensions, define the family members as owners. That is, the family's collective psychological ownership over the business (F-CPO) can be seen as a construct that comprehensively captures the fusion between the family and the business. Moreover, it captures the realized, rather than merely potential, family influence on and interaction with the business. It thereby brings more theoretical clarity to the nature of this fusion and offers a solution to the problem of the family business definition. This doctoral dissertation provides academics, policy-makers, family business practitioners, and society at large with many implications concerning family and business relationships.

Abstract:

Nowadays, computer-based systems tend to become more complex and to control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human life as well as significant damage to the environment; therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, various industrial standards prescribe the use of rigorous techniques for the development and verification of such systems: the more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on it should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined, for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases.
This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates the elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
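Event-B discharges invariant-preservation obligations symbolically, by theorem proving. As a loose illustration of what such an obligation means (every reachable state of the machine satisfies the invariant), the toy checker below brute-forces it over a small, invented transition system; real Event-B models are proved, not enumerated:

```python
from collections import deque

def check_invariant(init, transitions, invariant):
    """Exhaustively explore reachable states and check that the invariant
    holds in each one: a brute-force stand-in for the proof obligations a
    theorem prover discharges symbolically over infinite state spaces."""
    seen, frontier = {init}, deque([init])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return False, state  # counterexample found
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

# Invented toy machine: a counter incremented modulo 4; invariant 0 <= c < 4.
ok, cex = check_invariant(0, lambda c: [(c + 1) % 4], lambda c: 0 <= c < 4)
```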

Abstract:

In the elevator industry, the quality requirements for the electric motors used as hoisting machines have tightened in recent years. In particular, the noise and mechanical vibration produced by the machines have come under continuously tightening scrutiny. Noise caused by vibration transmitted to the elevator car and to the structures surrounding the elevator is one of the factors that most significantly affect the perceived quality of an elevator, and the hoisting machine is one of the most important sources of noise and vibration in an elevator system. These factors can be minimized through machine design. In electrical machine design, the use of finite element methods (FEM) has become established in the most demanding applications. At Kone Oyj, axial-flux permanent magnet synchronous machines (AFPMSM) are used as hoisting machines, and three different approaches are commonly used for their FEM simulation. Each of these alternatives has its own benefits and drawbacks; for design purposes it is important to choose the right method to maximize the ratio of informativeness to computation time, and the correctness of the obtained results is also very important. The goal of this master's thesis is to develop a system with which the forces of an AFPMS machine can be measured at a detailed level. The system makes it possible to examine the correctness of the results of the FE methods in use as well as the generation mechanisms of both noise and vibration. The system is also intended to deepen Kone Oyj's know-how on the operation of AFPMS machines. This work presents non-idealities of the AFPMS machine that may affect the design of the measurement system; noise, as one of the machine's non-idealities, is also examined. To enable the comparison of FE methods and the examination of result correctness in accordance with the goals of the work, the most common FE methods for AFPMS machines are reviewed as well. The result of the work is a design for a measurement system that enables six-degree-of-freedom force measurement for each machine magnet with a resolution below 1 N.
The functionality of the designed system has been examined using FE methods, and the capability of the force sensor used in the system has been verified with reference measurements. The designed measurement system enables the detailed examination of several different non-idealities of an electric motor. Applying the measurement concept to the study of other machines offers opportunities for further research.

Abstract:

Power consumption is still an issue in wearable computing applications today. The aim of the present paper is to raise awareness of the power consumption of wearable computing devices in specific scenarios, so that energy-efficient wireless sensors for context recognition in wearable computing applications can be designed in the future. The approach is based on a hardware study. The objective of this paper is to analyze and compare the total power consumption of three representative wearable computing devices in realistic scenarios: Display, Speaker, Camera and microphone, Transfer by Wi-Fi, Monitoring outdoor physical activity, and Pedometer. A scenario-based energy model is also developed. The Samsung Galaxy Nexus I9250 smartphone, the Vuzix M100 Smart Glasses and the SimValley Smartwatch AW-420.RX are the three devices representative of their form factors. The power consumption is measured using PowerTutor, an Android energy profiler application with a logging option; since PowerTutor relies on unknown device parameters, its readings are adjusted with a USB power meter. The results show that the screen size is the main parameter influencing the power consumption. The power consumption for an identical scenario varies across the wearable devices, meaning that other components, parameters or processes might affect the power consumption; further study is needed to explain these variations. This paper also shows that different inputs (a touchscreen is more efficient than button controls) and outputs (the speaker is more efficient than the display) affect the energy consumption in different ways. Finally, the paper gives recommendations for reducing energy consumption in healthcare wearable computing applications using the energy model.
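A scenario-based energy model of this kind reduces to summing average power times duration over the phases of a scenario (E = sum of P_i * t_i). A sketch with invented figures, not measurements from the paper:

```python
def scenario_energy(phases):
    """Scenario-based energy model: total energy is the sum of each phase's
    average power draw times its duration, E = sum(P_i * t_i).

    The phase list below is a hypothetical 'Transfer by Wi-Fi' scenario;
    the wattages are illustrative, not the paper's measured values.
    """
    return sum(power_w * duration_s for power_w, duration_s in phases)

# Idle baseline, a radio burst, then idle again: (watts, seconds) pairs.
scenario = [(0.3, 10.0), (1.8, 5.0), (0.3, 10.0)]
energy_j = scenario_energy(scenario)  # 0.3*10 + 1.8*5 + 0.3*10 = 15.0 J
```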

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the keys to succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work; testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is therefore reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software functions correctly; usually, many other aspects of the software, such as performance, security, scalability and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects such as performance or security apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models for solving some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor tool support or the lack of it. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. To demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of different natures, whether due to failures of system components or to varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence, a resilient system should have both advanced monitoring and error detection capabilities to recognise changes and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. The design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature: to ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure the scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness.
Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, together with the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, achieving resilience also requires analysing a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from the robotics, space, healthcare and cloud domains.
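SimPy-style discrete-event simulation boils down to an event queue processed in time order. To keep the sketch dependency-free, the loop below uses the standard library's `heapq` instead of SimPy itself (which builds generator-based processes on top of the same idea); the job timings are invented:

```python
import heapq

def simulate(events):
    """Minimal discrete-event loop: repeatedly pop the earliest scheduled
    event and advance the simulation clock to its timestamp.

    events: list of (time, seq, name) tuples; seq breaks ties at equal times.
    Returns the final clock value and the chronological event log.
    """
    clock, log = 0.0, []
    heapq.heapify(events)
    while events:
        clock, _, name = heapq.heappop(events)
        log.append((clock, name))
    return clock, log

# Two hypothetical jobs arriving at t=1 and t=2, each finishing 3 units later.
end, log = simulate([(1.0, 0, "job1-start"), (4.0, 1, "job1-done"),
                     (2.0, 2, "job2-start"), (5.0, 3, "job2-done")])
```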

Abstract:

An orthogonal design was employed to study the effect of extraction time, temperature and liquid-to-solid ratio on the production of antioxidant polysaccharides from leaves of Gynura bicolor (PLG). Analysis of variance was performed on the data obtained; the most relevant variable was extraction time. A liquid-to-solid ratio of 30:1 (v/w), a temperature of 80 °C and an extraction time of 3 h were found to be optimal for PLG, and the optimal extraction yield of 4.9% was confirmed in an additional verification test. The hydroxyl radical-scavenging activity, reducing power and ferrous ion chelating ability of PLG were determined. PLG possesses concentration-dependent antioxidant potency, with IC50 values of 4.67, 0.24 and 4.31 mg/mL for hydroxyl radical-scavenging ability, ferrous ion chelating ability and reducing power, respectively. The results suggest that G. bicolor polysaccharides could be a potential source of natural antioxidants and a contributor to the health benefits of G. bicolor.
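An IC50 is typically read off the dose-response curve; a sketch estimating it by linear interpolation between the two measured concentrations bracketing the 50 % response (the data points are invented for illustration, not the paper's measurements):

```python
def ic50_interpolated(concentrations, response_pct):
    """Estimate IC50 by linear interpolation between the two measured
    concentrations that bracket a 50 % response.

    Assumes responses increase with concentration; real analyses often fit a
    sigmoidal dose-response model instead of interpolating linearly.
    """
    pairs = sorted(zip(concentrations, response_pct))
    for (c1, y1), (c2, y2) in zip(pairs, pairs[1:]):
        if y1 <= 50.0 <= y2:
            return c1 + (50.0 - y1) * (c2 - c1) / (y2 - y1)
    raise ValueError("50 % response is not bracketed by the data")

# Hypothetical dose-response points: (mg/mL, % radical scavenging)
ic50 = ic50_interpolated([1.0, 2.0, 4.0, 8.0], [20.0, 35.0, 48.0, 62.0])
```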

Abstract:

It has long been known that amino acids are the building blocks of proteins and govern their folding into specific three-dimensional structures. However, the details of this process are still unknown and represent one of the main problems in structural bioinformatics, a highly active research area focused on the prediction of three-dimensional structure and its relationship to protein function. The protein structure prediction procedure encompasses several steps, from searches and analyses of sequences and structures, through sequence alignment, to the creation of the structural model. Careful evaluation and analysis ultimately result in a hypothetical structure, which can be used to study biological phenomena in, for example, research at the molecular level, biotechnology and especially drug discovery and development. In this thesis, the structures of five proteins were modeled with template-based methods, which use proteins with known structures (templates) to model related or structurally similar proteins. The resulting models were an important asset for the interpretation and explanation of biological phenomena, such as the amino acids and interaction networks that are essential for the function and/or ligand specificity of the studied proteins. The five proteins represent different case studies, each with its own challenges, such as varying template availability, which led to different structure prediction processes. This thesis presents the techniques and considerations that should be taken into account in the modeling procedure to overcome limitations and produce a reliable hypothetical three-dimensional structure.
As each project shows, the reliability depends heavily on the extensive incorporation of experimental data and the known literature. Although experimental verification of in silico results is always desirable to increase reliability, the presented projects show that experimental studies can also greatly benefit from structural models: with the help of in silico studies, experiments can be targeted and precisely designed, saving both money and time. As the programs used in structural bioinformatics are constantly improved and the range of templates increases through structural genomics efforts, the mutual benefits between in silico and experimental studies become even more prominent. Hence, reliable models of protein three-dimensional structures, achieved through careful planning and thoughtful execution, are, and will continue to be, valuable and indispensable sources of structural information to be combined with functional data.

Abstract:

The aim of this study was to contribute to current knowledge-based theory by addressing a research gap in the empirically proven determination of the simultaneous but differentiable effects of intellectual capital (IC) assets and knowledge management (KM) practices on organisational performance (OP). The analysis was built on past research and on theoreticised interactions between latent constructs: IC and KM were specified using survey-based items measured from a sample of Finnish companies, and the dependent construct for OP was determined using information available from financial databases. Two measures widely used and commonly recommended in the management science literature, the return on total assets (ROA) and the return on equity (ROE), were calculated for OP. The relationship between IC and KM impacting OP could thus be investigated, in relation to the hypotheses founded, using objectively derived performance indicators. Using financial OP measures also strengthened the dynamic features of the data needed for analysing simultaneous and causal dependences between the modelled constructs, which were specified using structural path models. The parameters of the structural path models were estimated using a partial least squares-based regression estimator. The results showed that the path dependencies between IC and OP, or KM and OP, were always insignificant when analysed separately from any other interactions or indirect effects caused by simultaneous modelling, regardless of whether ROA or ROE was used as the OP measure. The dependency between the constructs for KM and IC appeared to be very strong and was always significant when modelled simultaneously with other possible interactions between the constructs, using either ROA or ROE to define OP.
This study, however, did not find statistically unambiguous evidence for the hypothesised causal mediation effects, which suggest, for instance, that the effects of KM practices on OP are mediated by the IC assets. Since some indication of fluctuating causal effects was observed, it was concluded that further studies are needed to verify the fundamental and likely hidden causal effects between the constructs of interest. It was therefore also recommended that complementary modelling and data processing measures be conducted to elucidate whether mediation effects occur between IC, KM and OP; their verification requires further investigation of the measured items and can be built on the findings of this study.