965 results for Speaker verification
Abstract:
The EU is launching an environmental technology verification scheme intended to provide users and investors with independent information on the functionality and performance of innovative technologies and to improve their market position. Verification means a third-party process or mechanism by which the operation and performance of a product can be demonstrated. National environmental technology verification schemes are already in use in the United States and Canada, among others. In Europe, the scheme is expected to be introduced in 2011–2012. In Finland, about 300 permit and notification decisions concerning contaminated soil remediation projects are currently issued annually. In about 85 per cent of the sites, soil excavation and replacement is used as the remediation method. Excavation and replacement will, at least for the time being, remain the most common remediation method, but the use of in situ methods, among others, is expected to increase. The aim of this Master's thesis was to assess whether verification can be used to promote high-quality treatment of contaminated soils and whether verification can speed up the market entry of innovative remediation methods. The topic was examined through, among other things, two different types of remediation methods for contaminated soil and groundwater: reactive barriers (in situ) and bitumen stabilisation (ex situ). The performance of contaminated soil remediation methods depends on many factors, some of which cannot be controlled or modelled reliably. Verification is therefore best suited to demonstrating the performance of equipment or of treatment methods simpler than contaminated-soil (PIMA) remediation methods. Verification may nevertheless work well in contaminated-soil remediation, for example as an informational steering instrument. The verification work phases for reactive barriers and bitumen stabilisation are very similar; the main difference is that for barriers the site where the barrier is installed must also be characterised. The operation of reactive barriers depends on many environmental factors, unlike bitumen stabilisation, which is carried out with stand-alone equipment. Based on the results, it can be generalised that verification is better suited to ex situ than to in situ remediation methods.
Abstract:
This work is intended as the study and phonetic correction of a speaker whose first language is Spanish and whose second language is Catalan, and who has difficulty pronouncing the palatal laterals. It is therefore the study of one very specific case. The study follows the verbotonal method of phonetic correction, since it also aims to test the usefulness of this method and the results it can achieve.
Abstract:
In this paper we describe three melodic patterns of absolute interrogatives from a phonetic point of view, obtained from a corpus in Goiás (Brazil). The patterns are: a) Rising Final Inflection (30% to 52%), b) Rising-Falling Final Inflection, c) High Nucleus Final Inflection. These patterns have been established from the acoustic analysis and standardisation of 55 questions and from the verification of their validity in a perception test. We compared them with interrogative patterns obtained in different parts of Brazil and also in two Romance languages, Spanish and Catalan.
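For illustration, the standardisation step can be sketched as follows: absolute F0 values (in Hz) measured on the vowels are converted into relative percentage changes so that contours from different speakers become comparable. Below is a minimal Python sketch of that idea; the function name and the sample contour are hypothetical, not taken from the paper.

    # Convert absolute F0 values (Hz) into percentage changes relative to the
    # preceding value, so melodic contours become speaker-independent.
    def standardise(f0_hz):
        changes = [100.0]  # the first vowel serves as the 100% reference
        for prev, curr in zip(f0_hz, f0_hz[1:]):
            changes.append(round(100.0 * curr / prev, 1))
        return changes

    # Hypothetical contour whose final inflection rises about 40%, i.e. within
    # the 30%-52% band reported for the Rising Final Inflection pattern.
    print(standardise([180, 175, 170, 238]))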
Abstract:
In this project, different models of UHF-band RFID tags were designed and simulated on different classes of substrates and conductive inks, with the aim of studying the viability of Printed Electronics technology for their physical implementation. Starting from two configurations already described in the literature, these RFID tags were modelled electromagnetically with the ADS software and their frequency response was simulated. Secondly, in order to evaluate their performance, the read range of these RFID tags was also plotted as a function of the conductive inks and substrates. Subsequently, several fabrication trials were carried out using a screen-printing-based method, together with experimental measurement of the read distance. Finally, based on the results obtained, it was concluded that it is feasible to produce RFID tags with this printing technique, although, pending experimental verification, only at the simulation level.
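To give an idea of how a simulated read range is obtained, the sketch below applies the Friis-based formula commonly used for passive UHF tags, with the chip-antenna power transmission coefficient computed from the two impedances. All numeric values (frequency, EIRP, tag gain, impedances, chip sensitivity) are illustrative assumptions, not results from the project.

    import math

    C = 3e8  # speed of light, m/s

    # tau = 4*Rc*Ra / |Zc + Za|^2: fraction of available power accepted by the chip.
    def power_transmission(z_chip, z_ant):
        return 4 * z_chip.real * z_ant.real / abs(z_chip + z_ant) ** 2

    # Friis-based theoretical read range of a passive UHF RFID tag.
    def read_range(freq_hz, eirp_w, tag_gain, z_chip, z_ant, p_chip_w):
        lam = C / freq_hz
        tau = power_transmission(z_chip, z_ant)
        return (lam / (4 * math.pi)) * math.sqrt(eirp_w * tag_gain * tau / p_chip_w)

    # 868 MHz, 3.28 W EIRP (EU limit), 2 dBi tag gain, -18 dBm chip sensitivity:
    print(read_range(868e6, 3.28, 10 ** (2 / 10), complex(15, -150), complex(15, 150),
                     10 ** (-18 / 10) / 1000))

Lossy conductive inks mainly lower the antenna's radiation efficiency and shift its impedance, which enters this calculation through the tag gain and the transmission coefficient tau.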
Abstract:
Human activities have resulted in increased nutrient levels in many rivers all over Europe. Sustainable management of river basins demands an assessment of the causes and consequences of human alteration of nutrient flows, together with an evaluation of management options. In the context of an integrated and interdisciplinary environmental assessment (IEA) of nutrient flows, we present and discuss the application of the nutrient emission model MONERIS (MOdelling Nutrient Emissions into River Systems) to the Catalan river basin, La Tordera (north-east Spain), for the period 1996–2002. After a successful calibration and verification process (Nash-Sutcliffe efficiencies E=0.85 for phosphorus and E=0.86 for nitrogen), the application of the model MONERIS proved to be useful in estimating nutrient loads. Crucial for model calibration, in-stream retention was estimated to be about 50% of nutrient emissions on an annual basis. Through this process, we identified the importance of point sources for phosphorus emissions (about 94% for 1996–2002), and of diffuse sources, especially inputs via groundwater, for nitrogen emissions (about 31% for 1996–2002). Despite hurdles related to model structure, observed loads, and input data encountered during the modelling process, MONERIS provided a good representation of the major interannual and spatial patterns in nutrient emissions. An analysis of the model uncertainty and sensitivity to input data indicates that the model MONERIS, even in data-starved Mediterranean catchments, may be profitably used by water managers for evaluating quantitative nutrient emission scenarios in river basin management. As an example of scenario modelling, an analysis of the changes in nutrient emissions under two different future scenarios allowed the identification of a set of relevant measures to reduce nutrient loads.
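For reference, the Nash-Sutcliffe efficiency used to judge the calibration is straightforward to compute; the sketch below uses made-up load values, not the La Tordera data.

    # Nash-Sutcliffe efficiency: E = 1 - sum((O-S)^2) / sum((O - mean(O))^2).
    # E = 1 means a perfect fit; E < 0 means the model predicts worse than
    # simply using the observed mean.
    def nash_sutcliffe(observed, simulated):
        mean_obs = sum(observed) / len(observed)
        err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        var = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - err / var

    obs = [12.0, 15.5, 9.8, 20.1, 17.3]   # hypothetical observed loads
    sim = [11.4, 16.0, 10.5, 19.0, 18.2]  # hypothetical simulated loads
    print(nash_sutcliffe(obs, sim))        # ~0.95 for these illustrative values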
Abstract:
Digitized on 14 January 2009.
Abstract:
This paper reports the development of an easy, fast and effective procedure, based on the verification of the ideal gas law in splitless injection systems, for improving the instrumental response. Results for a group of pesticides were used to demonstrate the suitability of the approach. The procedure helps establish experimental parameters on theoretical grounds. The improved instrumental response allowed extraction with smaller sample volumes, the minimization of time and costs, and the simplification of sample preparation.
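The underlying calculation is the classic solvent-expansion check: the injected liquid flash-evaporates in the inlet, and the ideal gas law pV = nRT gives the vapour volume that must fit into the liner. A hedged worked example with illustrative numbers (hexane in a typical splitless inlet), not figures from the paper:

    R = 8.314  # gas constant, J/(mol K)

    # Vapour volume (in microlitres) of an injected liquid after evaporation,
    # from the ideal gas law V = nRT/p.
    def vapour_volume_ul(inj_ul, density_g_ml, molar_mass, t_k, p_pa):
        n = inj_ul * 1e-3 * density_g_ml / molar_mass  # moles of solvent injected
        return n * R * t_k / p_pa * 1e9                # m^3 -> microlitres

    # 1 uL of hexane (0.66 g/mL, 86.18 g/mol) at a 250 C inlet and 150 kPa:
    print(vapour_volume_ul(1.0, 0.66, 86.18, 523.15, 150e3))  # ~220 uL of vapour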
Abstract:
This dissertation considers the segmental durations of speech from the viewpoint of speech technology, especially speech synthesis. The idea is that better models of segmental durations lead to higher naturalness and better intelligibility, key factors for the usability and generality of synthesized speech technology. Even though the studies are based on Finnish speech material, the approaches apply to other languages as well, probably because most of the studies in this dissertation concern universal effects that take place at utterance boundaries. The methods developed and used here are likewise suitable for studies of other languages. The work is based on two corpora, one of news-reading speech and one of sentences read aloud. One corpus is read by a single 39-year-old male speaker, while the other contains several speakers in various situations. Using two corpora serves a twofold purpose: it allows a comparison of the corpora and gives a broader view of the matters of interest. The dissertation begins with an overview of the phonemes and the quantity system of the Finnish language. In particular, it covers the intrinsic durations of phonemes and phoneme categories, as well as the durational difference between short and long phonemes. The phoneme categories are introduced to address the problem of the variability of speech segments. The dissertation then covers boundary-adjacent effects on segmental durations. In utterance-initial positions there appears to be initial shortening in Finnish, but the result depends on the level of detail and on the individual phoneme: at the phoneme level the shortening or lengthening affects only the very first phonemes of an utterance, whereas at the word level the effect, on average, shortens the whole first word. The dissertation establishes the effect of final lengthening in Finnish. The effect had long been an open question, and Finnish was the last missing piece needed to regard final lengthening as a universal phenomenon. Final lengthening is studied from various angles, and it is shown to be neither a mere effect of prominence nor an artefact of a speech corpus with high inter- and intra-speaker variation. The effect seems to extend from the final word to the penultimate word, and at the phoneme level it reaches a much wider area than the initial effect. The dissertation also presents a normalization method suitable for corpus studies of segmental durations. The method normalizes at the utterance level to capture the pattern of segmental durations within each utterance, which shields the results from various problematic sources of variation within the corpora. The normalization is used in the study of final lengthening to show that the results are not caused by variation in the material. The dissertation further demonstrates an implementation of speech synthesis on a mobile platform and evaluates its performance. The rule-based synthesis method itself runs in real time as a software solution, but the signal generation process slows the system down beyond real time. Future prospects of speech synthesis on limited platforms are discussed. Finally, the dissertation considers ethical issues in the development of speech technology. The main focus is on the development of speech synthesis with high naturalness, but the problems and solutions are applicable to other speech technology approaches as well.
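As an illustration of the normalization idea, the sketch below expresses each segment duration relative to the statistics of its own utterance; the log-duration z-score form is an assumption chosen for illustration, not necessarily the author's exact formula.

    import math

    # Utterance-level normalization: each segment duration is expressed relative
    # to the mean and spread of (log) durations within the same utterance, so
    # speaker- and tempo-related variation cancels out.
    def normalize_utterance(durations_ms):
        logs = [math.log(d) for d in durations_ms]
        mean = sum(logs) / len(logs)
        sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / len(logs))
        return [(x - mean) / sd for x in logs]

    # A lengthened final segment stands out regardless of the overall tempo:
    print(normalize_utterance([80, 70, 90, 75, 160]))  # last value clearly positive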
Abstract:
This Master's thesis was carried out as a commission from Sammet Dampers Oy. The company wants ever better results from its product group development projects, which places demands on the development process used in those projects. To reach these better results, the company must optimise and systematise the method it uses. The first objective of this work is to optimise the process model the company uses in product group development projects. In line with this objective, a new optimised product group development process model is created that meets the company's needs. This new model is documented as part of the company's enterprise resource planning system. The second objective is to apply the new optimised process model in the development project of the bell damper (kellopelti) product group. At the same time, this development project is used to run in the new process model as part of the company's operations. Within the scope of this thesis, the first part of the bell damper development project, the requirements definition process, is carried out, and the resulting implementation plan is presented. With the new product group development process model produced in this work, significant improvements can be achieved in the results of development projects in terms of time use, quality and costs.
Abstract:
Nowadays, software testing and quality assurance are highly valued in the software development process. Software testing is not a single concrete discipline; it is a process of validation and verification that starts with the idea of the future product and ends only when the product's maintenance ends. Industry places great weight on testing methods and tools that can be applied in the different testing phases. The initial objectives of this thesis were to provide a thorough literature review of the different testing phases and, for each phase, to identify a method that can be used effectively to improve software quality. The software testing phases chosen for study are: unit testing, integration testing, functional testing, system testing, acceptance testing and usability testing. The research showed that there are many software testing methods that can be applied at the different phases, and that in most cases the choice of method should depend on the type of software and its specification. For each phase, the thesis identifies a characteristic problem, suggests a method that can help eliminate it, and describes that method in detail.
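As a concrete illustration of the first phase, a unit test verifies a single function in isolation; the function and test cases below are hypothetical examples, not material from the thesis.

    # The unit under test: a small, isolated function.
    def is_valid_triangle(a, b, c):
        return a + b > c and b + c > a and a + c > b

    # Unit tests: each checks one behaviour, including a boundary case.
    def test_valid_triangle():
        assert is_valid_triangle(3, 4, 5)

    def test_degenerate_triangle_rejected():
        assert not is_valid_triangle(1, 2, 3)  # boundary: a + b == c

    if __name__ == "__main__":
        test_valid_triangle()
        test_degenerate_triangle_rejected()
        print("all unit tests passed")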
Abstract:
In this thesis three experiments with atomic hydrogen (H) at low temperatures T<1 K are presented. Experiments were carried out with two-dimensional (2D) and three-dimensional (3D) H gas, and with H atoms trapped in a solid H2 matrix. The main focus of this work is on interatomic interactions, which have specific features in each of the three systems considered. A common feature is the very high density of atomic hydrogen; the systems are close to quantum degeneracy. Short-range interactions in collisions between atoms are important in gaseous H. The system of H in H2 differs dramatically because the atoms remain fixed in the H2 lattice and the properties are governed by long-range interactions with the solid matrix and with other H atoms. The main tools in our studies were the methods of magnetic resonance, with electron spin resonance (ESR) at 128 GHz used as the principal detection method. For the first time in experiments with H in high magnetic fields and at low temperatures, we combined ESR and NMR to perform electron-nuclear double resonance (ENDOR) as well as coherent two-photon spectroscopy. This allowed us to distinguish between different types of interactions in the magnetic resonance spectra. Experiments with 2D H gas utilized the thermal compression method in a homogeneous magnetic field, developed in our laboratory. In this work, methods were developed for direct studies of 3D H at high density and for creating high-density samples of H in H2. We measured magnetic resonance line shifts due to collisions in the 2D and 3D H gases. First we observed that the cold collision shift in a 2D H gas composed of atoms in a single hyperfine state is much smaller than predicted by mean-field theory. This motivated us to carry out similar experiments with 3D H. In 3D H the cold collision shift was found to be an order of magnitude smaller for atoms in a single hyperfine state than for a mixture of atoms in two different hyperfine states. The collisional shifts were found to be in fair agreement with theory that takes into account the symmetrization of the wave functions of the colliding atoms. The origin of the small shift in 2D H composed of single-hyperfine-state atoms is not yet understood. The measurement of the shift in 3D H provides an experimental determination of the difference between the scattering lengths of ground-state atoms. The experiment with H atoms captured in an H2 matrix at temperatures below 1 K originated from our work with H gas. We found that samples of H in H2 formed during recombination of gas-phase H, enabling sample preparation at temperatures below 0.5 K. Alternatively, we created the samples by electron-impact dissociation of H2 molecules in situ in the solid. By the latter method we reached the highest densities of H atoms reported so far, 3.5(5)×10^19 cm^-3. The H atoms were found to be stable for weeks at temperatures below 0.5 K. The observation of dipolar interaction effects provides verification of the density measurement. Our results point to two different sites for H atoms in the H2 lattice. The steady-state nuclear polarizations of the atoms were found to be non-thermal. The possibility of further increasing the impurity H density is considered; at higher densities and lower temperatures it might be possible to observe phenomena related to quantum degeneracy in the solid.
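As background for the cold collision shift mentioned above, measured shifts are usually compared with a mean-field expression of the schematic form below (numerical factors of order unity, which depend on the quantum statistics and hyperfine composition of the colliding pair, are omitted; this is textbook background, not the thesis's exact formula):

    \[ h\,\Delta\nu \;\sim\; \frac{4\pi\hbar^{2}}{m}\, n\, \Delta a \]

where n is the gas density, m the atomic mass, and Δa the relevant difference of s-wave scattering lengths. Measuring Δν at a known density n thus determines Δa, which is how a line-shift measurement yields the difference of the ground-state scattering lengths.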
Abstract:
This work focuses on the design environment of the products manufactured by Metso Minerals and especially on the possibilities of developing it with new technologies. One of the most promising ways to improve operational reliability is to improve the maintainability of the equipment. Typically, maintainability is verified only when prototypes are tested in field conditions, at which point correcting the problems found is difficult. More attention should therefore be paid to maintainability already in the product development phase. This work investigates the possibilities offered by virtual techniques for improving these aspects. The second part of the work examines possibilities for developing risk analyses. The aim is to develop a workable method for performing risk analysis with the help of virtual prototypes and to test the use of commercial risk analysis applications. Finally, the integration of the new methods into the product development process is examined. The case studies carried out showed that virtual environments and techniques provide benefits in the early phases of product development. Based on feedback from a questionnaire survey, the virtual technology applications received an average overall grade of three on a scale of 1–5. The combined use of virtual techniques and risk analyses was tested successfully, but it still requires further development.
Abstract:
What makes necessary truths true? I argue that all truth supervenes on how things are, and that necessary truths are no exception. What makes them true are proofs. But if so, the notion of proof needs to be generalized to include verification-transcendent proofs, proofs whose correctness exceeds our ability to verify it. It is incumbent on me, therefore, to show that arguments such as Dummett's, to the effect that verification-transcendent truth is not compatible with the theory of meaning, are mistaken. The answer is that what we can conceive and construct far outstrips our actual abilities. I conclude by proposing a proof-theoretic account of modality, rejecting Armstrong's claim that modality can reside in non-modal truthmakers.
Abstract:
Evaluative sentences (moral judgments, expressions of taste, epistemic modals) are relative to the speaker's standards. Lately, a phenomenon has challenged the traditional explanation of this relativity: whenever two speakers disagree over such sentences, they contradict each other without either being at fault. Hence, it is thought that the correctness of the assertions involved must be relative to an unprivileged standard, not necessarily the speaker's. I will claim instead that, so far, neither this nor any other proposal has provided an explanation of the phenomenon. I will point out several problems these proposals present and hint at how the phenomenon could be explained by making minor adjustments to our semantic theory.
Abstract:
Crystallization is a purification method used to obtain crystalline product of a certain crystal size. It is one of the oldest industrial unit processes and is commonly used in modern industry because of its good purification capability from rather impure solutions at reasonably low energy consumption. However, the process is extremely challenging to model and control because it involves inhomogeneous mixing and many simultaneous phenomena, such as nucleation, crystal growth and agglomeration. All these phenomena depend on supersaturation, i.e. the difference between the actual liquid-phase concentration and the solubility. Homogeneous mass and heat transfer in the crystallizer would greatly simplify the modelling and control of crystallization processes; such conditions are, however, not the reality, especially in industrial-scale processes. Consequently, the hydrodynamics of crystallizers, i.e. the combination of mixing, feed and product removal flows, and recycling of the suspension, needs to be thoroughly investigated. Understanding hydrodynamics is important in crystallization, especially in larger-scale equipment where uniform flow conditions are difficult to attain. It is also important to understand the different size scales of mixing: micro-, meso- and macromixing. Fast processes, like nucleation and chemical reactions, typically depend strongly on micro- and mesomixing, but macromixing, which equalizes the concentrations of all species within the entire crystallizer, cannot be disregarded. This study investigates the influence of hydrodynamics on crystallization processes. Modelling of crystallizers with the mixed suspension mixed product removal (MSMPR) theory (ideal mixing), computational fluid dynamics (CFD), and a compartmental multiblock model is compared, and the importance of proper verification of the CFD and multiblock models is demonstrated. In addition, the influence of different hydrodynamic conditions on the control of a reactive crystallization process is studied. Finally, the effect of extreme local supersaturation is studied using power ultrasound to initiate nucleation. The present work shows that mixing and chemical feeding conditions clearly affect induction time and cluster formation, nucleation, growth kinetics, and agglomeration. Consequently, the properties of crystalline end products, e.g. crystal size and crystal habit, can be influenced by managing the mixing and feeding conditions. Impurities may have varying impacts on crystallization processes. As an example, manganese ions were shown to replace magnesium ions in the crystal lattice of magnesium sulphate heptahydrate, increasing the crystal growth rate significantly, whereas sodium ions showed no interaction at all. Modelling of continuous crystallization based on MSMPR theory showed that the model is feasible in a small laboratory-scale crystallizer, whereas in larger pilot- and industrial-scale crystallizers hydrodynamic effects should be taken into account. For that reason, CFD and multiblock modelling are shown to be effective tools for modelling crystallization with inhomogeneous mixing. The present work also shows that the selection of the measurement point, or points in the case of multiprobe systems, is crucial when process analytical technology (PAT) is used to control larger-scale crystallization. The thesis concludes by describing how control of local supersaturation by highly localized ultrasound was successfully applied to induce nucleation and to control polymorphism in the reactive crystallization of L-glutamic acid.
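For reference, the supersaturation that drives all of these phenomena can be written in standard textbook form, with c the actual liquid-phase concentration and c*(T) the solubility at temperature T (general background, not a formula quoted from the thesis):

    \[ \Delta c = c - c^{*}(T), \qquad S = \frac{c}{c^{*}(T)}, \qquad \sigma = S - 1 \]

where Δc is the absolute supersaturation the abstract refers to, and S and σ are the commonly used supersaturation ratio and relative supersaturation; nucleation and growth rates are typically modelled as power-law functions of these quantities.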