965 results for Speaker verification
Abstract:
To enable a mathematically and physically sound execution of fatigue tests and a correct interpretation of their results, statistical evaluation methods are used to assist in the analysis of fatigue testing data. The main objective of this work is to develop step-by-step instructions for the statistical analysis of laboratory fatigue data. The scope of this project is to provide practical cases that answer the various questions raised in the treatment of test data, applying the methods and formulae of the document IIW-XIII-2138-06 (Best Practice Guide on the Statistical Analysis of Fatigue Data). Generally, the questions in the data sheets involve several aspects: estimation of the necessary sample size, verification of the statistical equivalence of the collated data sets, and determination of characteristic curves in different cases. The series of comprehensive examples given in this thesis demonstrates the various statistical methods and supports a sound procedure for creating reliable calculation rules for fatigue analysis.
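As an illustration of how a characteristic curve of this kind might be derived, the sketch below fits a linear S-N regression in log-log space and shifts it down by k standard deviations of the residuals. The data points, the value k = 2, and the function name are hypothetical placeholders for illustration only; the IIW guide derives k from the sample size and the required survival probability.

```python
import numpy as np

# Hypothetical fatigue test data: stress ranges (MPa) and cycles to failure.
stress = np.array([200.0, 180.0, 160.0, 140.0, 120.0, 100.0])
cycles = np.array([1.2e5, 2.0e5, 3.9e5, 7.5e5, 1.6e6, 3.5e6])

# Fit log N = a + b * log S (the usual S-N regression in log-log space).
x = np.log10(stress)
y = np.log10(cycles)
b, a = np.polyfit(x, y, 1)  # slope first, then intercept

# Scatter of the data about the mean curve (two fitted parameters -> ddof=2).
resid = y - (a + b * x)
s = resid.std(ddof=2)

# Characteristic curve: mean curve shifted down by k standard deviations.
k = 2.0  # placeholder; in practice k depends on sample size and survival level

def characteristic_cycles(stress_range):
    """Allowable cycles at the characteristic (lower-bound) curve."""
    return 10 ** (a + b * np.log10(stress_range) - k * s)
```

Because the characteristic curve lies below the mean fit, `characteristic_cycles` always returns fewer cycles than the regression itself predicts for the same stress range.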
Abstract:
Intensity Modulated RadioTherapy (IMRT) is a treatment technique that uses beams whose radiation fluence is modulated. IMRT is now widespread in industrialized countries because it achieves better dose homogeneity inside the target volume and lowers the dose to organs at risk in complex clinical cases. A common way to carry out beam modulation is to sum smaller beams (segments, or beamlets) with the same incidence; this technique is called step-and-shoot IMRT. In a clinical context, it is necessary to verify treatment plans before the first irradiation, and plan verification remains an open issue for this technique. An independent monitor unit calculation (representative of the weight of each segment) cannot be performed for step-and-shoot IMRT, because the segment weights are not known a priori but are computed during inverse planning. Moreover, verifying treatment plans by comparison with measured data is time consuming and is performed in a simplified geometry, usually a cubic water phantom with all machine angles set to zero, rather than in the exact treatment geometry. In this work, an independent monitor unit calculation method for step-and-shoot IMRT is described. The method is based on the Monte Carlo code EGSnrc/BEAMnrc; the Monte Carlo model of the linear accelerator head was validated by comparing simulated and measured dose distributions over a wide range of situations. The segments of an IMRT treatment plan are simulated individually by Monte Carlo in the exact geometry of the treatment, and their dose distributions are then converted to absorbed dose to water per monitor unit. The total treatment dose in each volume element of the patient (voxel) can be expressed as a linear matrix equation of the monitor units and the dose per monitor unit of each segment. This equation is solved by matrix inversion using the Non-Negative Least Squares (NNLS) algorithm. Because computational limitations prevent all voxels inside the patient volume from being used in the calculation, several voxel-selection strategies were tested; the best choice is to use the voxels contained in the Planning Target Volume (PTV). The method was tested on eight clinical cases representative of usual radiotherapy treatments. The monitor units obtained lead to global dose distributions that are clinically equivalent to those of the treatment planning system. This independent monitor unit calculation method for step-and-shoot IMRT is therefore validated for clinical use. By analogy, a similar method could be envisaged for other treatment modalities, such as tomotherapy or volumetric modulated arc therapy.
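The matrix formulation above can be illustrated with a small numerical sketch. The dose-per-monitor-unit matrix, the prescription vector, and the projected-gradient solver below are illustrative stand-ins; a production implementation would typically call an established routine such as scipy.optimize.nnls on the real beamlet dose matrix.

```python
import numpy as np

def nnls_projected_gradient(A, b, iters=20000):
    """Solve min ||A x - b||  subject to  x >= 0  by projected gradient descent.

    A: (voxels x segments) dose per monitor unit,
    b: prescribed dose per voxel,
    x: monitor units per segment (physically non-negative)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        # Gradient step on the least-squares objective, then project onto x >= 0.
        x = np.maximum(0.0, x - step * (A.T @ (A @ x - b)))
    return x

# Toy example: 3 voxels, 2 segments (values are illustrative, not clinical data).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
mu = nnls_projected_gradient(A, b)
```

For this toy system the exact non-negative solution is [1, 2], and the projected-gradient iteration converges to it because the problem is convex.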
Abstract:
The aim of this Master's thesis is to examine the role of the authorized auditor as an expert, and the auditor's responsibilities and tasks, in the due diligence (DD) investigation performed in connection with a corporate acquisition. In addition, the thesis examines the regulation that indirectly applies to DD, the different sub-areas of the investigation, reporting, and the significance of the investigation process as part of an acquisition. The research method is concept-analytical, and authorized auditors who perform DD engagements were interviewed for the study. The goal of DD is to obtain as comprehensive a picture as possible of the acquisition target, its business, and its risks and opportunities. Financial DD focuses on the financial statements, profitability, and earnings capacity. As the number and monetary value of corporate acquisitions have grown in recent years, due diligence has become an inseparable part of the acquisition process. DD is examined from the perspective of the buyer, in particular a private equity investor, in an auction process, where the buyer does not always know the industry and business of the target company. In such cases the role of the auditor acting as an expert is particularly pronounced.
Abstract:
The building industry has a particular interest in using clinching as a joining method for frame constructions of light-frame housing. Normally, many clinch joints are required when joining frames. In order to maximise the strength of the complete assembly, each clinch joint must be as sound as possible. Experimental testing is the main means of optimising a particular clinch joint; this includes shear strength testing and visual observation of joint cross-sections. The manufacturers of clinching equipment normally perform such experimental trials. Finite element analysis can also be used to optimise the tool geometry and the process parameter X, which represents the thickness of the base of the joint. However, such procedures require dedicated software, a skilled operator, and test specimens to verify the finite element model. In addition, with current technology several hours of computing time may be necessary. The objective of the study was to develop a simple calculation procedure for rapidly establishing an optimum value of the parameter X for a given tool combination. It should be possible to use the procedure on a daily basis, without stringent demands on the skill of the operator or on the equipment. It is also desirable that the procedure significantly decrease the number of shear strength tests required for verification. The experimental work involved tests to gain an understanding of the behaviour of the sheets during clinching. The most notable observation concerned the stage of the process in which the upper sheet was initially bent, after which the deformation mechanism changed to shearing and elongation. The amount of deformation was measured relative to the original location of the upper sheet and characterised as the C-measure. By understanding the behaviour of the upper sheet in detail, it was possible to estimate a bending line function for the surface of the upper sheet.
A procedure was developed that makes it possible to estimate the process parameter X for each tool combination with a fixed die. The procedure is based on equating the volume of material on the punch side with the volume of the die; detailed information on the behaviour of the material on the punch side is required, and the volume of the die is assumed not to change during the process. The procedure was applied to shear strength testing of a sample material: continuously hot-dip zinc-coated high-strength constructional steel with a nominal thickness of 1.0 mm and a minimum Rp0.2 proof stress of 637 N/mm². Such material has not yet been used extensively in light-frame housing, and little has been published on clinching it, so its performance is of particular interest. Companies that use clinching on a daily basis stand to gain the greatest benefit from the procedure. By understanding the behaviour of sheets in different cases, it is possible to use data at an early stage for adjusting and optimising the process. In particular, the usefulness of common tools can be increased, since it is possible to characterise the complete range of existing tools. The study increases and broadens the amount of basic information concerning the clinching process, and new approaches and points of view are presented and used for generating new knowledge.
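The volume-equating idea can be sketched numerically: if the punch-side material volume (which depends on the bottom thickness X) and the die cavity volume can both be modelled, X follows from balancing the two. Every geometry, dimension, and function below is a hypothetical placeholder for whatever volume models an actual tool combination requires; only the balancing-by-bisection structure is the point.

```python
import math

# Hypothetical die cavity modelled as a cylinder (dimensions in mm).
DIE_DIAMETER = 8.0
DIE_DEPTH = 1.4
V_DIE = math.pi * (DIE_DIAMETER / 2) ** 2 * DIE_DEPTH  # assumed constant during forming

def punch_side_volume(x, punch_diameter=5.0, sheet_thickness=2.0):
    """Invented model of the material volume displaced on the punch side,
    decreasing as the joint bottom thickness x grows."""
    return math.pi * (punch_diameter / 2) ** 2 * (2 * sheet_thickness - x)

def solve_x(lo=0.1, hi=2.0, tol=1e-6):
    """Find X such that punch-side volume equals die volume, by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        # punch_side_volume decreases in x, so a surplus means x must grow.
        if punch_side_volume(mid) > V_DIE:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

Because both volume functions are cheap closed-form expressions, such a balance can be evaluated instantly for every tool combination, which is the practical advantage over hours of finite element computation.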
Abstract:
This research deals with the dynamic modeling of gas-lubricated tilting pad journal bearings provided with spring-supported pads, including experimental verification of the computation. On the basis of a mathematical model of a film bearing, a computer program has been developed that can be used to simulate a special type of tilting pad gas journal bearing, supported by a rotary spring, under different loading conditions as a function of time (transient running conditions due to externally imposed geometry variations in time). On the basis of the literature, different transformations have been used in the model to simplify the calculation. The numerical simulation is used to solve the non-stationary case of a gas film. The simulation results were compared with literature results for the stationary case (steady running conditions) and were found to agree. In addition, comparisons were made with a number of stationary and non-stationary bearing tests performed at Lappeenranta University of Technology using bearings designed with the simulation program. A study was also made, using numerical simulation and the literature, of the influence of the different bearing parameters on the stability of the bearing. Comparisons were made with the literature on tilting pad gas bearings; this bearing type is rarely used, and only one literature reference has studied the same bearing type as that used at LUT. A new design of tilting pad gas bearing is introduced. It is based on a stainless steel body and electron beam welding of the bearing parts. It has good operating characteristics, is easier to tune and faster to manufacture than traditional constructions, and is also suitable for large-scale serial production.
Abstract:
High dynamic performance of an electric motor is a fundamental prerequisite in motion control applications, also known as servo drives. Recent developments in the fields of microprocessors and power electronics have enabled ever faster movements with an electric motor. In such a dynamically demanding application, the dimensioning of the motor differs substantially from industrial motor design, where the desirable characteristics are, for example, high efficiency, a high power factor, and a low price. In motion control, instead, characteristics such as high overloading capability, high-speed operation, high torque density and low inertia are required. The thesis investigates how the dimensioning of a high-performance servomotor differs from the dimensioning of industrial motors. The two most common servomotor types are examined: an induction motor and a permanent magnet synchronous motor. The suitability of these two motor types in dynamically demanding servo applications is assessed, and the design aspects that optimize the servo characteristics of the motors are analyzed. Operating characteristics of a high-performance motor are studied, and some methods for improvement are suggested. The main focus is on the induction machine, which is frequently compared with the permanent magnet synchronous motor. A 4 kW prototype induction motor was designed and manufactured for verification of the simulation results under laboratory conditions. A dynamic simulation model for estimating the thermal behaviour of the induction motor in servo applications was also constructed. The accuracy of the model was improved by coupling it with the electromagnetic motor model in order to take into account the variations in the motor's electromagnetic characteristics due to the temperature rise.
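A minimal sketch of the kind of estimate such a coupled thermal model produces, assuming a single lumped thermal mass: the loss, resistance, and capacitance values below are invented for illustration, and the coupling back to the electromagnetic model is reduced to its simplest effect, copper losses rising with winding temperature.

```python
# First-order lumped-parameter thermal model of a motor winding (illustrative values).
P_LOSS_20C = 200.0   # copper loss at the 20 degC reference (W), hypothetical
R_TH = 0.15          # thermal resistance winding -> ambient (K/W), hypothetical
C_TH = 3000.0        # thermal capacitance (J/K), hypothetical
T_AMB = 20.0         # ambient temperature (degC)
ALPHA_CU = 0.0039    # temperature coefficient of copper resistivity (1/K)

def simulate(t_end, dt=1.0):
    """Euler integration of  C_th * dT/dt = P_loss(T) - (T - T_amb) / R_th."""
    temp = T_AMB
    for _ in range(int(t_end / dt)):
        # Copper losses grow with winding resistance, which grows with temperature:
        # this is the simplest form of electromagnetic-thermal coupling.
        p_loss = P_LOSS_20C * (1.0 + ALPHA_CU * (temp - 20.0))
        temp += dt * (p_loss - (temp - T_AMB) / R_TH) / C_TH
    return temp
```

With these numbers the winding heats monotonically toward a steady state of about 54 degC; because the losses themselves rise with temperature, the steady state sits above the naive P·R_th estimate, which is exactly the effect the coupled model is meant to capture.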
Abstract:
The growing number of non-native speakers of English in the world forms the basis of recent claims about the need to emphasize the role of English as a lingua franca in English classes. This article investigates the views of Catalan teachers of English on this question by means of a questionnaire focused on aspects such as the preference for native or non-native teachers, the cultural knowledge appropriate for English teachers, and the variety of English chosen. The results indicate that teachers are influenced by the supremacy of the native speaker and by a view of English still restricted to native-speaker communities, with some significant differences found between different groups of teachers.
Dosimetric comparison of different treatment modalities for stereotactic radiosurgery of meningioma.
Abstract:
BACKGROUND: The objective of this study was to compare the three most prominent systems for stereotactic radiosurgery in terms of dosimetric characteristics: the Cyberknife system, the Gamma Knife Perfexion and the Novalis system. METHODS: Ten patients treated for recurrent grade I meningioma after surgery using the Cyberknife system were identified; the Cyberknife contours were exported and comparative treatment plans were generated for the Novalis system and Gamma Knife Perfexion. Dosimetric values were compared with respect to coverage, conformity index (CI), gradient index (GI) and beam-on time (BOT). RESULTS: All three systems showed comparable results in terms of coverage. The Gamma Knife and the Cyberknife system showed significantly higher levels of conformity than the Novalis system (Cyberknife vs Novalis, p = 0.002; Gamma Knife vs Novalis, p = 0.002). The Gamma Knife showed significantly steeper gradients compared with the Novalis and the Cyberknife system (Gamma Knife vs Novalis, p = 0.014; Gamma Knife vs Cyberknife, p = 0.002) and significantly longer beam-on times than the other two systems (BOT = 66 ± 21.3 min, Gamma Knife vs Novalis, p = 0.002; Gamma Knife vs Cyberknife, p = 0.002). CONCLUSIONS: The multiple focal entry systems (Gamma Knife and Cyberknife) achieve higher conformity than the Novalis system. The Gamma Knife delivers the steepest dose gradient of all examined systems. However, the Gamma Knife is known to require long beam-on times, and despite worse dose gradients, LINAC-based systems (Novalis and Cyberknife) offer image verification at the time of treatment delivery.
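For reference, the two indices compared above are commonly computed from plan volumes as follows. The sketch uses the Paddick definitions (conformity: CI = TV_PIV² / (TV × PIV); gradient: GI = half-prescription isodose volume / prescription isodose volume) with invented volumes, since the study's actual plan data are not reproduced here.

```python
def paddick_conformity_index(tv, piv, tv_piv):
    """Paddick CI = TV_PIV^2 / (TV * PIV).

    tv:     target volume (cm^3)
    piv:    prescription isodose volume (cm^3)
    tv_piv: target volume covered by the prescription isodose (cm^3)"""
    return tv_piv ** 2 / (tv * piv)

def gradient_index(piv_half, piv):
    """GI = volume of the half-prescription isodose / prescription isodose volume."""
    return piv_half / piv

# Invented example volumes (cm^3) for a small meningioma plan.
tv, piv, tv_piv, piv_half = 4.0, 4.4, 3.9, 12.1
ci = paddick_conformity_index(tv, piv, tv_piv)  # closer to 1 means more conformal
gi = gradient_index(piv_half, piv)              # smaller means a steeper dose fall-off
```

A perfectly conformal plan would give CI = 1; steeper-gradient systems such as the Gamma Knife are characterised by smaller GI values, at the cost noted above of longer beam-on times.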
Abstract:
Aim of study: To identify species of wood samples based on common names and anatomical analyses of their transversal surfaces (without microscopic preparations). Area of study: Spain and South America. Material and methods: The test was carried out on a batch of 15 lumber samples deposited in the Royal Botanical Garden in Madrid, from the expedition by Ruiz and Pavon (1777-1811). The first stage of the methodology is to search and critically analyse the databases that list common nomenclature alongside scientific nomenclature. A geographic filter was then applied to the resulting information for the samples with a more restricted distribution. Finally, an anatomical verification was carried out with a pocket microscope with ×40 magnification, equipped with a 50-micrometre resolution scale. Main results: Identification of the wood based exclusively on the common name is not useful due to the high number of alternative possibilities (14 for “naranjo”, 10 for “ébano”, etc.). The common name of one of the samples (“huachapelí mulato”) enabled the geographic origin of the samples to be accurately located to the shipyard area in Guayaquil (Ecuador). Given that Ruiz and Pavon did not travel to Ecuador, the specimens must have been obtained by Tafalla. It was possible to determine 67% of the lumber samples from the batch correctly; in 17% of the cases the methodology did not provide a reliable identification. Research highlights: It was possible to determine 67% of the lumber samples from the batch correctly, together with their geographic provenance. Identification of the wood based exclusively on the common name is not useful.
Abstract:
After incidentally learning about a hidden regularity, participants can either continue to solve the task as instructed or, alternatively, apply a shortcut. Past research suggests that the amount of conflict implied by adopting a shortcut seems to bias the decision for vs. against continuing instruction-coherent task processing. We explored whether this decision might transfer from one incidental learning task to the next. Theories that conceptualize strategy change in incidental learning as a learning-plus-decision phenomenon suggest that high demands to adhere to instruction-coherent task processing in Task 1 will impede shortcut usage in Task 2, whereas low control demands will foster it. We sequentially applied two established incidental learning tasks differing in stimuli, responses and hidden regularity (the alphabet verification task followed by the serial reaction task, SRT). While some participants experienced a complete redundancy in the task material of the alphabet verification task (low demands to adhere to instructions), for others the redundancy was only partial. Thus, shortcut application would have led to errors (high demands to follow instructions). The low control demand condition showed the strongest usage of the fixed and repeating sequence of responses in the SRT. The transfer results are in line with the learning-plus-decision view of strategy change in incidental learning, rather than with resource theories of self-control.
Abstract:
In this paper we use a Terahertz (THz) time-domain system to image and analyze the structure of an artwork attributed to the Spanish artist Goya and painted in 1771. The THz images show features that cannot be seen with optical inspection and complement data obtained with X-ray imaging, providing evidence of the work's authenticity, which is validated by other independent studies. For instance, a feature strongly resembling one of Goya's known signatures is seen in the THz images. In particular, this paper demonstrates the potential of THz imaging as a complementary technique alongside X-ray for the verification and authentication of artworks through the detection of features that remain hidden to optical inspection.
Abstract:
The networking and digitalization of audio equipment has created a need for control protocols. These protocols offer new services to customers and ensure that the equipment operates correctly. The control protocols used in computer networks are not directly applicable, since embedded systems have resource and cost limitations. In this master's thesis the design and implementation of new loudspeaker control network protocols are presented. The protocol stack was required to be reliable, to have short response times, to configure the network automatically, and to support the dynamic addition and removal of loudspeakers. The implemented protocol stack was also required to be as efficient and lightweight as possible, because the network nodes are fairly simple and lack processing power. The protocol stack was thoroughly tested, validated and verified. The protocols were formally described using LOTOS (Language of Temporal Ordering Specifications) and verified using reachability analysis. A prototype of the loudspeaker network was built and used for testing the operation and performance of the control protocols. The implemented control protocol stack met the design specifications and proved to be highly reliable and efficient.
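Reachability analysis of the kind used to verify the protocols can be sketched as an exhaustive search of the protocol's state graph. The tiny handshake state machine below is an invented stand-in for the actual LOTOS specification; the check simply asserts that every reachable state has at least one outgoing transition, i.e. that no reachable state is a deadlock.

```python
from collections import deque

# Hypothetical loudspeaker-control handshake, as a labelled transition system:
# state -> {event: next_state}.
TRANSITIONS = {
    "idle":        {"announce": "configuring"},
    "configuring": {"ack": "ready", "timeout": "idle"},
    "ready":       {"set_volume": "ready", "remove": "idle"},
}

def reachable_states(start):
    """Breadth-first exploration of every state reachable from `start`."""
    seen = {start}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        for next_state in TRANSITIONS.get(state, {}).values():
            if next_state not in seen:
                seen.add(next_state)
                queue.append(next_state)
    return seen

def deadlock_free(start):
    """A reachable state with no outgoing transitions would be a deadlock."""
    return all(TRANSITIONS.get(state) for state in reachable_states(start))
```

Real protocol verifiers explore the product of all peers' state machines plus message channels, so the state space is far larger, but the exploration principle is the same as in this sketch.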
Abstract:
In this article, I address epistemological questions regarding the status of linguistic rules and the pervasive--though seldom discussed--tension that arises between theory-driven object perception by linguists on the one hand, and ordinary speakers' possible intuitive knowledge on the other hand. Several issues will be discussed using examples from French verb morphology, based on the 6500 verbs from Le Petit Robert dictionary (2013).
Abstract:
The aim of this Master's thesis was to evaluate the new Paper Profile environmental product declaration for the pulp, paper and board industry. Paper Profile is intended to offer the paper industry's customers and other interested stakeholders uniform environmental information on the composition of paper products and on their most important environmental parameters. The main objective was to assess Paper Profile critically, compare the concept with the product declaration of ISO, the International Organization for Standardization, and identify the similarities and differences between the two. The key task was to identify the factors that would allow the Paper Profile declaration to be considered consistent with the ISO/TR 14025 technical report. In addition, possible development needs of the Paper Profile declaration were assessed from the perspective of the ISO declaration. The second objective was to collect and analyse customer feedback on the new declaration and compare it with the comments on Paper Profile given by the environmental managers of Stora Enso's mills. The work showed that the Paper Profile concept as such is not very informative: the declaration leaves many environmental questions open. Nevertheless, Paper Profile provides a sufficient basis for comparing the environmental loads of different paper products. The communicative value of the concept would improve considerably if third-party verification were added to the declaration, and reference values would give customers a better sense of the background of the reported parameters. Both the environmental managers of Stora Enso's mills and the customers considered Paper Profile, on the whole, a potential tool for environmental communication, although some minor changes to the concept were proposed. Open forest certification questions, and their inadequate coverage in the declaration, were raised both by mill personnel and by business customers.
Abstract:
Requirements engineering is an extremely important part of building new software. It is not merely the compilation of a requirements specification document at the start of a software project; it covers the elicitation, management and verification of requirements throughout the software's life cycle. In a software services company the significance of requirements engineering is even more pronounced, and such a company must have a working requirements engineering process. This thesis presents the theory of requirements engineering, quality control related to processes, and models for process assessment and improvement. The requirements engineering practices of two software services companies of different types are examined, and observations on their process models are presented. As a result, conclusions are drawn about requirements engineering, its related processes, and the associated quality control.