661 results for transforms
Abstract:
This paper proposes a new method for blindly inverting a nonlinear mapping that transforms a sum of random variables, which is the case in post-nonlinear (PNL) source separation mixtures. The importance of the method lies in the fact that it allows the estimation of the nonlinear part to be decoupled from the estimation of the linear one. Only the nonlinear part is inverted, without considering the linear part. The initial problem is thus transformed into a linear one that can then be solved with any convenient linear algorithm. The method is compared with other existing algorithms for blindly approximating nonlinear mappings. Experiments show that the proposed algorithm outperforms the other algorithms and yields reasonably well linearized data.
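As a rough, hedged illustration of the post-nonlinear setting described above (a sketch under the assumption that the componentwise nonlinearity is known, not the paper's blind estimation procedure), the following Python snippet shows why inverting only the nonlinear part reduces the problem to an ordinary linear mixture; the cubic nonlinearity, mixing matrix and Laplacian sources are invented for illustration.

# Illustrative sketch of a post-nonlinear (PNL) mixture, not the paper's method.
# Assumption: the componentwise nonlinearity f (a cubic here) and its inverse
# are taken as known, purely to show that inverting f alone turns the problem
# back into an ordinary linear mixture.
import numpy as np

rng = np.random.default_rng(0)
s = rng.laplace(size=(2, 10_000))          # independent non-Gaussian sources

A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                 # unknown linear mixing matrix
x = (A @ s) ** 3                           # observed PNL mixture x = f(A s)

# Step 1 (the part the paper estimates blindly): invert the nonlinearity only.
y = np.cbrt(x)                             # y = A s, a purely linear mixture

# Step 2: any convenient linear BSS/ICA algorithm (e.g. FastICA) can now
# recover the sources from y; that standard step is omitted here.
print(np.allclose(y, A @ s))               # True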
Abstract:
The global business environment is in transition. New technologies are changing the operating environment, and the rules of the economy are changing rapidly. New business models are needed. The aim of the study was to analyse the current state of the information and communication technology industry (the ICT industry) from a strategic and competitive-analysis perspective, and to form a picture of the ICT industry and its major players in Europe and the USA. The study analysed five large ICT companies. The study was both qualitative and quantitative in nature. The companies were analysed using numerical and qualitative material. The study was based on literature, articles, research reports, the companies' websites and annual reports. As a result of the study, both similarities and differences could be identified between the companies' business models and their financial performance.
Abstract:
In Switzerland, the majority of students are oriented towards professional training after compulsory schooling. At this stage, one of the biggest challenges for them is to find an apprenticeship position. Matching supply and demand is a complex process that not only excludes some students from having direct access to professional training but also forces them to make early choices regarding their future sector of employment. So, how does one find an apprenticeship? And what do the students' descriptions of their search for apprenticeships reveal about the institutional determinants of social inequalities at play in the system? Based on 29 interviews conducted in 2014 with 23 apprentices and 6 recruiters in the Canton of Vaud, this article interrogates how the dimensions of educational and social trajectories combine to affect access to apprenticeships and are accentuated by recruiters using a "hidden curriculum" during the recruitment process. A hidden curriculum consists of knowledge and skills not taught by the educational institution but which appear decisive in obtaining an apprenticeship. By analysing the contrasting experiences of students in their search for an apprenticeship, we identify four types of trajectories that explain different types of school-to-apprenticeship transitions. We show how these determinants are reinforced by the "hidden curriculum" of recruitment based on the soft skills of feeling, autonomy, anticipation and reflexivity that are assessed in the context of recruitment interactions. The discussion section debates how the criteria that appear to be used to identify the "right apprentice" tend to (re)produce inequalities between students. This not only depends on their academic results but also on their social and cultural skills, their ability to anticipate their choices and, more widely, their ability to be a subject in their recruitment search. "The Subject is neither the individual, nor the self, but the work through which an individual transforms into an actor, meaning an agent able to transform his/her situation instead of reproducing it." (Touraine, 1992, p.476).
Abstract:
In this paper we focus our attention on a particle that follows a unidirectional quantum walk, an alternative version of the currently widespread discrete-time quantum walk on a line. Here the walker at each time step can either remain in place or move in a fixed direction, e.g., rightward or upward. While both formulations are essentially equivalent, the present approach leads us to consider discrete Fourier transforms, which eventually yields explicit expressions for the wave functions in terms of finite sums and allows the use of efficient algorithms based on the fast Fourier transform. The wave functions obtained here govern the probability of finding the particle at any given location and also determine the exit-time probability of the walker from a fixed interval, which is analyzed as well.
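The following minimal sketch illustrates, in the same spirit, how a translation-invariant "stay or move right" walk can be evolved with the fast Fourier transform; the Hadamard coin, the periodic lattice of 256 sites and the initial state are assumptions, not the paper's specific choices.

# Minimal sketch of a unidirectional discrete-time quantum walk evolved via
# the FFT; coin, lattice size and initial state are assumed for illustration.
import numpy as np

N, steps = 256, 60                            # lattice sites, time steps
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # coin operator (assumed Hadamard)

# Initial state: walker at site 0, superposition of the two coin states.
psi = np.zeros((2, N), dtype=complex)
psi[:, 0] = np.array([1, 1j]) / np.sqrt(2)

# Momentum representation: "stay or move right" is diagonal in k on a ring.
k = 2 * np.pi * np.fft.fftfreq(N)
psi_k = np.fft.fft(psi, axis=1)
shift = np.stack([np.ones_like(k), np.exp(-1j * k)])   # stay / move right

for _ in range(steps):
    psi_k = H @ psi_k            # apply the coin to each momentum component
    psi_k = shift * psi_k        # conditional rightward shift, diagonal in k

psi = np.fft.ifft(psi_k, axis=1)
prob = np.sum(np.abs(psi) ** 2, axis=0)       # position distribution
print(prob.sum())                             # ≈ 1.0 (unitarity check)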
Abstract:
"How old is this fingermark?" This question is raised relatively often in trials, when suspects admit that they have left their fingermarks on a crime scene but allege that the contact occurred at a time different from that of the crime and for legitimate reasons. However, no answer can currently be given to this question, because no fingermark dating methodology has been validated and accepted by the whole forensic community. Nevertheless, a review of past American cases highlighted that experts actually give testimony in court about the age of fingermarks, even though it is mostly based on subjective and poorly documented parameters.
It was relatively easy to access fully described American cases, thus explaining the origin of the given examples. However, fingermark dating issues are encountered worldwide, and the lack of consensus among the given answers highlights the necessity of conducting research on the subject. The present work thus aims at studying the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, different attempts have already been described in the literature. This research proposes a critical review of these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in actual practice. Nevertheless, some approaches based on the evolution over time of intrinsic compounds detected in fingermark residue appear to be promising. Thus, an exhaustive review of the literature was conducted in order to identify the compounds available in fingermark residue and the analytical techniques capable of analysing them. It was decided to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted in order to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of the fingermark residue was significantly lower than the inter-variability, but that both kinds of variability could be reduced using different statistical pre-treatments inspired by the drug profiling field. It was also possible to propose an objective donor classification model allowing donors to be grouped into two main classes based on the initial lipid composition of their fingermarks. These classes correspond to what is rather subjectively called "good" or "bad" donors. The potential of such a model is high for the fingermark research field, as it allows the selection of representative donors based on compounds of interest. Using GC/MS and FTIR, an in-depth study of the effects of different influence factors on the initial composition and aging of target lipid molecules found in fingermark residue was conducted. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing), but that some influence factors affected these models more than others. In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other tested factors (deposition moment, pressure, temperature and illumination) also affected the residue and its aging, but models combining different values of these factors still proved to be robust. Furthermore, test fingermarks were analysed with GC/MS in order to be dated using some of the generated models. It turned out that correct estimations were obtained for 60% of the dated test fingermarks, and up to 100% when the storage conditions were known. These results are interesting, but further research should be conducted to evaluate whether these models could be used under uncontrolled casework conditions.
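A generic, hypothetical sketch of the kind of univariate aging model described above (the aging-parameter values are invented, not the thesis' GC/MS measurements): fit a decay curve to the aging parameter, then invert it to estimate the age of a test mark.

# Generic sketch of a univariate aging model: fit a decay curve to a
# hypothetical lipid aging parameter and invert it to date a test fingermark.
import numpy as np
from scipy.optimize import curve_fit

ages_days = np.array([0, 3, 7, 14, 21, 28], dtype=float)
aging_param = np.array([1.00, 0.82, 0.65, 0.47, 0.36, 0.30])   # invented data

def decay(t, a, k, c):
    return a * np.exp(-k * t) + c

(a, k, c), _ = curve_fit(decay, ages_days, aging_param, p0=(1.0, 0.1, 0.2))

# Date a test fingermark by inverting the fitted curve.
observed = 0.55                                   # hypothetical test value
estimated_age = -np.log((observed - c) / a) / k
print(f"estimated age ≈ {estimated_age:.1f} days")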
From a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) in order to gain information about fingermark composition and aging. More precisely, its ability to highlight influence factors and aging effects over large fingermark areas was investigated. This information was then compared with that given by individual FTIR spectra. It was concluded that while FTIR-CI is a powerful tool, its use to study natural fingermark residue for forensic purposes has to be carefully considered. In fact, in this study, the technique did not yield more information on residue distribution than traditional FTIR spectra and also suffered from major drawbacks, such as long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed the proposition and discussion of a formal and pragmatic framework for approaching fingermark dating questions. This framework identifies which type of information scientists are currently able to bring to investigators and/or the courts. Furthermore, it also describes the different iterative development steps that research should follow in order to achieve the validation of an objective fingermark dating methodology whose capacities and limits are well known and properly documented.
Abstract:
Reporting is an integral part of a company's day-to-day operations. The content and form of reporting vary by organisational level, from monitoring daily operations to monthly financial reporting. Reporting can be implemented through operational systems or, as an increasingly popular alternative today, through centralised reporting. The procurement project for a new reporting system is often an investment that concerns the entire company. If the reporting system is intended to serve both operational activities and management needs, it must adapt to many purposes. At the outset, it is important to define the information needs and the objectives of the project, without forgetting risk and project management as well as investment calculations. If reporting also extends outside the company, possible regulations and information security aspects must be taken into account. The company's practices and processes should also be examined critically before acquiring the system, as this may reveal new reporting targets, or practices can be reorganised to achieve the best way of working. When acquiring a reporting system, companies often turn to an external software vendor, which integrates and tailors the system to the company's own needs. The procurement project does not end at deployment; from the start of the project, the by far longest phase of the system's life cycle, namely use and maintenance, must also be taken into account. A reporting system, like most information systems, is never finished, because needs and practices change over time and users' awareness increases.
Abstract:
WonderAlp is a cabinet of curiosities built from documentation drawn from early travel books about the Alps. The app presents images of dragons, fossils, crystals, plants, animals and natural phenomena, and offers keys to rediscovering and understanding the wonder at nature that drove science in the age of curiosity (16th-18th centuries). WonderAlp transforms your iPad or Android device into an Early Modern Wunderkammer. It displays objects discovered in the Alps during the early period of exploration, grouped under three titles: "Dragons of the Alps", "Fossils and Crystals", "Plants to Landscapes". It helps the user apprehend a natural world that is both rational and wonderful, scholarly and popular, unlike the compartmentalized thinking of modern life.
Abstract:
Existing research on sport organisations is imprecise in its use of the concept of 'professionalisation'. Furthermore, we do not know whether analytical concepts of professionalisation correspond with the understanding in practice. This study explores the perceptions of practitioners and proposes a framework for analysing professionalisation in national sport federations. Expert interviews were conducted with six key people from Swiss national sport federations and then analysed for characteristics of professionalisation using a hermeneutic approach. The characteristics were divided into three areas: (1) a changed management philosophy, (2) functional differentiation and specialisation, and (3) the application of management tools. However, professionalisation is primarily perceived to be a matter of 'professional' attitude that transforms into federation culture. The practitioners disclose an ambivalent view of professionalisation, e.g. business-like culture vs. voluntarism, for-profit vs. non-profit orientation, autonomy vs. control. A framework is developed that synthesises analytical concepts and practitioners' perceptions to support future comprehensive research into the causes, forms and consequences of professionalisation in national sport federations.
Abstract:
In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers the reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact way to obtain surface topography is to apply photometric stereo to estimate surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension is evaluated in the estimation of surface roughness from gray-scale images and topographies. The results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, the skewness and variance of the brightness distribution, the fractal dimension, and distance transforms exhibited strong linear correlations with standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. In this thesis, the reconstruction of planar, high-frequency varying surfaces is studied in the presence of imaging noise and blur. Two Wiener filter-based methods are proposed, one of which is optimal in the sense of surface power spectral density given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
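A toy sketch of the stated key idea behind the distance-based measures, namely that distances accumulated along a rough profile exceed those along a smooth one; the synthetic profiles below are assumptions, not the thesis' surface data or its exact distance-transform definition.

# Toy illustration: compare the path length walked along a 1-D height profile
# with its straight-line extent; a rougher profile accumulates more distance.
import numpy as np

def path_length(profile, dx=1.0):
    """Arc length accumulated while walking along the height profile."""
    dz = np.diff(profile)
    return np.sum(np.sqrt(dx ** 2 + dz ** 2))

rng = np.random.default_rng(1)
x = np.arange(512)

smooth = 0.5 * np.sin(2 * np.pi * x / 256)            # gently varying surface
rough = smooth + rng.normal(scale=0.8, size=x.size)   # added high-frequency roughness

print(path_length(smooth))   # close to the straight-line length (511)
print(path_length(rough))    # noticeably larger for the rougher profile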
Comparative study on instrumental signal filtering using Fourier and wavelet transforms
Abstract:
A comparative study of the Fourier transform (FT) and the wavelet transform (WT) for instrumental signal denoising is presented. The basic principles of wavelet theory are described in a succinct and simplified manner. For illustration, the FT and the WT are used to filter UV-VIS and plasma emission spectra, with MATLAB software used for the computations. The results show that FT and WT filters are comparable when the signal does not display sharp peaks (UV-VIS spectra), but the WT yields better filtering when the filling factor of the signal is small (plasma spectra), since it causes little peak distortion.
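A generic illustration of the two filtering strategies being compared, applied to a synthetic noisy peak rather than the paper's spectra; the NumPy/PyWavelets implementation, the db4 wavelet and the universal threshold are choices made here, not taken from the paper (which used MATLAB).

# Sketch of FT low-pass filtering vs. wavelet soft-thresholding on a
# synthetic peak signal; parameters are illustrative choices only.
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1024)
clean = np.exp(-((t - 0.5) / 0.01) ** 2)          # one sharp peak
noisy = clean + rng.normal(scale=0.05, size=t.size)

# Fourier filtering: zero out high-frequency coefficients (simple low-pass).
F = np.fft.rfft(noisy)
F[60:] = 0                                        # cutoff chosen by eye
ft_filtered = np.fft.irfft(F, n=t.size)

# Wavelet filtering: soft-threshold detail coefficients, then reconstruct.
coeffs = pywt.wavedec(noisy, 'db4', level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # noise estimate
thr = sigma * np.sqrt(2 * np.log(t.size))         # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
wt_filtered = pywt.waverec(coeffs, 'db4')[:t.size]

for name, y in [("FT", ft_filtered), ("WT", wt_filtered)]:
    print(name, "RMSE:", np.sqrt(np.mean((y - clean) ** 2)))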
Abstract:
The set of initial conditions for which the pseudoclassical evolution algorithm (and minimality conservation) is verified for Hamiltonians of degree N (N>2) is explicitly determined through a class of restrictions on the corresponding classical trajectories, and it is proved to be at most denumerable. Thus these algorithms are verified if and only if the system is quadratic except for a set of measure zero. The possibility of time-dependent a-equivalence classes is studied and its physical interpretation is presented. The implied equivalence of the pseudoclassical and Ehrenfest algorithms and their relationship with minimality conservation is discussed in detail. Also, the explicit derivation of the general unitary operator which linearly transforms minimum-uncertainty states leads to the derivation, among others, of operators with a general geometrical interpretation in phase space, such as rotations (parity, Fourier).
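For orientation, a standard textbook illustration (not the paper's derivation) of the phase-space rotations mentioned at the end of the abstract: such rotations are generated by the number operator, with parity and the Fourier transform as the half-turn and quarter-turn special cases.

% Standard phase-space rotation generated by the number operator (ħ = m = ω = 1).
\[
  U(\theta) = e^{-i\theta\,\hat a^{\dagger}\hat a}, \qquad
  U^{\dagger}(\theta)\,\hat x\,U(\theta) = \hat x\cos\theta + \hat p\sin\theta, \qquad
  U^{\dagger}(\theta)\,\hat p\,U(\theta) = \hat p\cos\theta - \hat x\sin\theta .
\]
\[
  \theta = \pi \;\Rightarrow\; (\hat x,\hat p)\to(-\hat x,-\hat p)\ \text{(parity)}, \qquad
  \theta = \tfrac{\pi}{2} \;\Rightarrow\; (\hat x,\hat p)\to(\hat p,-\hat x)\ \text{(Fourier transform, up to a phase)} .
\]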
Abstract:
The excitation energy transfer between chlorophylls in the major and minor antenna complexes of photosystem II (PSII) was investigated using quantum Fourier transforms. These transforms play an important role in the efficiency of quantum algorithms on quantum computers. The relation 2^n = N was used to connect the excitation energy transfers with the quantum Fourier transform, where n is the number of qubits required for the simulation of the transfers and N is the number of chlorophylls in the antenna complexes.
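A hypothetical worked instance of the stated relation (the value of N below is assumed for illustration, not taken from the paper):

# Worked instance of 2**n = N with an assumed chlorophyll count N.
import math

N = 14                       # assumed number of chlorophylls in a complex
n = math.ceil(math.log2(N))  # qubits needed; exact when N is a power of two
print(n)                     # 4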
Abstract:
Book review
Abstract:
The collagen structure of isolated and in situ liver granulomas from Swiss Webster mice infected with Schistosoma mansoni was analyzed sequentially and three-dimensionally at different times of infection (early acute, acute, transitional acute-chronic, and chronic phases) by laser scanning confocal microscopy and variable vacuum scanning electron microscopy. The initial granuloma structure is characterized by vascular collagen residues and by anchorage points (or fiber radiation centers), from which collagenous fibers are angularly shed and self-assembled. During the exudative-productive stage, the self-assembly of these fibers minimizes energy and mass through continuous tension and focal compression. The curvature or angles between collagen fibers probably depend on the fibroblastic or myofibroblastic organization of stress fibers. Gradually, the loose, unstable lattice of the exudative-productive stage transforms into a highly packed and stable architecture as a result of progressive compaction. The three-dimensional architecture of granulomas provides increased tissue integrity, efficient distribution of soluble compounds and a haptotactic background for the cells.
Abstract:
The objective of the present study was to perform a spectral analysis of the electrical activity of the left colon of patients with hepatosplenic schistosomiasis. Thirty patients were studied, divided into 2 groups: group A was composed of 14 patients (9 males and 5 females) with hepatosplenic schistosomiasis and group B was composed of 16 female patients without schistosomiasis mansoni. Three pairs of electrodes were implanted in the left colon at the time of surgical treatment. The signals of the electrical activity of the colon were captured after postoperative recovery from ileus and fed into a computer by means of a DATAQ data collection system, which identified and captured frequencies between 0.02 and 10 Hz. Data were recorded, stored and analyzed using the WINDAQ 200 software. For the electrical analysis, the average voltage of the electrical wave in the three electrodes of all patients, expressed in millivolts (mV), was considered, together with the maximum and minimum values, the root mean square (RMS), the skewness, and the results of the fast Fourier transforms. The average RMS of the schistosomiasis mansoni patients was 284.007 mV. During a long period of contraction, the RMS increased in a statistically significant manner from 127.455 mV during a resting period to 748.959 mV in patients with schistosomiasis mansoni. We conclude that there were no statistically significant differences in RMS values between patients with schistosomiasis mansoni and patients without the disease during the rest period or during a long period of contraction.
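A hedged sketch of the two quantities used in the analysis above, the RMS value and an FFT spectrum restricted to the reported 0.02-10 Hz band; the sampling rate and the synthetic slow-wave signal are assumptions, not the study's recordings.

# Sketch of RMS and FFT-spectrum computation on a synthetic slow-wave signal;
# sampling rate and waveform are assumed for illustration.
import numpy as np

fs = 20.0                                   # assumed sampling rate, Hz
t = np.arange(0, 300, 1 / fs)               # 5 minutes of signal
rng = np.random.default_rng(3)
signal_mV = 200 * np.sin(2 * np.pi * 0.05 * t) + 50 * rng.normal(size=t.size)

rms = np.sqrt(np.mean(signal_mV ** 2))      # root mean square, in mV

freqs = np.fft.rfftfreq(t.size, d=1 / fs)   # frequency axis, Hz
amplitude = np.abs(np.fft.rfft(signal_mV)) / t.size

band = (freqs >= 0.02) & (freqs <= 10.0)    # band reported in the study
dominant = freqs[band][np.argmax(amplitude[band])]
print(f"RMS = {rms:.1f} mV, dominant frequency = {dominant:.3f} Hz")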