893 results for wavelet transforms


Relevance: 10.00%

Publisher:

Abstract:

We have constructed a forward modelling code in Matlab capable of handling several commonly used electrical and electromagnetic methods in a 1D environment. We review the implemented electromagnetic field equations for grounded wires, frequency and transient soundings, and present new solutions for the case of a non-magnetic first layer. The CR1Dmod code evaluates the Hankel transforms occurring in the field equations using either the Fast Hankel Transform, based on digital filter theory, or a numerical integration scheme applied between the zeros of the Bessel function. A graphical user interface allows easy construction of 1D models and control of the parameters. Modelling results agree with those of other authors, but the computation time is longer than that of other available codes. Nevertheless, the CR1Dmod routine handles complex resistivities and offers solutions based on the full EM equations as well as the quasi-static approximation. Thus, modelling of effects based on changes in the magnetic permeability and the permittivity is also possible.
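The integration-between-Bessel-zeros scheme mentioned above can be sketched in a few lines. This is an illustration only, not CR1Dmod code: the function name `hankel0`, the Gaussian test function and the number of intervals are assumptions.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import j0, jn_zeros

def hankel0(f, k, n_intervals=40):
    """Order-0 Hankel transform F(k) = int_0^inf r f(r) J0(k r) dr,
    integrated piecewise between consecutive zeros of J0(k r),
    where the oscillating integrand changes sign."""
    zeros = jn_zeros(0, n_intervals) / k          # zeros of J0(k r) in r
    endpoints = np.concatenate(([0.0], zeros))
    total = 0.0
    for a, b in zip(endpoints[:-1], endpoints[1:]):
        part, _ = quad(lambda r: r * f(r) * j0(k * r), a, b)
        total += part
    return total

# Check against a known pair: H0{exp(-r^2/2)}(k) = exp(-k^2/2)
f = lambda r: np.exp(-r**2 / 2)
print(hankel0(f, 1.5))   # ≈ exp(-1.125) ≈ 0.3247
```

Splitting the integration at the Bessel zeros keeps each sub-integral smooth, which is why a generic quadrature routine converges well on each piece.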

Abstract:

Experimental results are presented for a new controller able to support bidirectional power flow in a full-bridge rectifier with a boost-like topology. The controller is computed using port-Hamiltonian passivity techniques for a suitable generalized state-space averaging truncation of the system, which transforms the control objectives, namely a constant dc-bus output voltage and unity input power factor, into a regulation problem. Simulation results for the full system show the essential correctness of the simplifications introduced to obtain the controller, although some small experimental discrepancies point to several aspects that need further improvement.

Abstract:

The inversion problem concerning the windowed Fourier transform is considered. It is shown that, out of the infinitely many solutions that the problem admits, the windowed Fourier transform is the "optimal" solution according to a maximum-entropy selection criterion.
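As a concrete illustration of the windowed Fourier transform and one of its many inverses, SciPy's `stft`/`istft` pair performs a round trip. Note this is the standard overlap-add inverse, not the maximum-entropy construction discussed in the abstract; the signal and parameters are chosen arbitrarily.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# Windowed Fourier transform (STFT) with a Hann window
f, tau, Z = stft(x, fs=fs, nperseg=128)

# One of the infinitely many left inverses: weighted overlap-add
_, x_rec = istft(Z, fs=fs, nperseg=128)

err = np.max(np.abs(x - x_rec[:len(x)]))   # near machine precision
```

With the default Hann window and 50% overlap the COLA condition holds, so this particular inverse reconstructs the signal essentially exactly.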

Abstract:

The deposits of two volcanic debris avalanches (VDA I and II) that occur in the upper Maronne valley, northwest sector of Cantal Volcano, France, were studied to establish their mechanisms of formation, transport and deposition. These two volcanic debris avalanches, which clearly differ in their structures, textures and extents, exemplify the wide spectrum of events associated with large-scale sector collapse. VDA I is voluminous (~1 km³ in the upper Maronne valley) and widespread. The deposits comprise two distinct facies: the block facies, which forms the intermediate and upper part of the unit, and the mixed facies, which crops out essentially at the base of the unit. The block facies consists of more or less brecciated lava, block-and-ash-flow breccia and pumice-flow tuff megablocks set in breccias resulting from block disaggregation. Mixing and differential movements are almost absent in this part of the VDA. The mixed facies consists of breccias rich in fine particles that originate from block disaggregation, as well as being picked up from the substratum during movement. Mixing and differential movements are predominant in this zone. Analysis of fractures on lava megablocks suggests that shear stress during the initial sliding is the principal cause of fracture. These data strongly indicate that VDA I is purely gravitational and argue for a model in which the initial sliding mass transforms into a flow due to differential in situ fragmentation caused by the shear stress. VDA II is restricted to low-topography areas. Its volume, in the studied area, is about 0.3 km³. The deposits consist of brecciated, rounded blocks and megablocks set in a fine-grained matrix composed essentially of volcanic glass. This unit is stratified, with a massive layer that contains all the megablocks at the base and in the intermediate part, and a normally graded layer in the upper part that contains only blocks <1 m in size.
The different lithologies present are totally mixed. These observations suggest that VDA II may be of the Bezymianny-type and that it underwent a flow transformation from a turbulent to a stratified flow consisting of a basal hyperconcentrated laminar body overlain by a dilute layer. (C) 2000 Elsevier Science B.V. All rights reserved.

Abstract:

Climate impact studies have indicated ecological fingerprints of recent global warming across a wide range of habitats. Whereas these studies have shown responses from various local case studies, a coherent large-scale account of temperature-driven changes in biotic communities has been lacking. Here we use 867 vegetation samples above the treeline from 60 summit sites in all major European mountain systems to show that ongoing climate change gradually transforms mountain plant communities. We provide evidence that the more cold-adapted species decline and the more warm-adapted species increase, a process described here as thermophilisation. At the scale of individual mountains this general trend may not be apparent, but at the larger, continental scale we observed a significantly higher abundance of thermophilic species in 2008 compared with 2001. Thermophilisation of mountain plant communities mirrors the degree of recent warming and is more pronounced in areas where the temperature increase has been higher. In view of the projected climate change, the observed transformation suggests a progressive decline of cold mountain habitats and their biota.

Abstract:

The accumulation of aqueous pollutants is becoming a global problem. The search for suitable methods and/or combinations of water treatment processes is a task that can slow down and stop the process of water pollution. In this work, the method of wet oxidation was considered as an appropriate technique for the elimination of the impurities present in paper mill process waters. It has been shown that, when combined with traditional wastewater treatment processes, wet oxidation offers many advantages. The combination of coagulation and wet oxidation offers a new opportunity for improving the quality of wastewater designated for discharge or recycling. First of all, the utilization of coagulated sludge via wet oxidation provides a conditioning process for the sludge, i.e. dewatering, which is rather difficult to carry out with untreated waste. Secondly, Fe2(SO4)3, which is employed earlier as a coagulant, transforms the conventional wet oxidation process into a catalytic one. The use of coagulation as a post-treatment for wet oxidation makes it possible to reduce the brown hue that usually accompanies partial oxidation. As a result, the supernatant is less colored and also contains a rather low amount of Fe ions, so it can be considered for recycling inside mills. The thickened part, which consists of metal ions, is then recycled back to the wet oxidation system. It was also observed that wet oxidation is favorable for the degradation of pitch substances (LWEs) and lignin that are present in the process waters of paper mills. Rather low operating temperatures are needed for wet oxidation to destroy LWEs. Oxidation in alkaline media not only provides faster elimination of pitch and lignin but also significantly improves the biodegradability of wastewater that contains lignin and pitch substances.
During the course of the kinetic studies, a model which can predict the enhancement of the biodegradability of wastewater was elaborated. The model includes lumped concentrations such as the chemical oxygen demand and biochemical oxygen demand and reflects a generalized reaction network of oxidative transformations. Later developments incorporated a new lump, the immediately available biochemical oxygen demand, which increased the fidelity of the predictions made by the model. Since changes in biodegradability occur simultaneously with the destruction of LWEs, an attempt was made to combine these two facts for modeling purposes.
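A minimal numerical sketch of such a lumped kinetic scheme follows. The two-lump network and the rate constants are invented for illustration and are not the thesis model; the point is only that a first-order network of oxidative transformations can lower total COD while raising the BOD/COD (biodegradability) ratio.

```python
from scipy.integrate import solve_ivp

# Hypothetical lumped scheme (illustration only):
#   refractory COD (A) -> biodegradable COD (B) -> mineralized products
# BOD is taken proportional to the biodegradable lump B.
k1, k2 = 0.05, 0.02            # assumed first-order rate constants, 1/min

def rhs(t, y):
    A, B = y
    return [-k1 * A, k1 * A - k2 * B]

sol = solve_ivp(rhs, (0, 60), [100.0, 10.0])   # 60 min of wet oxidation
A, B = sol.y[:, -1]

cod0, cod1 = 110.0, A + B                       # total COD falls...
ratio0, ratio1 = 10.0 / 110.0, B / (A + B)      # ...while BOD/COD rises
```

The qualitative behaviour (partial oxidation converting refractory matter into biodegradable intermediates faster than those intermediates mineralize) is what makes the effluent more amenable to biological post-treatment.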

Abstract:

Technological progress has made a huge amount of data available at increasing spatial and spectral resolutions. Therefore, the compression of hyperspectral data is an area of active research. In some fields, the original quality of a hyperspectral image cannot be compromised, and in these cases lossless compression is mandatory. The main goal of this thesis is to provide improved methods for the lossless compression of hyperspectral images. Both prediction- and transform-based methods are studied. Two kinds of prediction-based methods are studied. In the first method, the spectra of a hyperspectral image are first clustered and an optimized linear predictor is calculated for each cluster. In the second prediction method, the linear prediction coefficients are not fixed but are recalculated for each pixel. A parallel implementation of the above-mentioned linear prediction method is also presented. Two transform-based methods are presented as well. Vector Quantization (VQ) was used together with a new coding of the residual image. In addition, we have developed a new back end for a compression method utilizing Principal Component Analysis (PCA) and Integer Wavelet Transform (IWT). The performance of the compression methods is compared to that of other compression methods. The results show that the proposed linear prediction methods outperform the previous methods. In addition, a novel fast exact nearest-neighbor search method is developed. The search method is used to speed up the Linde-Buzo-Gray (LBG) clustering method.
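The cluster-then-predict idea can be sketched on toy data. Everything here is an assumption for illustration: synthetic two-class "spectra", a trivial stand-in for the clustering step (the actual work uses proper clustering such as LBG), and an order-1 predictor rather than the optimized higher-order predictors of the thesis. The sketch only shows why per-cluster optimized prediction leaves smaller residuals to encode than naive delta coding.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy "hyperspectral" cube: 500 pixel spectra, 32 bands, two spectral classes
base = np.vstack([np.linspace(0, 1, 32), np.linspace(1, 0, 32)])
spectra = base[rng.integers(0, 2, 500)] + 0.01 * rng.standard_normal((500, 32))

# Trivial stand-in for the clustering step: split spectra by overall slope
labels = (spectra[:, -1] > spectra[:, 0]).astype(int)

residual_energy = 0.0
for c in (0, 1):
    s = spectra[labels == c]
    x, y = s[:, :-1].ravel(), s[:, 1:].ravel()    # predict band b from b-1
    A = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # per-cluster least squares
    residual_energy += np.sum((y - A @ coef) ** 2)

delta_energy = np.sum(np.diff(spectra, axis=1) ** 2)  # naive delta coding
```

Smaller residual energy means a more sharply peaked residual distribution, which an entropy coder compresses losslessly into fewer bits.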

Abstract:

An analytical approach for the interpretation of multicomponent heterogeneous adsorption or complexation isotherms in terms of multidimensional affinity spectra is presented. A Fourier transform, applied to analyze the corresponding integral equation, leads to an inversion formula which allows the computation of the multicomponent affinity spectrum underlying a given competitive isotherm. Although a different mathematical methodology is used, this procedure can be seen as the extension to multicomponent systems of Sips's classical work devoted to monocomponent systems. Furthermore, a methodology is reported which yields analytical expressions for the main statistical properties (mean free energies of binding and covariance matrix) of multidimensional affinity spectra. Thus, the level of binding correlation between the different components can be quantified. Notably, the reported methodology does not require knowledge of the affinity spectrum to calculate the means, variances, and covariances of the binding energies of the different components. The non-ideal competitive consistent adsorption (NICCA) isotherm, widely used for metal/proton competitive complexation to environmental macromolecules, and the Frumkin competitive isotherm are selected to illustrate the application of the reported results. Explicit analytical expressions for the affinity spectrum as well as for the correlation matrix are obtained for the NICCA case. © 2004 American Institute of Physics.
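A monocomponent numerical sketch of the underlying relation may help fix ideas (the Gaussian affinity spectrum below is assumed purely for illustration, and this is the classical one-component picture, not the paper's multicomponent formalism): the overall isotherm is the local Langmuir kernel averaged over the affinity spectrum, and statistical properties such as the mean log-affinity follow as moments of the spectrum.

```python
import numpy as np

# Assumed Gaussian affinity spectrum p(log10 K) on a symmetric grid
logK = np.linspace(-6, 6, 2001)
dx = logK[1] - logK[0]
p = np.exp(-logK**2 / 2) / np.sqrt(2 * np.pi)

def theta(c):
    """Overall coverage: Langmuir kernel averaged over the spectrum."""
    K = 10.0 ** logK
    return np.sum(p * K * c / (1 + K * c)) * dx

# A statistical property obtained directly as a moment of the spectrum
mean_logK = np.sum(logK * p) * dx     # mean log-affinity (0 by symmetry here)
```

For this symmetric spectrum, theta(1.0) equals 1/2 exactly, since the Langmuir kernels at logK and -logK sum to one; the inversion problem treated in the paper runs the other way, recovering p from a measured theta.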

Abstract:

VariScan is a software package for the analysis of DNA sequence polymorphisms at the whole-genome scale. Among other features, the software: (1) conducts many population genetic analyses; (2) incorporates a multiresolution wavelet transform-based method that allows capturing relevant information from DNA polymorphism data; and (3) facilitates the visualization of the results in the most commonly used genome browsers.
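The idea of a wavelet transform separating scale-dependent features of a polymorphism track can be sketched with a hand-rolled one-level Haar transform. The per-window "nucleotide diversity" data below is synthetic and the single Haar level is a simplification; VariScan's actual multiresolution method is more elaborate.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform:
    approximation (low-pass) and detail (high-pass) coefficients."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    """Exact inverse of haar_dwt."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

# Toy per-window diversity track along a chromosome: smooth trend + noise
rng = np.random.default_rng(0)
pos = np.linspace(0, 2 * np.pi, 256)
diversity = 0.01 * (2 + np.sin(pos)) + 0.001 * rng.standard_normal(256)

a, d = haar_dwt(diversity)
recon = haar_idwt(a, d)                          # perfect reconstruction
smooth_energy = np.sum(a**2) / np.sum(diversity**2)
```

Because the transform is orthonormal, energy is preserved, and for a smooth track almost all of it concentrates in the approximation coefficients; the details isolate fine-scale variation, which is what makes the decomposition useful for picking out regional polymorphism patterns.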

Abstract:

The tales of Charles Perrault, of the brothers Jacob and Wilhelm Grimm, and of Hans Christian Andersen are among the best known of the genre, but grouping them together sometimes masks the fact that they were not written in the same period or in the same culture, and that they differ in many respects, notably thematically. Yet some texts in their collections are extremely close: "La belle au bois dormant", like "Dornröschen" ("Briar Rose"), tells the story of a princess condemned to sleep for a hundred years and then awakened by a prince; "Die sechs Schwäne" ("The Six Swans") and "De vilde Svaner" ("The Wild Swans") both relate the story of a princess searching for her brothers, who have been turned into swans by their stepmother. Such cases help to grasp the essential differences between the texts, but also to understand better why, despite these differences, they are recognized as paragons of the genre. Through a combined linguistic and comparative approach, this thesis sets out to show that the authors construct distinctive ways of telling. This concerns, for example, the narrator's place in the narrative, the logic by which events follow one another, and the use of represented discourse. Far from being mere stylistic preferences, these narrative strategies must be related to a "discursive project": in deciding to publish a collection of tales, how do the authors conceive of the genre, and how do they position themselves in the literary and cultural context of their time? 
Although Perrault, the Grimms and Andersen each develop a different project, all three exploit a particular mode of enunciation, akin to reported speech: the tale appears as an entanglement of voices in which a storyteller relates a story already told before, as if reporting the voice of another storyteller, thereby inscribing himself in a tradition in which one always speaks in someone else's wake. Analysing the differences between Perrault, the Grimms and Andersen thus makes it possible to construct a poetics of the genre: to write a tale is to summon a popular tradition and, from an enunciative point of view, to inscribe one's voice in the concert of voices generated by the tale.

Abstract:

This paper proposes a new method for blindly inverting a nonlinear mapping which transforms a sum of random variables. This is the case of post-nonlinear (PNL) source separation mixtures. The importance of the method lies in the fact that it makes it possible to decouple the estimation of the nonlinear part from the estimation of the linear one. Only the nonlinear part is inverted, without considering the linear part. Hence the initial problem is transformed into a linear one that can then be solved with any convenient linear algorithm. The method is compared with other existing algorithms for blindly approximating nonlinear mappings. Experiments show that the proposed algorithm outperforms the others and yields reasonably well linearized data.

Abstract:

The global business environment is changing. New technologies are transforming the operating environment, and the rules of the economy are changing rapidly. New business models are needed. The aim of the study was to analyse the current state of the information and communication technology (ICT) industry from a strategic and competitive-analysis perspective, and to form a picture of the ICT industry and its major players in Europe and the USA. The study analysed five large ICT companies. The research was both qualitative and quantitative in nature: the companies were analysed using numerical and qualitative material. The study was based on literature, articles, research reports, the companies' websites and their annual reports. As a result of the study, both similarities and differences could be identified between the companies' business models and their financial performance.

Abstract:

In Switzerland, the majority of students are oriented towards professional training after compulsory schooling. At this stage, one of the biggest challenges for them is to find an apprenticeship position. Matching supply and demand is a complex process that not only excludes some students from having direct access to professional training but also forces them to make early choices regarding their future sector of employment. So, how does one find an apprenticeship? And what do the students' descriptions of their search for apprenticeships reveal about the institutional determinants of social inequalities at play in the system? Based on 29 interviews conducted in 2014 with 23 apprentices and 6 recruiters in the Canton of Vaud, this article interrogates how the dimensions of educational and social trajectories combine to affect access to apprenticeships, and how they are accentuated by recruiters using a "hidden curriculum" during the recruitment process. A hidden curriculum consists of knowledge and skills not taught by the educational institution but which appear decisive in obtaining an apprenticeship. By analysing the contrasting experiences of students in their search for an apprenticeship, we identify four types of trajectories that explain different types of school-to-apprenticeship transitions. We show how these determinants are reinforced by the "hidden curriculum" of recruitment based on the soft skills of feeling, autonomy, anticipation and reflexivity that are assessed in the context of recruitment interactions. The discussion section considers how the criteria that appear to be used to identify the "right apprentice" tend to (re)produce inequalities between students. This depends not only on their academic results but also on their social and cultural skills, their ability to anticipate their choices and, more widely, their ability to be a subject in their recruitment search. 
"The Subject is neither the individual, nor the self, but the work through which an individual transforms into an actor, meaning an agent able to transform his/her situation instead of reproducing it." (Touraine, 1992, p.476).

Abstract:

In this paper we focus our attention on a particle that follows a unidirectional quantum walk, an alternative version of the currently widespread discrete-time quantum walk on a line. Here the walker at each time step can either remain in place or move in a fixed direction, e.g., rightward or upward. While both formulations are essentially equivalent, the present approach leads us to consider discrete Fourier transforms, which eventually yields explicit expressions for the wave functions in terms of finite sums and allows the use of efficient algorithms based on the fast Fourier transform. The wave functions obtained here govern the probability of finding the particle at any given location, but they also determine the exit-time probability of the walker from a fixed interval, which is analyzed as well.
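A minimal sketch of the FFT-based evolution follows. The Hadamard coin, ring size and step count are assumptions for illustration, not parameters from the paper: the point is that the stay/move shift is diagonal in Fourier space, so each step of the unidirectional walk costs two FFTs.

```python
import numpy as np

# Unidirectional discrete-time quantum walk on a ring of N sites:
# a 2-state coin (Hadamard here, an assumption) mixes the internal
# states; component 1 then moves one site right, component 0 stays.
N, steps = 256, 60
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

psi = np.zeros((2, N), complex)
psi[0, N // 2] = 1.0                      # walker starts at the centre
k = 2 * np.pi * np.fft.fftfreq(N)         # momenta of the ring

for _ in range(steps):
    psi = H @ psi                          # coin toss
    # rightward shift of component 1: multiplication by e^{-ik} in k-space
    psi[1] = np.fft.ifft(np.exp(-1j * k) * np.fft.fft(psi[1]))

prob = np.sum(np.abs(psi) ** 2, axis=0)    # position distribution
mean_pos = np.sum(np.arange(N) * prob)     # drifts right of the start
```

Both the coin and the shift are unitary, so the total probability stays 1; because the walker can only stay or move right, the distribution spreads ballistically in one direction only.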

Abstract:

"How old is this fingermark?" This question is relatively often raised in trials when suspects admit that they have left their fingermarks on a crime scene but allege that the contact occurred at a time different to that of the crime and for legitimate reasons. However, no answer can be given to this question so far, because no fingermark dating methodology has been validated and accepted by the whole forensic community. Nevertheless, a review of past American cases highlighted that experts have in fact given, and still give, testimony in court about the age of fingermarks, even if it is mostly based on subjective and badly documented parameters. It was relatively easy to access fully described American cases, which explains the origin of the given examples; however, fingermark dating issues are encountered worldwide, and the lack of consensus among the given answers highlights the necessity to conduct research on the subject. The present work thus aims at studying the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, different attempts have already been described in the literature. This research proposes a critical review of these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in actual practice. 
Nevertheless, some approaches based on the evolution of intrinsic compounds detected in fingermark residue over time appear to be promising. Thus, an exhaustive review of the literature was conducted in order to identify the compounds available in fingermark residue and the analytical techniques capable of analysing them. It was chosen to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted in order to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of the fingermark residue was significantly lower than the inter-variability, but that it was possible to reduce both kinds of variability using different statistical pre-treatments inspired by the drug profiling field. It was also possible to propose an objective donor classification model allowing the grouping of donors into two main classes based on their initial lipid composition. These classes correspond to what are relatively subjectively called "good" or "bad" donors. The potential of such a model is high for the fingermark research field, as it allows the selection of representative donors based on compounds of interest. Using GC/MS and FTIR, an in-depth study was conducted of the effects of different influence factors on the initial composition and aging of target lipid molecules found in fingermark residue. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing techniques), but that some influence factors affected these models more than others. 
In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other tested factors (deposition moment, pressure, temperature and illumination) also affected the residues and their aging, but models combining different values of these factors still proved to be robust. Furthermore, test fingermarks were analysed with GC/MS in order to be dated using some of the generated models. It turned out that correct estimations were obtained for 60% of the dated test fingermarks, and for up to 100% when the storage conditions were known. These results are interesting, but further research should be conducted to evaluate whether these models could be used in uncontrolled casework conditions. In a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) in order to gain information about fingermark composition and aging. More precisely, its ability to highlight influence factors and aging effects over large areas of fingermarks was investigated. This information was then compared with that given by individual FTIR spectra. It was concluded that while FTIR-CI is a powerful tool, its use to study natural fingermark residue for forensic purposes has to be carefully considered. In fact, in this study, this technique did not yield more information on residue distribution than traditional FTIR spectra and also suffered from major drawbacks, such as long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed the proposition and discussion of a formal and pragmatic framework to approach fingermark dating questions. It allows identifying which type of information the scientist would currently be able to bring to investigators and/or the courts. 
Furthermore, the proposed framework also describes the different iterative development steps that research should follow in order to achieve the validation of an objective fingermark dating methodology whose capacities and limits are well known and properly documented.
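As a purely hypothetical illustration of the kind of univariate aging model mentioned above: fit a log-linear calibration on marks of known age, then invert it to date an unseen mark. The exponential decay, the rate constant and the noise level below are invented for the sketch and are not the thesis's calibration.

```python
import numpy as np

rng = np.random.default_rng(1)
ages = np.array([0.0, 5, 10, 15, 20, 25, 30])     # days (calibration marks)
# Hypothetical aging parameter, assumed to decay exponentially with age
signal = np.exp(-0.1 * ages) * (1 + 0.02 * rng.standard_normal(ages.size))

# Log-linear calibration: log y = intercept - k * t
slope, intercept = np.polyfit(ages, np.log(signal), 1)
k_hat = -slope

# Invert the calibration curve to date an unseen mark (true age: 12 days)
y_test = np.exp(-0.1 * 12)
t_est = (intercept - np.log(y_test)) / k_hat
```

In practice, as the abstract stresses, such a model is only usable when the influence factors (donor, substrate, storage conditions) are controlled or known, which is precisely what limits casework application.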