Abstract:
The implementation of new surgical techniques offers opportunities but also carries risks. Usually, several years pass before a critical appraisal and a balanced opinion of a new treatment method become available, based on evidence from the literature and expert opinion. The frozen elephant trunk (FET) technique has been increasingly used to treat complex pathologies of the aortic arch and the descending aorta, but there is still ongoing discussion within the surgical community about its optimal indications. This paper represents a joint effort of the Vascular Domain of EACTS together with several surgeons with particular expertise in aortic surgery, and summarizes the current knowledge and state of the art of the FET technique. The majority of the information about the FET technique has been extracted from 97 focused publications already available in the PubMed database (cohort studies, case reports, reviews, small series, meta-analyses and best evidence topics) published in English.
Abstract:
Intrabony periodontal defects are a frequent complication of periodontitis and, if left untreated, may negatively affect long-term tooth prognosis. The optimal outcome of treatment in intrabony defects is considered to be the absence of bleeding on probing, the presence of shallow pockets associated with periodontal regeneration (i.e. formation of new root cementum with functionally orientated inserting periodontal ligament fibers connected to new alveolar bone) and no soft-tissue recession. A plethora of different surgical techniques, often including implantation of various types of bone grafts and/or bone substitutes, root surface demineralization, guided tissue regeneration, growth and differentiation factors, enamel matrix proteins or various combinations thereof, has been employed to achieve periodontal regeneration. Despite positive observations in animal models and successful outcomes reported for many of the available regenerative techniques and materials in patients, including histologic reports, robust information on the degree to which reported clinical improvements reflect true periodontal regeneration does not exist. Thus, the aim of this review was to summarize, in a systematic manner, the available histologic evidence on the effect of reconstructive periodontal surgery using various types of biomaterials to enhance periodontal wound healing/regeneration in human intrabony defects. In addition, the inherent problems associated with performing human histologic studies and interpreting the results, as well as certain ethical considerations, are discussed. The results of the present systematic review indicate that periodontal regeneration in human intrabony defects can be achieved to a variable extent using a range of methods and materials. Periodontal regeneration has been observed following the use of a variety of bone grafts and substitutes, guided tissue regeneration, biological factors and combinations thereof. Combination approaches appear to provide the best outcomes, whilst implantation of alloplastic material alone demonstrated limited to no periodontal regeneration.
Abstract:
Index tracking has become one of the most common strategies in asset management. The index-tracking problem consists of constructing a portfolio that replicates the future performance of an index by including only a subset of the index constituents in the portfolio. Finding the most representative subset is challenging when the number of stocks in the index is large. We introduce a new three-stage approach that first identifies promising subsets by employing data-mining techniques, then determines the stock weights within each subset using mixed-binary linear programming, and finally evaluates the subsets by cross-validation. The best subset is returned as the tracking portfolio. Our approach outperforms state-of-the-art methods in terms of out-of-sample performance and running times.
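The weight-determination stage described above can be illustrated with a small mixed-binary linear program. The sketch below is a minimal, hedged example only: it minimises the mean absolute deviation between portfolio and index returns (a linear proxy for tracking error) subject to a cardinality constraint; the synthetic data, the cardinality limit K and the use of the PuLP/CBC solver are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of weight determination for index tracking via mixed-binary LP.
# Data, cardinality limit and solver are illustrative assumptions.
import numpy as np
import pulp

rng = np.random.default_rng(0)
T, N, K = 60, 8, 3                       # periods, candidate stocks, max holdings
stock_ret = rng.normal(0.0, 0.02, (T, N))
index_ret = stock_ret.mean(axis=1)       # toy "index" = equal-weighted average

prob = pulp.LpProblem("index_tracking", pulp.LpMinimize)
w = [pulp.LpVariable(f"w_{i}", lowBound=0, upBound=1) for i in range(N)]
y = [pulp.LpVariable(f"y_{i}", cat="Binary") for i in range(N)]
d = [pulp.LpVariable(f"d_{t}", lowBound=0) for t in range(T)]   # |deviation|

prob += (1.0 / T) * pulp.lpSum(d)                  # mean absolute tracking error
prob += pulp.lpSum(w) == 1                         # fully invested
prob += pulp.lpSum(y) <= K                         # cardinality constraint
for i in range(N):
    prob += w[i] <= y[i]                           # weight only if stock selected
for t in range(T):
    dev = pulp.lpSum(w[i] * stock_ret[t, i] for i in range(N)) - index_ret[t]
    prob += d[t] >= dev
    prob += d[t] >= -dev

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([round(v.value(), 3) for v in w])
```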
Abstract:
The history of cerebral aneurysm surgery owes a great debt to the tenacity of the pioneering neurosurgeons who designed and developed the clips used to close the aneurysm neck. However, until the beginning of the past century, surgery of complex and challenging aneurysms was impossible due to the lack of a surgical microscope and commercially available sophisticated clips. The modern era of spring clips began in the second half of the last century. Until then, only malleable metal clips and other non-metallic materials were available for intracranial aneurysms. Indeed, the earliest clips were hazardous and difficult to handle. Several neurosurgeons put their efforts into developing new clip models, based on their personal experience in the treatment of cerebral aneurysms. Finally, the introduction of the surgical microscope, together with the availability of more sophisticated clips, has allowed the treatment of complex and challenging aneurysms. However, today none of the new instruments or tools for the surgical therapy of aneurysms could be used safely and effectively without keeping in mind the lessons on innovative surgical techniques provided by the great neurovascular surgeons. Thanks to their legacy, we can now treat many types of aneurysms that had always been considered inoperable. In this article, we review the basic principles of surgical clipping and illustrate some more advanced techniques to be used for complex aneurysms.
Abstract:
BACKGROUND Neuronavigation is an essential tool in cranial neurosurgery. Despite continuing improvements in the technologies used for neuronavigation, certain events can lead to unacceptable mismatches. To provide the best possible outcome for the patients, surgeons need to do everything possible to reduce mismatches. METHODS AND RESULTS Some simple techniques can greatly improve neuronavigation accuracy and patient safety. We describe two simple methods that were developed or refined in the Department of Neurosurgery at Inselspital, Bern, Switzerland: the transdermal navigation landmark and use of bone screws for co-registration. CONCLUSIONS Both techniques are easy to use, do not require expensive additional instruments, and are helpful in procedures involving neuronavigation.
Abstract:
BACKGROUND Guidelines on the clinical management of non-metastatic castrate-resistant prostate cancer (nmCRPC) generally focus on the need to continue androgen deprivation therapy and enrol patients into clinical trials of investigational agents. This guidance reflects the lack of clinical trial data with established agents in the nmCRPC patient population and the need for trials of new agents. AIM To review the evidence base and consider ways of improving the management of nmCRPC. CONCLUSION Upon the development of castrate resistance, it is essential to rule out the presence of metastases or micrometastases by optimising the use of bone scans and possibly newer procedures and techniques. When nmCRPC is established, management decisions should be individualised according to risk, but risk stratification in this diverse population is poorly defined. Currently, prostate-specific antigen (PSA) levels and PSA doubling time remain the best methods of assessing the risk of progression and response to treatment in nmCRPC. However, optimising imaging protocols can also help assess the changing metastatic burden in patients with CRPC. Clinical trials of novel agents in nmCRPC are limited and have problems with enrolment, and therefore improved risk stratification and imaging may be crucial to improved management. The statements presented in this paper, reflecting the views of the authors, provide a discussion of the most recent evidence in nmCRPC and provide some advice on how to ensure these patients receive the best management available. However, there is an urgent need for more data on the management of nmCRPC.
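As a point of reference for the PSA doubling time (PSADT) mentioned above, the standard calculation fits a linear regression to ln(PSA) against time and divides ln(2) by the slope. The sketch below illustrates this standard formula with purely invented measurements; it is not drawn from the paper itself.

```python
# Minimal sketch of the standard PSA doubling time (PSADT) calculation.
# The sample PSA values and time points are illustrative only.
import numpy as np

months = np.array([0, 3, 6, 9, 12])          # time of each PSA measurement
psa = np.array([2.1, 2.9, 4.2, 5.8, 8.3])    # PSA in ng/mL

slope, _intercept = np.polyfit(months, np.log(psa), 1)
psadt = np.log(2) / slope                    # doubling time in months
print(f"PSA doubling time: {psadt:.1f} months")
```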
Abstract:
The current literature available on bladder cancer symptom management from the perspective of the patients themselves is limited. There is also limited psychosocial research specific to bladder cancer patients, and no previous studies have developed and validated measures of bladder cancer patients' symptom management self-efficacy. The purpose of this study was to investigate non-muscle invasive bladder cancer patients' health-related quality of life through two main study objectives: (1) to describe the treatment-related symptoms, the reported effectiveness of symptom-management techniques, and the advice a sample of non-muscle invasive bladder cancer patients would convey to physicians and future patients; and (2) to evaluate Lepore's symptom management self-efficacy measure on a sample of non-muscle invasive bladder cancer patients. Methods. A total of twelve (n=12) non-muscle invasive bladder cancer patients participated in an in-depth interview and a sample of 46 (n=46) non-muscle invasive bladder cancer patients participated in the symptom-management self-efficacy survey. Results. A total of five symptom categories emerged from the participants' 59 reported symptoms. Four symptom-management categories emerged from the 71 reported techniques. A total of 62% of the participants' treatment-related symptom-management techniques were reported as effective in managing their treatment-related symptoms. Five advice categories emerged from the in-depth interviews: service delivery; medical advice; physician-patient communication; encouragement; and no advice. An exploratory factor analysis indicated a single-factor structure for the total population and a multiple-factor structure for three subgroups: all males, married males, and all married participants. Conclusion. These findings can inform physicians and patients of effective symptom-management techniques, thus improving patients' health-related quality of life. The advice these patients impart can improve service delivery and patient education.
Abstract:
The research project is an extension of a series of administrative science and health care research projects evaluating the influence of external context, organizational strategy, and organizational structure upon organizational success or performance. The research relies on the assumption that there is no single best approach to the management of organizations (contingency theory). As organizational effectiveness is dependent on an appropriate mix of factors, organizations may be equally effective based on differing combinations of factors. The external context of the organization is expected to influence internal organizational strategy and structure, and in turn the internal measures affect performance (discriminant theory). The research considers the relationship between external context and organizational performance. The unit of study for the research is the health maintenance organization (HMO): an organization that accepts, in exchange for a fixed, advance capitation payment, contractual responsibility to assure the delivery of a stated range of health services to a voluntarily enrolled population. With the current Federal resurgence of interest in the HMO as a major component of the health care system, attention must be directed at maximizing the development of HMOs from the limited resources available. Increased skills are needed in both Federal and private evaluation of HMO feasibility in order to prevent resource investment in projects that will fail while concurrently identifying potentially successful projects that would not be considered using current standards. The research considers 192 factors measuring the contextual milieu (social, educational, economic, legal, demographic, health and technological factors). Through intercorrelation and principal-components data-reduction techniques these were reduced to 12 variables. Two measures of HMO performance were identified: (1) HMO status (operational or defunct), and (2) a principal-components factor score combining eight measures of performance. The relationship between HMO context and performance was analysed using correlation and stepwise multiple regression methods. In each case it was concluded that the external contextual variables are not predictive of the success or failure of the study Health Maintenance Organizations. This suggests that the performance of an HMO may rely on internal organizational factors. These findings have policy implications, as contextual measures are used as a major determinant in HMO feasibility analysis and as a factor in the allocation of limited Federal funds.
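The analysis pipeline described above (reduce many contextual variables with principal components, then relate the retained components to performance by stepwise regression) can be sketched as follows. This is a hedged illustration on synthetic data: the sample size is invented, and scikit-learn's forward sequential feature selection is used as a stand-in for classical stepwise regression.

```python
# Sketch of the reported analysis: PCA reduction of 192 contextual measures to
# 12 components, followed by forward-selection regression against performance.
# Synthetic data; forward selection stands in for stepwise multiple regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(75, 192))          # 75 HMOs x 192 contextual measures (toy)
y = rng.normal(size=75)                 # composite performance factor score (toy)

components = PCA(n_components=12).fit_transform(X)   # 192 variables -> 12

selector = SequentialFeatureSelector(LinearRegression(),
                                     n_features_to_select=4,
                                     direction="forward").fit(components, y)
kept = components[:, selector.get_support()]
model = LinearRegression().fit(kept, y)
print("R^2 of retained components:", round(model.score(kept, y), 3))
```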
Abstract:
High-resolution, small-bore PET systems suffer from a tradeoff between system sensitivity and image quality. In these systems, long crystals are necessary for high system sensitivity, but they allow mispositioning of the line of response due to parallax error, and this mispositioning causes resolution blurring. One means of allowing long crystals without introducing parallax errors is to determine the depth of interaction (DOI) of the gamma-ray interaction within the detector module. While DOI has been investigated previously, newly available solid-state photomultipliers (SSPMs) are well suited to PET applications and allow new modules to be investigated. Depth of interaction in full modules is a relatively new field, so even where high-performance DOI-capable modules are available, appropriate means to characterize and calibrate them are not. This work presents an investigation of DOI-capable arrays and techniques for characterizing and calibrating those modules. The methods introduced here accurately and reliably characterize and calibrate energy, timing, and event interaction positioning. Additionally presented are a characterization of the spatial resolution of DOI-capable modules and a measurement of DOI effects for different angles between detector modules. These arrays have been built into a prototype PET system that delivers better than 2.0 mm resolution with a single-sided stopping power in excess of 95% for 511 keV gammas. The noise properties of SSPMs scale with the active area of the detector face, so the best signal-to-noise ratio is possible with parallel readout of each SSPM photodetector pixel rather than multiplexing signals together. This work additionally investigates several algorithms for improving timing performance using timing information from multiple SSPM pixels when light is distributed among several photodetectors.
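One plausible example of combining timing information from several SSPM pixels, of the kind of algorithm investigated above, is an amplitude-weighted average of the per-pixel timestamps, so that pixels receiving more light (and hence having lower timing jitter) dominate. The sketch below is a hedged illustration with invented event data; it is not claimed to be the specific algorithm evaluated in the dissertation.

```python
# Sketch: amplitude-weighted combination of per-pixel timestamps for one event.
# Event data are illustrative; this is one candidate multi-pixel timing scheme.
import numpy as np

def combined_timestamp(pixel_times_ns, pixel_amplitudes):
    """Amplitude-weighted average of per-pixel timestamps (ns)."""
    t = np.asarray(pixel_times_ns, dtype=float)
    a = np.asarray(pixel_amplitudes, dtype=float)
    return np.sum(a * t) / np.sum(a)

# One event whose scintillation light is shared over four pixels of the array
times = [2.31, 2.45, 2.60, 2.95]        # leading-edge times per pixel, ns
amps  = [850.0, 420.0, 210.0, 60.0]     # integrated charge per pixel, a.u.
print(f"combined event time: {combined_timestamp(times, amps):.2f} ns")
```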
Abstract:
Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, or the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail. In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. Such computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. The low-resolution reconstructions lack the level of detail needed to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. Therefore one needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features which in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III. Three manuscripts are presented as part of the PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly.
Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions. The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. Such a reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable a multi-body registration. This technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling them to be properly docked in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, being visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70% and 100% as estimated in experimental test cases, i.e. 70-100% of the alpha helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
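The two ingredients that the registration work combines, a density cross-correlation score and a tabu-guided search over placements, can be illustrated with a deliberately simplified two-dimensional toy. The sketch below is only a hedged illustration of the mechanics on synthetic data (discrete translations, a short tabu list, normalised correlation as the score); it is not the Sculptor implementation, which operates on 3D rigid-body placements and evolutionary operators.

```python
# Toy sketch: score a component placement against a "map" by normalised
# cross-correlation and explore placements with a simple tabu search.
# Maps, moves and parameters are illustrative stand-ins for the real 3D search.
import numpy as np

rng = np.random.default_rng(2)
assembly = rng.random((32, 32))                 # toy low-resolution "map"
component = assembly[10:18, 14:22].copy()       # toy component density (8x8)

def score(pos):
    r, c = pos
    patch = assembly[r:r + 8, c:c + 8]
    return float(np.corrcoef(patch.ravel(), component.ravel())[0, 1])

def tabu_search(start, n_iter=200, tabu_len=20):
    current, best = start, start
    tabu = [start]
    for _ in range(n_iter):
        # neighbourhood: one-pixel shifts of the placement, excluding tabu moves
        neighbours = [(current[0] + dr, current[1] + dc)
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                      if 0 <= current[0] + dr <= 24 and 0 <= current[1] + dc <= 24
                      and (current[0] + dr, current[1] + dc) not in tabu]
        if not neighbours:
            break
        current = max(neighbours, key=score)       # best non-tabu move
        tabu.append(current)
        tabu = tabu[-tabu_len:]                    # bounded tabu list
        if score(current) > score(best):
            best = current
    return best

print("best placement found:", tabu_search((0, 0)))  # component was cut at (10, 14)
```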
Abstract:
Endolithic bioerosion is difficult to analyse and describe, and it usually requires damaging the sample material. Sponge erosion (Entobia) may be one of the most difficult to evaluate, as it is simultaneously macroscopically inhomogeneous and microstructurally intricate. We studied the bioerosion traces of the two Australian sponges Cliona celata Grant, 1826 (sensu Schönberg 2000) and Cliona orientalis Thiele, 1900 with a newly available radiographic technology: high-resolution X-ray micro-computed tomography (MCT). MCT allows non-destructive visualisation of live and dead structures in three dimensions and was compared to traditional microscopic methods. MCT and microscopy showed that C. celata bioerosion was more intense in the centre and branched out in the periphery. In contrast, C. orientalis produced a dense, even trace meshwork and caused an overall more intense erosion pattern than C. celata. Extended pioneering filaments were not usually found at the margins of the studied sponge erosion; rather, branches ended abruptly or tapered to points. Results obtained with MCT were similar in quality to observations of transparent optical spar under the dissecting microscope. Microstructures could not be resolved as well as with, e.g., scanning electron microscopy (SEM). Even though sponge scars and sponge chips were easily recognisable on maximum-magnification MCT images, they lacked the detail that is available from SEM. Other drawbacks of MCT involve high costs and presently limited access. Even though MCT cannot presently replace traditional techniques such as corrosion casts viewed by SEM, we obtained valuable information. Especially given the possibility of measuring endolithic pore volumes, we regard MCT as a very promising tool that will continue to be optimised. A combination of different methods will produce the best results in the study of Entobia.
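The pore-volume measurement that makes micro-CT attractive here amounts, at its simplest, to thresholding the reconstructed voxel stack and counting pore voxels. The sketch below is a hedged, generic illustration on a synthetic volume; the threshold, voxel size and carved "erosion chamber" are invented and do not reflect the authors' processing chain.

```python
# Sketch: estimate endolithic pore volume from a CT voxel stack by thresholding.
# Synthetic volume, threshold and voxel size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
voxel_size_mm = 0.01                                   # 10 micrometre voxels
volume = rng.normal(loc=1.0, scale=0.1, size=(200, 200, 200))  # "density" stack
volume[80:120, 80:120, 80:120] = 0.05                  # carve a toy erosion chamber

pore_mask = volume < 0.5                               # low density = pore space
pore_volume_mm3 = pore_mask.sum() * voxel_size_mm ** 3
total_volume_mm3 = volume.size * voxel_size_mm ** 3
print(f"pore volume: {pore_volume_mm3:.3f} mm^3 "
      f"({100 * pore_volume_mm3 / total_volume_mm3:.1f}% of substrate)")
```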
Abstract:
In 2005, the International Ocean Colour Coordinating Group (IOCCG) convened a working group to examine the state of the art in ocean colour data merging, which showed that the research techniques had matured sufficiently for creating long multi-sensor datasets (IOCCG, 2007). As a result, ESA initiated and funded the DUE GlobColour project (http://www.globcolour.info/) to develop a satellite-based ocean colour data set to support global carbon-cycle research. It aims to satisfy the scientific requirement for a long (10+ year) time series of consistently calibrated global ocean colour information with the best possible spatial coverage. This has been achieved by merging data from the three most capable sensors: SeaWiFS on GeoEye's Orbview-2 mission, MODIS on NASA's Aqua mission and MERIS on ESA's ENVISAT mission. In setting up the GlobColour project, three user organisations were invited to help. Their roles are to specify the detailed user requirements, act as a channel to the broader end-user community, and provide feedback and assessment of the results. The International Ocean Carbon Coordination Project (IOCCP), based at UNESCO in Paris, provides direct access to the carbon cycle modelling community's requirements and to the modellers themselves who will use the final products. The UK Met Office's National Centre for Ocean Forecasting (NCOF) in Exeter, UK, provides an understanding of the requirements of oceanography users, and the IOCCG brings its understanding of global user needs and valuable advice on best practice within the ocean colour science community. The three-year project kicked off in November 2005 under the leadership of ACRI-ST (France). The first year was a feasibility demonstration phase that was successfully concluded at a user consultation workshop organised by the Laboratoire d'Océanographie de Villefranche, France, in December 2006. Error statistics and inter-sensor biases were quantified by comparison with in situ measurements from moored optical buoys and ship-based campaigns, and used as an input to the merging. The second year was dedicated to the production of the time series. In total, more than 25 Tb of input (level 2) data have been ingested and 14 Tb of intermediate and output products created, with 4 Tb of data distributed to the user community. Quality control (QC) is provided through the Diagnostic Data Sets (DDS), which are extracted sub-areas covering locations of in situ data collection or interesting oceanographic phenomena. This Full Product Set (FPS) covers global daily merged ocean colour products in the time period 1997-2006 and is also freely available for use by the worldwide science community at http://www.globcolour.info/data_access_full_prod_set.html. The GlobColour service distributes global daily, 8-day and monthly data sets at 4.6 km resolution for chlorophyll-a concentration, normalised water-leaving radiances (412, 443, 490, 510, 531, 555 and 620 nm, 670, 681 and 709 nm), diffuse attenuation coefficient, coloured dissolved and detrital organic materials, total suspended matter or particulate backscattering coefficient, turbidity index, cloud fraction and quality indicators. Error statistics from the initial sensor characterisation are used as an input to the merging methods and propagate through the merging process to provide error estimates for the output merged products.
These error estimates are a key component of GlobColour, as they are invaluable to the users, particularly the modellers who need them in order to assimilate the ocean colour data into ocean simulations. An intensive phase of validation has been undertaken to assess the quality of the data set. In addition, inter-comparisons between the different merged datasets will help in further refining the techniques used. Both the final products and the quality assessment were presented at a second user consultation in Oslo on 20-22 November 2007, organised by the Norwegian Institute for Water Research (NIVA); presentations are available on the GlobColour WWW site. At the request of the ESA Technical Officer for the GlobColour project, the FPS data set was mirrored in the PANGAEA data library.
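As a generic illustration of how per-sensor error estimates can both drive a merge and propagate into an error bar for the merged product, the sketch below applies inverse-variance weighting to coincident chlorophyll-a retrievals. It shows the principle only, with invented values; it is not the GlobColour merging algorithm itself.

```python
# Sketch: inverse-variance weighted merging of multi-sensor chlorophyll-a
# retrievals, with error propagation to the merged value. Values are invented.
import numpy as np

# chlorophyll-a (mg m^-3) from three sensors for one grid cell, with 1-sigma errors
chl   = np.array([0.21, 0.25, 0.18])      # e.g. SeaWiFS, MODIS, MERIS retrievals
sigma = np.array([0.04, 0.03, 0.06])

weights = 1.0 / sigma**2
merged = np.sum(weights * chl) / np.sum(weights)
merged_sigma = np.sqrt(1.0 / np.sum(weights))     # propagated error estimate

print(f"merged chl-a: {merged:.3f} +/- {merged_sigma:.3f} mg m^-3")
```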
Abstract:
Nowadays, computing platforms consist of a very large number of components that need to be supplied with different voltage levels and power requirements. Even a very small platform, like a handheld computer, may contain more than twenty different loads and voltage regulators. The power delivery designers of these systems are required to provide, in a very short time, the right power architecture that optimizes the performance and meets electrical specifications as well as cost and size targets. The appropriate selection of the architecture and converters directly defines the performance of a given solution. Therefore, the designer needs to be able to evaluate a significant number of options in order to know with good certainty whether the selected solutions meet the size, energy efficiency and cost targets. The difficulty of selecting the right solution arises from the wide range of power conversion products provided by different manufacturers. These products range from discrete components (to build converters) to complete power conversion modules that employ different manufacturing technologies. Consequently, in most cases it is not possible to analyze all the alternatives (combinations of power architectures and converters) that can be built. The designer has to select a limited number of converters in order to simplify the analysis. In this thesis, in order to overcome these difficulties, a new design methodology for power supply systems is proposed. This methodology integrates evolutionary computation techniques to make it possible to analyze a large number of possibilities. This exhaustive analysis helps the designer to quickly define a set of feasible solutions and select the best trade-off in performance according to each application. The proposed approach consists of two key steps, one for the automatic generation of architectures and the other for the optimized selection of components. This thesis details the implementation of these two steps. The usefulness of the methodology is corroborated by contrasting the results using real problems and experiments designed to test the limits of the algorithms.
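The component-selection step based on evolutionary computation can be illustrated, in a deliberately reduced form, with a basic genetic algorithm that picks one converter from a catalogue for each load so as to minimise a weighted combination of power loss, cost and size. The catalogue, loads and fitness weights below are invented for illustration; the thesis methodology is considerably richer (architecture generation, real component data, multiple trade-off criteria).

```python
# Toy sketch of evolutionary component selection: a genetic algorithm assigns
# one converter per load. Catalogue, loads and weights are illustrative only.
import random

random.seed(4)
N_LOADS, CATALOGUE_SIZE = 6, 10
# catalogue[j] = (efficiency, cost_eur, area_cm2) of converter j
catalogue = [(random.uniform(0.80, 0.96), random.uniform(1, 6),
              random.uniform(0.5, 4.0)) for _ in range(CATALOGUE_SIZE)]
load_power = [random.uniform(1, 15) for _ in range(N_LOADS)]   # watts

def fitness(solution):                        # lower is better
    loss = sum(load_power[i] * (1 / catalogue[j][0] - 1)
               for i, j in enumerate(solution))
    cost = sum(catalogue[j][1] for j in solution)
    area = sum(catalogue[j][2] for j in solution)
    return loss + 0.5 * cost + 0.3 * area     # illustrative weighting

def evolve(pop_size=40, generations=100, mutation_rate=0.1):
    pop = [[random.randrange(CATALOGUE_SIZE) for _ in range(N_LOADS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]                   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_LOADS)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:         # mutate one gene
                child[random.randrange(N_LOADS)] = random.randrange(CATALOGUE_SIZE)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("converter per load:", best, " fitness:", round(fitness(best), 2))
```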
Abstract:
The need to refine models for best-estimate calculations, based on good-quality experimental data, has been expressed in many recent meetings in the field of nuclear applications. The modeling needs arising in this respect should not be limited to the currently available macroscopic methods but should be extended to next-generation analysis techniques that focus on more microscopic processes. One of the most valuable databases identified for thermal-hydraulics modeling was developed by the Nuclear Power Engineering Corporation (NUPEC), Japan. From 1987 to 1995, NUPEC performed steady-state and transient critical power and departure from nucleate boiling (DNB) test series based on equivalent full-size mock-ups. Considering the reliability not only of the measured data but also of other relevant parameters such as the system pressure, inlet sub-cooling and rod surface temperature, these test series supplied the first substantial database for the development of truly mechanistic and consistent models for boiling transition and critical heat flux. Over the last few years the Pennsylvania State University (PSU), under the sponsorship of the U.S. Nuclear Regulatory Commission (NRC), has prepared, organized, conducted and summarized the OECD/NRC Full-size Fine-mesh Bundle Tests (BFBT) Benchmark. The international benchmark activities have been conducted in cooperation with the Nuclear Energy Agency/Organisation for Economic Co-operation and Development (NEA/OECD) and the Japan Nuclear Energy Safety (JNES) organization, Japan. Consequently, JNES has made the Boiling Water Reactor (BWR) NUPEC database available for the purposes of the benchmark. Based on the success of the OECD/NRC BFBT benchmark, JNES has decided to also release the data based on the NUPEC Pressurized Water Reactor (PWR) subchannel and bundle tests for a follow-up international benchmark entitled OECD/NRC PWR Subchannel and Bundle Tests (PSBT) benchmark. This paper presents an application of the joint Penn State University/Technical University of Madrid (UPM) version of the well-known subchannel code COBRA-TF, namely CTF, to the critical power and DNB exercises of the OECD/NRC BFBT and PSBT benchmarks.
Abstract:
As a result of advances in mobile technology, new services which benefit from the ubiquity of these devices are appearing. Some of these services require the identification of the subject, since they may access private user information. In this paper, we propose to identify each user by drawing his/her handwritten signature in the air (in-air signature). In order to assess the feasibility of an in-air signature as a biometric feature, we have analysed the performance of several well-known pattern recognition techniques (Hidden Markov Models, Bayes classifiers and dynamic time warping) to cope with this problem. Each technique has been tested in the identification of the signatures of 96 individuals. Furthermore, the robustness of each method against spoofing attacks has also been analysed using six impostors who attempted to emulate every signature. The best results in both experiments have been reached by using a technique based on dynamic time warping which carries out the recognition by calculating distances to an average template extracted from several training instances. Finally, a permanence analysis has been carried out in order to assess the stability of the in-air signature over time.
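The best-performing approach described above, recognition by dynamic time warping (DTW) distance to a per-user average template, can be sketched as follows. This is a minimal, hedged illustration: the signatures are reduced to 1-D toy sequences for brevity, whereas real in-air signatures would be multi-dimensional accelerometer traces, and the templates and averaging scheme are illustrative assumptions rather than the paper's exact setup.

```python
# Sketch: identify a test signature by its DTW distance to each user's average
# template. 1-D toy sequences stand in for multi-dimensional in-air signatures.
import numpy as np

def dtw_distance(a, b):
    """Classic DTW with a full dynamic-programming table."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# one "average template" per enrolled user (e.g. mean of several training repetitions)
templates = {
    "alice": np.sin(np.linspace(0, 3 * np.pi, 60)),
    "bob":   np.cos(np.linspace(0, 2 * np.pi, 50)),
}

test = np.sin(np.linspace(0, 3 * np.pi, 55)) \
       + np.random.default_rng(5).normal(0, 0.05, 55)     # noisy repeat of "alice"
scores = {user: dtw_distance(test, tpl) for user, tpl in templates.items()}
print("identified as:", min(scores, key=scores.get), scores)
```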