886 results for Search-based technique
Abstract:
OBJECTIVES: To test whether the Global Positioning System (GPS) could be potentially useful to assess the velocity of walking and running in humans. SUBJECT: A young man was equipped with a GPS receiver while walking, running and cycling at various velocities on an athletic track. The speed of displacement assessed by GPS was compared to that directly measured by chronometry (76 tests). RESULTS: In walking and running conditions (from 2-20 km/h) as well as cycling conditions (from 20-40 km/h), there was a significant relationship between the speed assessed by GPS and that actually measured (r = 0.99, P < 0.0001), with little bias in the prediction of velocity. The overall error of prediction (s.d. of difference) averaged +/-0.8 km/h. CONCLUSION: The GPS technique appears very promising for speed assessment, although its relative accuracy at walking speed is still insufficient for research purposes. It may be improved by using differential GPS measurement.
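The agreement statistics reported above (mean bias and s.d. of the GPS-chronometry differences, in the spirit of a Bland-Altman analysis) can be sketched as follows; the paired speeds are illustrative values, not the study's 76 tests.

```python
from statistics import mean, stdev

def agreement(gps, chrono):
    """Mean bias and s.d. of the paired differences (Bland-Altman style)."""
    diffs = [g - c for g, c in zip(gps, chrono)]
    return mean(diffs), stdev(diffs)

# illustrative paired speeds in km/h (not the study's data)
gps_speed    = [4.1, 8.0, 12.3, 16.1, 19.8, 25.2, 30.1, 35.0]
chrono_speed = [4.0, 8.2, 12.0, 16.0, 20.1, 25.0, 30.0, 35.3]

bias, sd = agreement(gps_speed, chrono_speed)
print(f"bias = {bias:+.2f} km/h, s.d. of differences = {sd:.2f} km/h")
```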
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable tracking task. The learning system is characterized by the use of a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robotics. In order to speed up the process, the learning phase was carried out in a simulated environment and, in a second step, the policy was transferred and tested successfully on a real robot. Future work plans to continue the learning process on-line on the real robot while it performs the mentioned task. We demonstrate the approach's feasibility with real experiments on the underwater robot ICTINEU AUV.
Abstract:
This paper proposes a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot. Although the dominant approach when using RL has been to apply value-function-based algorithms, the system detailed here is characterized by the use of direct policy search methods. Rather than approximating a value function, these methodologies approximate a policy using an independent function approximator with its own parameters, trying to maximize the future expected reward. The policy-based algorithm presented in this paper is used for learning the internal state/action mapping of a behavior. In this preliminary work, we demonstrate its feasibility with simulated experiments using the underwater robot GARBI in a target-reaching task.
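A minimal sketch of what direct policy search means in practice: a REINFORCE-style update of a Gaussian policy on a toy 1-D target-reaching problem. The task, parameter values and running baseline are assumptions for illustration, not the authors' system.

```python
import random

random.seed(0)
TARGET = 1.0      # hypothetical target position the policy should reach
theta  = 0.0      # policy parameter: mean of the Gaussian action distribution
sigma  = 0.2      # fixed exploration noise
alpha  = 0.02     # learning rate
baseline = 0.0    # running average reward, reduces gradient variance

for episode in range(2000):
    action = random.gauss(theta, sigma)   # sample from the stochastic policy
    reward = -(action - TARGET) ** 2      # closer to the target => higher reward
    advantage = reward - baseline
    baseline += 0.1 * (reward - baseline)
    # REINFORCE: d/d(theta) log pi(action) = (action - theta) / sigma^2
    theta += alpha * advantage * (action - theta) / sigma ** 2

print(f"learned mean action: {theta:.2f} (target {TARGET})")
```

Gradient ascent on the expected reward pulls the policy mean toward the target, which is the essence of approximating a policy directly rather than a value function.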
Abstract:
This paper describes the improvements achieved in our mosaicking system to assist unmanned underwater vehicle navigation. A major advance has been attained in the processing of images of the ocean floor when light absorption effects are evident. Due to the absorption of natural light, underwater vehicles often require artificial light sources attached to them to provide adequate illumination for processing underwater images. Unfortunately, these flashlights tend to illuminate the scene in a non-uniform fashion. In this paper, a technique to correct non-uniform lighting is proposed. The acquired frames are compensated through a point-by-point division of the image by an estimate of the illumination field. Then, the gray levels of the obtained image are remapped to enhance image contrast. Experiments with real images are presented.
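The correction described (point-by-point division by an estimated illumination field, then gray-level remapping) can be sketched as follows; estimating the field with a heavy box blur of the frame itself is a common choice and an assumption here, not necessarily the authors' estimator.

```python
import numpy as np

def box_blur(img, k):
    """Separable k x k box filter with edge replication (k odd)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    kern = np.ones(k) / k
    tmp = np.apply_along_axis(np.convolve, 1, p, kern, "valid")
    return np.apply_along_axis(np.convolve, 0, tmp, kern, "valid")

def correct_illumination(frame, k=15):
    frame = frame.astype(float)
    field = box_blur(frame, k) + 1e-6     # smooth estimate of the illumination field
    flat = frame / field                  # point-by-point division
    # remap gray levels to the full [0, 255] range to enhance contrast
    flat = (flat - flat.min()) / (flat.max() - flat.min() + 1e-12)
    return (255 * flat).astype(np.uint8)

# synthetic frame: a flat scene under a strong left-to-right lighting gradient
illum = np.linspace(0.2, 1.0, 64)
frame = np.tile(200 * illum, (64, 1))
out = correct_illumination(frame)
```

After the division, the lighting gradient is largely cancelled in the image interior, leaving a nearly uniform compensated frame.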
Abstract:
Photo-mosaicing techniques have become popular for seafloor mapping in various marine science applications. However, the common methods cannot accurately map regions with high relief and topographical variations. Ortho-mosaicing, borrowed from photogrammetry, is an alternative technique that takes into account the 3-D shape of the terrain. A serious bottleneck is the volume of elevation information that needs to be estimated from the video data, fused, and processed for the generation of a composite ortho-photo that covers a relatively large seafloor area. We present a framework that combines the advantages of dense depth-map and 3-D feature estimation techniques based on visual motion cues. The main goal is to identify and reconstruct certain key terrain feature points that adequately represent the surface with minimal complexity, in the form of piecewise planar patches. The proposed implementation uses local depth maps for feature selection, while tracking over several views enables 3-D reconstruction by bundle adjustment. Experimental results with synthetic and real data validate the effectiveness of the proposed approach.
Abstract:
Positioning a robot with respect to objects by using data provided by a camera is a well-known technique called visual servoing. In order to perform a task, the object must exhibit visual features that can be extracted from different points of view. Visual servoing is therefore object-dependent, as it relies on the object's appearance. Consequently, performing the positioning task is not possible in the presence of non-textured objects, or objects for which extracting visual features is too complex or too costly. This paper proposes a solution to tackle this limitation, which is inherent to current visual servoing techniques. Our proposal is based on the coded structured light approach as a reliable and fast way to solve the correspondence problem. In this case, a coded light pattern is projected, providing robust visual features independently of the object's appearance.
Abstract:
This paper presents the implementation details of a coded structured light system for rapid shape acquisition of unknown surfaces. Such techniques are based on projecting patterns onto a measuring surface and grabbing images of every projection with a camera. By analyzing the pattern deformations that appear in the images, 3D information about the surface can be calculated. The implemented technique projects a unique pattern, so it can be used to measure moving surfaces. The pattern is structured as a grid in which the colors of the slits are selected using a De Bruijn sequence. Moreover, since both axes of the pattern are coded, the cross points of the grid have two codewords (which permits them to be reconstructed very precisely), while pixels belonging to horizontal and vertical slits also have a codeword. Different sets of colors are used for horizontal and vertical slits, so the resulting pattern is invariant to rotation. Therefore, the alignment constraint between camera and projector assumed by many authors is not necessary.
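The slit-coloring scheme rests on the defining property of a De Bruijn sequence: every window of n consecutive symbols occurs exactly once, so observing n adjacent slit colors pins down the position in the pattern. A sketch using the standard FKM (Fredricksen-Kessler-Maiorana) construction, with a hypothetical 6-color palette and window length 3 (the paper's actual palette and window length are not given here):

```python
def de_bruijn(k, n):
    """De Bruijn sequence over k symbols in which every cyclic length-n
    window is unique (FKM algorithm over Lyndon words)."""
    a = [0] * (k * n)
    seq = []
    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)
    db(1, 1)
    return seq

colors = "RGBCMY"   # hypothetical 6-color slit palette
slit_colors = [colors[i] for i in de_bruijn(len(colors), 3)]
```

With 6 colors and windows of length 3, the sequence codes 6^3 = 216 slit positions, each identified by its color and the colors of its two neighbours.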
Abstract:
In a search for new sensor systems and new methods for underwater vehicle positioning based on visual observation, this paper presents a computer vision system based on coded light projection. 3D information is taken from an underwater scene. This information is used to test obstacle avoidance behaviour. In addition, the main ideas for achieving stabilisation of the vehicle in front of an object are presented
Abstract:
The in situ deposition of zinc oxide on gold nanoparticles in aqueous solution has been here successfully applied in the field of fingermark detection on various non-porous surfaces. In this article, we present the improvement of the multimetal deposition, an existing technique limited up to now to non-luminescent results, by obtaining luminescent fingermarks with very good contrast and details. This is seen as a major improvement in the field in terms of selectivity and sensitivity of detection, especially on black surfaces.
Abstract:
Objectives: The study objective was to derive reference pharmacokinetic curves of antiretroviral drugs (ART) based on available population pharmacokinetic (Pop-PK) studies that can be used to optimize therapeutic drug monitoring (TDM)-guided dosage adjustment.
Methods: A systematic search of Pop-PK studies of 8 ART in adults was performed in PubMed. To simulate reference PK curves, a summary of the PK parameters was obtained for each drug based on a meta-analysis approach. Most models used a one-compartment model, which was thus chosen as the reference model. Models using bi-exponential disposition were simplified to one compartment, since the first distribution phase was rapid and not determinant for the description of the terminal elimination phase, which is most relevant for this project. Different absorption models were standardized to first-order absorption processes.
Apparent clearance (CL), apparent volume of distribution of the terminal phase (Vz), the absorption rate constant (ka) and inter-individual variability were pooled into summary mean values, weighted by the number of plasma levels; intra-individual variability was weighted by the number of individuals in each study.
Simulations based on the summary PK parameters served to construct concentration PK percentiles (NONMEM®).
Concordance between individual and summary parameters was assessed graphically using forest plots. To test robustness, the difference between simulated curves based on published and summary parameters was calculated using efavirenz as a probe drug.
Results: CL was readily accessible from all studies. For studies with one compartment, Vz was the central volume of distribution; for two compartments, Vz was CL/λz. ka was used directly or derived from the mean absorption time (MAT) for more complicated absorption models, assuming MAT = 1/ka.
The value of CL for each drug was in excellent agreement throughout all Pop-PK models, suggesting that the minimal concentration derived from the summary models was adequately characterized. The comparison of the concentration vs. time profile for efavirenz between published and summary PK parameters revealed no more than a 20% difference. Although our approach appears adequate for estimation of the elimination phase, the simplification of the absorption phase might lead to a small bias shortly after drug intake.
Conclusions: Simulated reference percentile curves based on such an approach represent a useful tool for interpreting drug concentrations. This Pop-PK meta-analysis approach should be further validated and could be extended to elaborate a more sophisticated computerized tool for the Bayesian TDM of ART.
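The simulations rest on the standardized model named above: a one-compartment model with first-order absorption, parameterized by CL, Vz and ka (with ke = CL/Vz). A minimal sketch; the parameter values below are illustrative placeholders, not the pooled estimates, and percentile bands would additionally require sampling the reported inter-individual variability (e.g. lognormal variation around CL).

```python
import math

def concentration(t, dose, CL, Vz, ka, F=1.0):
    """One-compartment model with first-order absorption:
    plasma concentration at time t (h) after a single oral dose."""
    ke = CL / Vz   # elimination rate constant (1/h)
    return (F * dose * ka) / (Vz * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# illustrative values only (mg, L/h, L, 1/h) -- not the study's summary parameters
dose, CL, Vz, ka = 600.0, 9.0, 250.0, 0.6
curve = [concentration(t, dose, CL, Vz, ka) for t in range(25)]
```

The curve rises during the absorption phase, peaks, and then decays mono-exponentially at rate ke, which is the terminal elimination phase the pooling focused on.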
Abstract:
Background: Hirschsprung's disease (HSCR) is a congenital malformation of the enteric nervous system due to the arrest of migration of the neural crest cells that form the myenteric and submucosal plexuses. It leads to an aganglionic intestinal segment, which is permanently contracted, causing intestinal obstruction. Its incidence is approximately 1/5000 births, and males are more frequently affected, with a male/female ratio of 4/1. The diagnosis is in most cases made within the first year of life. Rectal biopsy of the mucosa and submucosa is the diagnostic gold standard.
Purpose: The aim of this study was to compare two surgical approaches for HSCR, the Duhamel technique and the transanal endorectal pull-through (TEPT), in terms of indications, duration of surgery, duration of hospital stay, postoperative treatment, complications, frequency of enterocolitis and functional outcomes.
Methods: Fifty-nine patients were treated for HSCR by one of the two methods in our department of pediatric surgery between 1994 and 2010. These patients were separated into two groups (I: Duhamel, II: TEPT), which were compared on the basis of medical records using an ANOVA test. The first group includes 43 patients and the second 16. Notably, twenty-four patients (about 41% of all patients) were referred from abroad (Western Africa). Continence was evaluated with the Krickenbeck score.
Results: Statistically, this study showed that operation duration, hospital stay, postoperative fasting and duration of postoperative antibiotics were significantly shorter (p < 0.05) in group II (TEPT), whereas age at operation and length of the aganglionic segment showed no significant difference between the two groups. The continence follow-up showed generally good results (Krickenbeck scores 1; 2.1; 3.1) in both groups, with a slight tendency to constipation in group I and soiling in group II.
Conclusion: We found two indications for the Duhamel method: referral from a country without careful postoperative surveillance and/or a previous colostomy. Even if the Duhamel technique tends to be replaced by the TEPT, it remains the best operative approach for some selected patients. The TEPT has also shown some advantages but must be followed carefully because of, among other points, the postoperative dilatations. Our postoperative standards, such as digital rectal examination and anal dilatations, seem to reduce the occurrence of complications such as rectal spur and anal/anastomotic stenosis, in the Duhamel method and the TEPT technique respectively.
Abstract:
Cancer is a leading cause of morbidity and mortality in Western countries (as an example, colorectal cancer accounts for about 300,000 new cases and 200,000 deaths each year in Europe and in the USA). Although many patients with cancer have complete macroscopic clearance of their disease after resection, radiotherapy and/or chemotherapy, many of them develop fatal recurrence. Vaccination with immunogenic peptide tumor antigens has shown encouraging progress in the last decade; immunotherapy might therefore constitute a fourth therapeutic option in the future. We dissect here and critically evaluate the numerous steps of reverse immunology, a predictive procedure to identify antigenic peptides from the sequence of a gene of interest. Bioinformatic algorithms were applied to mine sequence databases for tumor-specific transcripts. A quality assessment of publicly available sequence databanks allowed us to define the strengths and weaknesses of bioinformatics-based prediction of colon cancer-specific alternative splicing: new splice variants could be identified; however, cancer-restricted expression could not be significantly predicted. Other sources of target transcripts, such as cancer-testis genes or reportedly overexpressed transcripts, were quantitatively investigated by polymerase chain reaction. Based on the relative expression of a defined set of housekeeping genes in colon cancer tissues, we characterized a precise procedure for accurate normalization and determined a threshold for the definition of significant overexpression of genes in cancers versus normal tissues. Further steps of reverse immunology were applied to a splice variant of the Melan-A gene. Since it is known that the C-termini of antigenic peptides are directly produced by the proteasome, longer precursor and overlapping peptides encoded by the target sequence were synthesized chemically and digested in vitro with purified proteasome. The resulting fragments were identified by mass spectrometry to detect cleavage sites. Using this information and the available anchor motifs for defined HLA class I molecules, putative antigenic peptides could be predicted. Their relative affinity for HLA molecules was confirmed experimentally with functional competitive binding assays, and they were used to search patients' peripheral blood lymphocytes for the presence of specific cytolytic T lymphocytes (CTL). CTL clones specific for a splice variant of Melan-A could be isolated; although they recognized peptide-pulsed cells, they failed to lyse melanoma cells in functional assays of antigen recognition. In the conclusion, we discuss the advantages and bottlenecks of reverse immunology and compare the technical aspects of this approach with the more classical procedure of direct immunology, a technique introduced by Boon and colleagues more than 10 years ago to successfully clone tumor antigens.
Abstract:
BACKGROUND: Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS: A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the two latter also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and energy-based metrics were compared. RESULTS: Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: (i) a linear transformation of the PLS- or PCA-reduced data, (ii) a feature reduction technique, and (iii) a classifier (with Euclidean, Mahalanobis or energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when an NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS: All the proposed methods turned out to be valid solutions for the presented problem. One advantage is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes this rate more stable. Another advantage is its generalization ability, demonstrated by experiments on two image modalities (SPECT and PET).
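The NMSE features named above can be illustrated with a common definition of the normalised mean square error between a subject ROI and a template ROI: the squared voxel-wise difference normalised by the template's energy. This definition is an assumption for illustration, not quoted from the paper.

```python
import numpy as np

def nmse(roi, template):
    """Normalised mean square error between a subject ROI and a template ROI
    (common definition, assumed here): ||roi - template||^2 / ||template||^2."""
    roi = np.asarray(roi, dtype=float)
    template = np.asarray(template, dtype=float)
    return float(np.sum((roi - template) ** 2) / np.sum(template ** 2))
```

Under this definition, an ROI identical to the template scores 0, and values grow with the deviation, which is what makes NMSE usable as a per-region discriminative feature.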
Abstract:
African Americans are disproportionately affected by type 2 diabetes (T2DM), yet few studies have examined T2DM using genome-wide association approaches in this population. The aim of this study was to identify genes associated with T2DM in the African American population. We performed a Genome Wide Association Study (GWAS) using the Affymetrix 6.0 array in 965 African American cases with T2DM and end-stage renal disease (T2DM-ESRD) and 1029 population-based controls. The most significant SNPs (n = 550 independent loci) were genotyped in a replication cohort, and 122 SNPs (n = 98 independent loci) were further tested by genotyping three additional validation cohorts, followed by meta-analysis in all five cohorts totaling 3,132 cases and 3,317 controls. Twelve SNPs had evidence of association in the GWAS (P < 0.0071), were directionally consistent in the replication cohort and were associated with T2DM in subjects without nephropathy (P < 0.05). Meta-analysis in all cases and controls revealed a single SNP reaching genome-wide significance (P < 2.5×10⁻⁸). SNP rs7560163 (P = 7.0×10⁻⁹, OR (95% CI) = 0.75 (0.67-0.84)) is located intergenically between RND3 and RBM43. Four additional loci (rs7542900, rs4659485, rs2722769 and rs7107217) were associated with T2DM (P < 0.05) and reached more nominal levels of significance (P < 2.5×10⁻⁵) in the overall analysis, and may represent novel loci that contribute to T2DM. We have identified novel T2DM-susceptibility variants in the African American population. Notably, T2DM risk was associated with the major allele, which implies an interesting genetic architecture in this population. These results suggest that multiple loci underlie T2DM susceptibility in the African American population and that these loci are distinct from those identified in other ethnic populations.
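The five-cohort meta-analysis step can be illustrated with a standard fixed-effect inverse-variance pooling of per-cohort log odds ratios; the weighting scheme and all numbers below are assumptions for illustration, not the study's actual method or data.

```python
import math

def fixed_effect_meta(log_ors, ses):
    """Fixed-effect inverse-variance meta-analysis of per-cohort log odds ratios.
    Each estimate is weighted by 1/SE^2; returns the pooled log OR and its SE."""
    w = [1.0 / se ** 2 for se in ses]
    pooled = sum(wi * b for wi, b in zip(w, log_ors)) / sum(w)
    pooled_se = math.sqrt(1.0 / sum(w))
    return pooled, pooled_se

# illustrative per-cohort estimates for a protective allele (OR < 1)
log_ors = [math.log(0.74), math.log(0.78), math.log(0.70)]
ses = [0.08, 0.10, 0.12]
b, se = fixed_effect_meta(log_ors, ses)
pooled_or = math.exp(b)
```

Pooling shrinks the standard error below that of any single cohort, which is how a SNP that is only suggestive in each cohort can reach genome-wide significance overall.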
Abstract:
The Global Program for the Elimination of Lymphatic Filariasis (GPELF) aims to eliminate this disease by the year 2020; however, the development of more specific and sensitive tests is important for the success of the GPELF. The present study aimed to standardise polymerase chain reaction (PCR)-based systems for the diagnosis of filariasis in serum and urine. Twenty paired urine and serum samples from individuals already known to be positive for Wuchereria bancrofti were collected during the day. Conventional PCR and semi-nested PCR assays were optimised. The detection limit of the technique for purified W. bancrofti DNA extracted from adult worms was 10 fg for the internal system (WbF/Wb2) and 0.1 fg for the semi-nested PCR. The specificity of the primers was confirmed experimentally by amplification of 1 ng of purified genomic DNA from other species of parasites. Evaluation of the paired urine and serum samples by the semi-nested PCR technique indicated that only two of the 20 tested individuals were positive, whereas the simple internal PCR system (WbF/Wb2), which showed highly promising performance, revealed that all the patients were positive in both sample types. This study successfully demonstrated the possibility of using the PCR technique on urine for the diagnosis of W. bancrofti infection.