820 results for Search-based algorithms
Abstract:
The Internet is increasingly used as a source of information on health issues and is probably a major driver of patient empowerment. This process is limited, however, by the frequently poor quality of web-based health information designed for consumers. Wider diffusion of the criteria that define the quality of website content, and of useful methods for finding such information, could be particularly valuable to patients and their relatives. A brief, six-item DISCERN version, characterized by a high specificity for detecting websites with good or very good content quality, was recently developed. This tool could help patients identify high-quality information on the web and may strengthen the empowerment process initiated by the development of the health-related web.
Abstract:
BACKGROUND: The Internet is increasingly used as a source of information for mental health issues. The burden of obsessive-compulsive disorder (OCD) may lead persons with diagnosed or undiagnosed OCD, and their relatives, to search for good-quality information on the Web. This study aimed to evaluate the quality of web-based information on English-language sites dealing with OCD and to compare the quality of websites found through a general and a medically specialized search engine. METHODS: Keywords related to OCD were entered into Google and OmniMedicalSearch. Websites were assessed on the basis of accountability, interactivity, readability, and content quality. The "Health on the Net" (HON) quality label and the Brief DISCERN scale score were used as possible content quality indicators. Of the 235 links identified, 53 websites were analyzed. RESULTS: The content quality of the OCD websites examined was relatively good. The use of a specialized search engine did not offer an advantage in finding websites with better content quality. A score ≥16 on the Brief DISCERN scale is associated with better content quality. CONCLUSION: This study shows the acceptability of the content quality of OCD websites. There is no advantage in searching for information with a specialized search engine rather than a general one. PRACTICAL IMPLICATIONS: The Internet offers a number of high-quality OCD websites. It remains critical, however, to have a provider-patient conversation about the information found on the Web.
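To make the cutoff concrete, here is a minimal sketch of how a Brief DISCERN total might be computed and classified. It assumes the six items are each rated on the usual DISCERN 1-to-5 scale (so totals range from 6 to 30); only the ≥16 cutoff itself comes from the study.

```python
# Minimal sketch of the Brief DISCERN cutoff described above.
# Assumption: six items, each rated 1-5 as in the full DISCERN instrument;
# the >=16 threshold for good content quality is taken from the abstract.

def brief_discern_quality(item_scores: list[int]) -> str:
    """Classify website content quality from six Brief DISCERN item scores."""
    if len(item_scores) != 6 or not all(1 <= s <= 5 for s in item_scores):
        raise ValueError("expected six item scores, each between 1 and 5")
    total = sum(item_scores)
    return "good content quality" if total >= 16 else "lower content quality"

print(brief_discern_quality([3, 2, 4, 3, 2, 3]))  # total 17 -> good content quality
```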
Abstract:
Background: Respiratory care is universally recognised as useful, but its indications and practice vary markedly. In order to improve the appropriateness of respiratory care in our hospital, we developed evidence-based local guidelines in a collaborative effort involving physiotherapists, physicians, and health services researchers. Methods: Recommendations were developed using the standardised RAND appropriateness method. A literature search was performed for the period between 1995 and 2008, based on terms associated with guidelines and with respiratory care. Publications were assessed according to the Oxford classification of quality of evidence. A working group prepared proposals for recommendations, which were then independently rated by a multidisciplinary expert panel. All recommendations were then discussed jointly, and indications for procedures were confidentially rated a second time by the experts. Each indication for respiratory care was classified as appropriate, uncertain, or inappropriate, based on the panel median rating and the degree of intra-panel agreement. Results: Recommendations were formulated for the following procedures: non-invasive ventilation, continuous positive airway pressure, intermittent positive pressure breathing, intrapulmonary percussive ventilation, mechanical insufflation-exsufflation, incentive spirometry, positive expiratory pressure, nasotracheal suctioning, and non-instrumental airway clearance techniques. Each recommendation referred to a particular medical condition and was assigned to a hierarchical category based on the quality of the supporting evidence from the literature and on the consensus of experts. Conclusion: Despite marked heterogeneity of the scientific evidence, the method used allowed us to develop commonly agreed local guidelines for respiratory care. In addition, this work fostered a closer relationship between physiotherapists and physicians in our institution.
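For readers unfamiliar with the RAND appropriateness method, the sketch below illustrates the kind of median-and-agreement classification the abstract describes. The 1-to-9 rating scale is standard for the method; the disagreement rule used here (at least a third of panelists in each extreme tertile) is an assumption, since the abstract does not state the authors' exact criterion.

```python
# Hedged sketch of RAND-style appropriateness classification.
# Assumptions: standard 1-9 scale; a simple disagreement rule (at least a
# third of panelists rating 1-3 and a third rating 7-9), which may differ
# from the rule actually applied by the panel.
from statistics import median

def classify_indication(ratings: list[int]) -> str:
    """Classify one indication from the panel's 1-9 appropriateness ratings."""
    med = median(ratings)
    low = sum(1 for r in ratings if r <= 3)    # panelists in the 1-3 tertile
    high = sum(1 for r in ratings if r >= 7)   # panelists in the 7-9 tertile
    disagreement = min(low, high) >= len(ratings) / 3
    if disagreement or 4 <= med <= 6:
        return "uncertain"
    return "appropriate" if med >= 7 else "inappropriate"

print(classify_indication([8, 7, 9, 8, 7, 6, 8, 9, 7]))  # -> appropriate
```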
Abstract:
Individuals need to adapt to their local environment in order to survive. When selection pressures differ between local populations, polymorphism can evolve. Colour polymorphism is one of the most obvious polymorphisms, since it is readily observable. Different sources of colouration exist, but melanin-based colouration is one of the most common in birds. The melanocortin system produces this colouration, and because it has pleiotropic effects on behavioural and physiological traits, it is a good candidate mechanism for explaining the maintenance of colour polymorphism. In this thesis I studied three raptors that all display melanin-based colouration: barn owls (Tyto alba), tawny owls (Strix aluco), and Eurasian kestrels (Falco tinnunculus). The main question was whether there is a relationship between melanin-based colouration and individual behavioural differences. The underlying hypothesis is that colour could be a signal of certain adaptive traits. Our goal was to find evolutionary explanations for the persistence of colour polymorphism. I found that nestling kestrels and barn owls differ in anti-predatory behaviour with respect to their melanic colouration (chapters 1 and 2). Darker individuals react less to human handling, but in kestrels aggression and colouration are related in the opposite direction to that found in barn owls. More reddish barn owls travel greater distances in natal dispersal, and this behaviour is repeatable between parents and same-sex offspring (chapter 3). Dark reddish tawny owls defend their nests more intensely against intruders and appear to suffer less from nest predation (chapter 4). Finally, I show that polymorphism in the melanocortin 1 receptor gene (MC1R), which is strongly correlated with reddish colouration in the barn owl, is related to natal dispersal distance, providing a first indication of a genetic basis for the relation between this behaviour and colouration (chapter 5). My results demonstrate a clear link between melanin-based colouration and animal personality traits. I demonstrated this relation in three different species, which suggests that a general underlying mechanism is most likely responsible. Different predation pressures might have shaped the reactions to predation, as might differences in sex-related colouration: male-like and female-like colouration might signal more or less aggressive behaviour. Fluctuating environmental conditions might cause different individual strategies to yield equal reproductive success. The melanocortin system, with its pleiotropic effects, might be the underlying mechanism, as suggested by the results for the genetic polymorphism, the similar findings in these three species, and the similar relations reported in other species. This thesis demonstrates that colouration and individual differences are correlated, and it provides a first glimpse of an underlying system. We can now conduct a more directed search for underlying mechanisms and evolutionary explanations using quantitative genetic methods.
Abstract:
One of the unresolved questions of modern physics is the nature of Dark Matter. Strong experimental evidence suggests that this elusive component makes up a significant share of the energy budget of the Universe, without, however, providing conclusive information about its nature. The most plausible scenario is that of weakly interacting massive particles (WIMPs), a large class of non-baryonic Dark Matter candidates with a mass typically between a few tens of GeV and a few TeV, and a cross section of the order of weak interactions. The search for Dark Matter particles with very-high-energy gamma-ray Cherenkov telescopes is based on the premise that WIMPs can self-annihilate, producing detectable species such as photons. These photons are very energetic and, being undeflected by the Universe's magnetic fields, can be traced straight back to their source. The downside of the approach is the large amount of background radiation from conventional astrophysical sources, which usually hides clear signals of Dark Matter particle interactions. That is why a good choice of observational candidates is crucial in the search for Dark Matter. With MAGIC (Major Atmospheric Gamma-ray Imaging Cherenkov Telescopes), a two-telescope ground-based system located in La Palma, Canary Islands, we choose objects such as dwarf spheroidal satellite galaxies of the Milky Way and galaxy clusters for our search. Our idea is to increase the chances of WIMP detection by pointing at objects that are relatively close, rich in Dark Matter, and with as little stellar pollution as possible. At the moment, several observation projects are ongoing and analyses are being performed.
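For reference, the expected flux in such indirect searches is usually factorized into a particle-physics term and an astrophysical term (the J-factor). This is the standard parametrization used in the field, not a formula quoted from the abstract:

```latex
\frac{d\Phi_\gamma}{dE}
  = \underbrace{\frac{1}{4\pi}\,
      \frac{\langle\sigma v\rangle}{2\,m_{\chi}^{2}}\,
      \frac{dN_\gamma}{dE}}_{\text{particle physics}}
    \times
    \underbrace{\int_{\Delta\Omega}\!\int_{\text{l.o.s.}}
      \rho_{\chi}^{2}(l,\Omega)\,dl\,d\Omega}_{\text{astrophysics: } J(\Delta\Omega)}
```

Here ⟨σv⟩ is the velocity-averaged annihilation cross section, m_χ the WIMP mass, dN_γ/dE the photon spectrum per annihilation, and ρ_χ the Dark Matter density integrated along the line of sight; targets with a large J-factor and little astrophysical background, such as the dwarf spheroidals mentioned above, maximize sensitivity.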
Abstract:
In this paper, we define a new scheme to develop and evaluate protection strategies for building reliable GMPLS networks. It is based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure, in terms of packet loss and recovery time. Having mathematically formulated these components, we present experimental results that demonstrate the benefits of the NPD when used to enhance current QoS routing algorithms in order to offer a certain degree of protection.
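The abstract does not give the mathematical formulation of the NPD, so the following sketch is purely illustrative: it assumes the FSD and the FID components are normalized to [0, 1] and combined as a weighted sum, with the FID itself an equal-weight mix of packet loss and recovery time. All weights are placeholders.

```python
# Illustrative only: the paper's exact NPD formulation is not given in the
# abstract. Normalization to [0, 1] and the weighting below are assumptions.
from dataclasses import dataclass

@dataclass
class PathMetrics:
    failure_prob: float    # FSD: a priori failure probability of the path
    packet_loss: float     # FID component: expected packet loss on failure
    recovery_time: float   # FID component: normalized recovery time on failure

def network_protection_degree(m: PathMetrics, alpha: float = 0.5) -> float:
    """Combine FSD and FID into one score (higher = better protected)."""
    fid = 0.5 * m.packet_loss + 0.5 * m.recovery_time  # assumed equal weights
    risk = alpha * m.failure_prob + (1 - alpha) * fid
    return 1.0 - risk

print(network_protection_degree(PathMetrics(0.02, 0.10, 0.05)))  # 0.9525
```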
Abstract:
IP-based networks still do not offer the degree of reliability required by new multimedia services, and achieving such reliability will be crucial to the success or failure of the next Internet generation. Most existing schemes for QoS routing do not take into consideration parameters concerning the quality of protection, such as packet loss or restoration time. In this paper, we define a new paradigm for developing protection strategies for building reliable MPLS networks, based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure. Having mathematically formulated these components, we point out the most relevant ones. Experimental results demonstrate the benefits of the NPD when used to enhance current QoS routing algorithms to offer a certain degree of protection.
Abstract:
This paper presents a hybrid behavior-based scheme using reinforcement learning for the high-level control of autonomous underwater vehicles (AUVs). The two main features of the presented approach are hybrid behavior coordination and semi-online neural Q_learning (SONQL). Hybrid behavior coordination takes advantage of the robustness and modularity of the competitive approach as well as the efficient trajectories of the cooperative approach. SONQL, a new continuous version of the Q_learning algorithm based on a multilayer neural network, is used to learn the behavior state/action mapping online. Experimental results show the feasibility of the presented approach for AUVs.
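As a rough illustration of the learning rule that SONQL extends, here is a one-step Q-learning update with a linear function approximator. The actual SONQL algorithm replaces the linear model with a multilayer neural network and adds an online database of learning samples, both omitted in this sketch; the hyperparameter values are placeholders.

```python
# One-step Q-learning with a linear function approximator: a simplified
# stand-in for SONQL's multilayer neural network. Hyperparameters are
# placeholders, not values from the paper.
import numpy as np

alpha, gamma = 0.1, 0.95                 # learning rate, discount factor
n_features, n_actions = 4, 3
W = np.zeros((n_actions, n_features))    # one linear Q-function per action

def q_values(state: np.ndarray) -> np.ndarray:
    return W @ state

def q_update(state, action, reward, next_state) -> None:
    """Move Q(s, a) toward the target r + gamma * max_a' Q(s', a')."""
    target = reward + gamma * np.max(q_values(next_state))
    td_error = target - q_values(state)[action]
    W[action] += alpha * td_error * state  # gradient step for a linear model

s, s2 = np.random.rand(n_features), np.random.rand(n_features)
q_update(s, action=1, reward=0.5, next_state=s2)
```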
Abstract:
This paper presents an automatic vision-based system for UUV station keeping. The vehicle is equipped with a down-looking camera, which provides images of the sea floor. The station keeping system is based on a feature-based motion detection algorithm, which exploits standard correlation and explicit textural analysis to solve the correspondence problem. A visual map of the area surveyed by the vehicle is constructed to increase the flexibility of the system, allowing the vehicle to position itself when it has lost the reference image. The testing platform is the URIS underwater vehicle. Experimental results demonstrating the behavior of the system in a real environment are presented.
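The correlation step of such a system might look like the sketch below, which estimates the displacement of a reference patch by normalized cross-correlation (here via OpenCV's template matching). The paper's explicit textural analysis for rejecting unreliable matches is omitted, and the function is illustrative rather than the authors' implementation.

```python
# Generic correlation-based motion estimation: match a patch from the
# reference frame against the current frame. The textural analysis used in
# the paper to reject bad matches is omitted here.
import cv2
import numpy as np

def estimate_shift(prev_frame: np.ndarray, curr_frame: np.ndarray,
                   x: int, y: int, size: int = 32) -> tuple[int, int]:
    """Return the (dx, dy) displacement of the patch at (x, y) via NCC."""
    patch = prev_frame[y:y + size, x:x + size]
    result = cv2.matchTemplate(curr_frame, patch, cv2.TM_CCOEFF_NORMED)
    _, _, _, (best_x, best_y) = cv2.minMaxLoc(result)  # peak of the NCC map
    return best_x - x, best_y - y
```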
Abstract:
When unmanned underwater vehicles (UUVs) perform missions near the ocean floor, optical sensors can be used to improve local navigation. Video mosaics allow the images acquired by the vehicle to be processed efficiently and also yield position estimates. In this paper we discuss the role of lens distortions in this context, proving that degenerate mosaics have their origin not only in the selected motion model or in registration errors, but also in the cumulative effect of radial distortion residuals. Additionally, we present results on the accuracy of different feature-based approaches to the self-correction of lens distortions, which may guide the choice of appropriate correction techniques.
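The radial distortion residuals discussed here are what remains after fitting the usual polynomial radial model, shown below in its common two-coefficient form (the abstract does not state which model the paper uses):

```latex
x_d = x_u\,(1 + k_1 r^{2} + k_2 r^{4}), \qquad
y_d = y_u\,(1 + k_1 r^{2} + k_2 r^{4}), \qquad
r^{2} = x_u^{2} + y_u^{2}
```

where (x_u, y_u) are the ideal (undistorted) normalized coordinates, (x_d, y_d) the observed ones, and k_1, k_2 the radial coefficients; even small errors in k_1 and k_2 accumulate over the many frame-to-frame registrations of a mosaic.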
Abstract:
In a search for new sensor systems and new methods for underwater vehicle positioning based on visual observation, this paper presents a computer vision system based on coded light projection. 3D information is extracted from an underwater scene and used to test obstacle avoidance behaviour. In addition, the main ideas for achieving stabilisation of the vehicle in front of an object are presented.
Abstract:
This paper presents a study of connection availability in GMPLS over optical transport networks (OTN), taking into account different network topologies. Two basic path protection schemes are considered and compared with the unprotected case. The selected topologies are heterogeneous in geographic coverage, network diameter, link lengths, and average node degree. Connection availability is also computed using the reliability data of physical components and a well-known network availability model. The results show several correspondences between suitable path protection algorithms and network topology characteristics.
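Under the common assumptions of independent failures and disjoint working/protection paths, such availability computations reduce to series and parallel combinations of component availabilities. This is the textbook model, not necessarily the exact one used in the paper:

```latex
A_{\text{path}} = \prod_{i=1}^{n} A_i,
\qquad
A_{\text{protected}} \approx 1 - \bigl(1 - A_{w}\bigr)\bigl(1 - A_{p}\bigr)
```

where the A_i are the availabilities of the links and nodes traversed by an unprotected path, and A_w, A_p those of the working and protection paths.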
Abstract:
Objectives: The study objective was to derive reference pharmacokinetic (PK) curves of antiretroviral drugs (ART) based on available population pharmacokinetic (Pop-PK) studies, for use in optimizing therapeutic drug monitoring (TDM)-guided dosage adjustment. Methods: A systematic search of Pop-PK studies of 8 ART in adults was performed in PubMed. To simulate reference PK curves, a summary of the PK parameters was obtained for each drug using a meta-analysis approach. Most studies used a one-compartment model, which was therefore chosen as the reference model. Models with bi-exponential disposition were simplified to one compartment, since the first distribution phase is rapid and not determinant for the description of the terminal elimination phase, which is most relevant for this project. Different absorption models were standardized to first-order absorption processes. Apparent clearance (CL), apparent volume of distribution of the terminal phase (Vz), the absorption rate constant (ka), and inter-individual variability were pooled into summary mean values, weighted by the number of plasma levels; intra-individual variability was weighted by the number of individuals in each study. Simulations based on the summary PK parameters served to construct concentration percentiles (NONMEM®). Concordance between individual and summary parameters was assessed graphically using forest plots. To test robustness, the difference between simulated curves based on published and summary parameters was calculated using efavirenz as a probe drug. Results: CL was readily accessible from all studies. For one-compartment studies, Vz was the central volume of distribution; for two-compartment studies, Vz was CL/λz. ka was used directly or, for more complicated absorption models, derived from the mean absorption time (MAT), assuming MAT = 1/ka. The value of CL for each drug was in excellent agreement across all Pop-PK models, suggesting that the minimal concentration derived from the summary models was adequately characterized. The comparison of the concentration vs. time profile for efavirenz between published and summary PK parameters revealed no more than a 20% difference. Although our approach appears adequate for estimating the elimination phase, the simplification of the absorption phase might lead to a small bias shortly after drug intake. Conclusions: Simulated reference percentile curves based on such an approach represent a useful tool for interpreting drug concentrations. This Pop-PK meta-analysis approach should be further validated and could be extended into a more sophisticated computerized tool for the Bayesian TDM of ART.
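The reference model described above corresponds to the standard one-compartment model with first-order absorption and elimination. Writing k_e for the elimination rate constant, F for bioavailability, and D for the dose (notation assumed here, not taken from the abstract), the simulated concentration profile takes the usual form:

```latex
C(t) = \frac{F\,D\,k_a}{V_z\,(k_a - k_e)}
       \left(e^{-k_e t} - e^{-k_a t}\right),
\qquad
k_e = \frac{CL}{V_z},
\quad
k_a = \frac{1}{\mathrm{MAT}},
\quad
V_z = \frac{CL}{\lambda_z}
```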
Abstract:
This letter presents a comparison between three Fourier-based motion compensation (MoCo) algorithms for airborne synthetic aperture radar (SAR) systems. These algorithms circumvent the limitations of conventional MoCo, namely the assumption of a reference height and the beam-center approximation. All these approaches rely on the inherent time–frequency relation in SAR systems but exploit it differently, with the consequent differences in accuracy and computational burden. After a brief overview of the three approaches, the performance of each algorithm is analyzed with respect to azimuthal topography accommodation, angle accommodation, and the maximum frequency of track deviations with which the algorithm can cope. Also, an analysis of the computational complexity is presented. Quantitative results are shown using real data acquired by the Experimental SAR system of the German Aerospace Center (DLR).
Abstract:
Cancer is a leading cause of morbidity and mortality in Western countries (as an example, colorectal cancer accounts for about 300,000 new cases and 200,000 deaths each year in Europe and in the USA). Although many patients with cancer achieve complete macroscopic clearance of their disease after resection, radiotherapy, and/or chemotherapy, many of them develop fatal recurrences. Vaccination with immunogenic peptide tumor antigens has shown encouraging progress in the last decade; immunotherapy might therefore constitute a fourth therapeutic option in the future. We dissect here and critically evaluate the numerous steps of reverse immunology, a predictive procedure for identifying antigenic peptides from the sequence of a gene of interest. Bioinformatic algorithms were applied to mine sequence databases for tumor-specific transcripts. A quality assessment of publicly available sequence databanks allowed us to define the strengths and weaknesses of bioinformatics-based prediction of colon cancer-specific alternative splicing: new splice variants could be identified, but cancer-restricted expression could not be significantly predicted. Other sources of target transcripts, such as cancer-testis genes or transcripts reported as overexpressed, were quantitatively investigated by polymerase chain reaction. Based on the relative expression of a defined set of housekeeping genes in colon cancer tissues, we characterized a precise procedure for accurate normalization and determined a threshold for the definition of significant overexpression of genes in cancers versus normal tissues. Further steps of reverse immunology were applied to a splice variant of the Melan-A gene. Since it is known that the C-termini of antigenic peptides are produced directly by the proteasome, longer precursor and overlapping peptides encoded by the target sequence were synthesized chemically and digested in vitro with purified proteasome. The resulting fragments were identified by mass spectrometry to detect cleavage sites. Using this information, and based on the available anchor motifs for defined HLA class I molecules, putative antigenic peptides could be predicted. Their relative affinity for HLA molecules was confirmed experimentally with functional competitive binding assays, and they were used to search patients' peripheral blood lymphocytes for the presence of specific cytolytic T lymphocytes (CTL). CTL clones specific for a splice variant of Melan-A could be isolated; although they recognized peptide-pulsed cells, they failed to lyse melanoma cells in functional assays of antigen recognition. In conclusion, we discuss the advantages and bottlenecks of reverse immunology and compare the technical aspects of this approach with the more classical procedure of direct immunology, a technique introduced by Boon and colleagues more than 10 years ago to successfully clone tumor antigens.
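The normalization step described above can be illustrated with a short sketch: the expression of a target gene is scaled by the geometric mean of a set of housekeeping genes, and a tumor-to-normal ratio above a threshold flags significant overexpression. The housekeeping values and the threshold below are placeholders, not those determined in the thesis.

```python
# Illustrative sketch of housekeeping-gene normalization and overexpression
# calling. The housekeeping set and the threshold are placeholders; the
# thesis derives its own normalization procedure and cutoff.
from statistics import geometric_mean

def normalized_expression(target: float, housekeeping: list[float]) -> float:
    """Scale target-gene expression by the geometric mean of housekeeping genes."""
    return target / geometric_mean(housekeeping)

def is_overexpressed(tumor: float, tumor_hk: list[float],
                     normal: float, normal_hk: list[float],
                     threshold: float = 5.0) -> bool:
    """Flag overexpression when the normalized tumor/normal ratio exceeds threshold."""
    ratio = (normalized_expression(tumor, tumor_hk)
             / normalized_expression(normal, normal_hk))
    return ratio > threshold  # placeholder cutoff, not the study's value

print(is_overexpressed(120.0, [10.0, 12.0], 3.0, [11.0, 10.0]))  # -> True
```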