886 results for Search-based technique
Abstract:
Malaria is responsible for more deaths around the world than any other parasitic disease. Due to the emergence of strains that are resistant to the current chemotherapeutic antimalarial arsenal, the search for new antimalarial drugs remains urgent, though hampered by a lack of knowledge regarding the molecular mechanisms of artemisinin resistance. Semisynthetic compounds derived from diterpenes from the medicinal plant Wedelia paludosa were tested in silico against the Plasmodium falciparum Ca2+-ATPase, PfATP6. This protein was constructed by comparative modelling using the three-dimensional structure of a homologous protein, 1IWO, as a scaffold. Compound 21 showed the best docking scores, indicating a better interaction with PfATP6 than that of thapsigargin, the natural inhibitor. Inhibition of PfATP6 by diterpene compounds could promote a change in calcium homeostasis, leading to parasite death. These data suggest PfATP6 as a potential target for the antimalarial ent-kaurane diterpenes.
Abstract:
BACKGROUND: The Internet is increasingly used as a source of information for mental health issues. The burden of obsessive compulsive disorder (OCD) may lead persons with diagnosed or undiagnosed OCD, and their relatives, to search for good quality information on the Web. This study aimed to evaluate the quality of Web-based information on English-language sites dealing with OCD and to compare the quality of websites found through a general and a medically specialized search engine. METHODS: Keywords related to OCD were entered into Google and OmniMedicalSearch. Websites were assessed on the basis of accountability, interactivity, readability, and content quality. The "Health on the Net" (HON) quality label and the Brief DISCERN scale score were used as possible content quality indicators. Of the 235 links identified, 53 websites were analyzed. RESULTS: The content quality of the OCD websites examined was relatively good. The use of a specialized search engine did not offer an advantage in finding websites with better content quality. A score ≥16 on the Brief DISCERN scale is associated with better content quality. CONCLUSION: This study shows the acceptability of the content quality of OCD websites. There is no advantage in searching for information with a specialized search engine rather than a general one. Practical implications: The Internet offers a number of high quality OCD websites. It remains critical, however, to have a provider-patient talk about the information found on the Web.
Abstract:
Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows a detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently of the origin of the images and of the type of surface change, correct processing of such data requires flexible, robust and possibly nonlinear methods to correctly account for the complex statistical relationships characterizing the pixels of the images. This thesis deals with the development and the application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision has been considered. In a first application, a nonlinear classifier has been applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel has been injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what they believe has or has not changed. In the second part, a completely automatic and unsupervised method for precise binary detection of changes has been proposed. The technique allows a very accurate mapping without any user intervention, which is particularly useful when readiness and reaction times of the system are a crucial constraint. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches are studied to transform the pair of bi-temporal images and reduce the differences that are unrelated to changes in land cover. The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the doors to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
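As a point of reference for the unsupervised binary change detection discussed in this abstract, the sketch below implements a generic image-differencing baseline with a simple global threshold. It is an illustration only, not the specific classifiers or distribution-alignment methods developed in the thesis; the array shapes and the threshold rule are assumptions.

    import numpy as np

    def change_map(img_t1, img_t2, k=2.0):
        """Generic unsupervised change detection by image differencing.

        img_t1, img_t2 : co-registered (rows, cols, bands) arrays (assumption).
        k : number of standard deviations above the mean change magnitude
            used as the decision threshold (simple heuristic, not the thesis method).
        Returns a boolean map: True where a change is declared.
        """
        diff = img_t2.astype(float) - img_t1.astype(float)
        magnitude = np.sqrt((diff ** 2).sum(axis=-1))      # per-pixel change magnitude
        threshold = magnitude.mean() + k * magnitude.std()
        return magnitude > threshold

    # Toy bi-temporal data with a simulated changed patch (illustration only).
    rng = np.random.default_rng(0)
    t1 = rng.normal(size=(100, 100, 4))
    t2 = t1 + rng.normal(scale=0.1, size=t1.shape)
    t2[40:60, 40:60, :] += 3.0
    print(change_map(t1, t2).sum(), "pixels flagged as changed")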
Abstract:
Earthquakes occurring around the world each year cause thousands of deaths, millions of dollars in damage to infrastructure, and incalculable human suffering. In recent years, satellite technology has been a significant boon to response efforts following an earthquake and its after-effects by providing mobile communications between response teams and remote sensing of damaged areas to disaster management organizations. In 2007, an international team of students and professionals assembled during the International Space University's Summer Session Program in Beijing, China to examine how satellite and ground-based technology could be better integrated to provide an optimised response in the event of an earthquake. The resulting Technology Resources for Earthquake MOnitoring and Response (TREMOR) proposal describes an integrative prototype response system that will implement mobile satellite communication hubs providing telephone and data links between response teams, onsite telemedicine consultation for emergency first-responders, and satellite navigation systems that will locate and track emergency vehicles and guide search-and-rescue crews. A prototype earthquake simulation system is also proposed, integrating historical data, earthquake precursor data, and local geomatics and infrastructure information to predict the damage that could occur in the event of an earthquake. The backbone of these proposals is a comprehensive education and training program to help individuals, communities and governments prepare in advance. The TREMOR team recommends the coordination of these efforts through a centralised, non-governmental organization.
Abstract:
Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p. 37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, or non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful to gain insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement of the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly through containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that makes it possible to learn about the relative importance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative on reservation wage movements over the unemployment spell.
The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies. The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency on top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches, such as the provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
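The difference-in-differences design used in the first chapter can be illustrated with a minimal numerical sketch. The code below computes the canonical 2x2 estimate from group-period means on made-up earnings data; the numbers and variable names are assumptions for illustration, not results from the thesis.

    import numpy as np

    # Hypothetical post-unemployment monthly earnings (made-up numbers).
    # "treated" = cohort whose maximum benefit duration was shortened.
    pre_treated  = np.array([2400., 2500., 2450., 2550.])
    post_treated = np.array([2650., 2700., 2600., 2750.])
    pre_control  = np.array([2420., 2480., 2460., 2500.])
    post_control = np.array([2500., 2530., 2490., 2560.])

    # Canonical 2x2 difference-in-differences: the before/after change in the
    # treated group minus the change in the control group nets out common time
    # trends, under the parallel-trends assumption.
    did = (post_treated.mean() - pre_treated.mean()) \
        - (post_control.mean() - pre_control.mean())
    print(f"DiD estimate of the reform effect: {did:.1f}")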
Abstract:
Interfacial micromotion is closely associated with the long-term success of cementless hip prostheses. Various techniques have been proposed to measure it, but only a small number of points over the stem surface can be measured simultaneously. In this paper, we propose a new technique based on micro-computed tomography (μCT) to measure locally the relative interfacial micromotions between the metallic stem and the surrounding femoral bone. Tantalum beads were fixed to the stem surface and spread over the endosteal surface. Relative micromotions between the stem and the endosteal bone surfaces were measured at different loading amplitudes. The estimated error was 10 μm and the maximal micromotion was 60 μm in the loading direction at 1400 N. This pilot study provided a local measurement of the micromotions in the 3 directions and at 8 locations on the stem surface simultaneously. This technique could easily be extended to higher loads and a much larger number of points, covering the entire stem surface and providing a quasi-continuous distribution of the 3D interfacial micromotions around the stem. The new measurement method would be very useful to compare the induced micromotions of different stem designs and to optimize the primary stability of cementless total hip arthroplasty.
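A simplified version of the bead-based measurement described above can be sketched as follows: given the bead centroids segmented from the unloaded and loaded μCT volumes, the relative stem-bone micromotion at each stem bead is the difference between its displacement and that of the nearest bone bead, projected onto the loading axis. The coordinate handling below is an assumption for illustration and omits the registration steps an actual μCT pipeline would require.

    import numpy as np

    def relative_micromotion(stem_unloaded, stem_loaded,
                             bone_unloaded, bone_loaded,
                             load_axis=(0.0, 0.0, 1.0)):
        """Relative stem-bone micromotion from bead centroids (same units as input).

        Each positional argument is an (N, 3) or (M, 3) array of bead coordinates,
        assumed to be expressed in a common frame for both load states.
        Returns, for every stem bead, the micromotion component along the load axis.
        """
        stem_disp = stem_loaded - stem_unloaded                 # (N, 3)
        bone_disp = bone_loaded - bone_unloaded                 # (M, 3)
        # Pair each stem bead with its nearest bone bead (unloaded positions).
        dist = np.linalg.norm(stem_unloaded[:, None, :] - bone_unloaded[None, :, :], axis=2)
        nearest = dist.argmin(axis=1)
        relative = stem_disp - bone_disp[nearest]               # (N, 3) relative motion
        axis = np.asarray(load_axis, dtype=float)
        return relative @ (axis / np.linalg.norm(axis))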
Abstract:
Background: Choosing an adequate measurement instrument depends on the proposed use of the instrument, the concept to be measured, the measurement properties (e.g. internal consistency, reproducibility, content and construct validity, responsiveness, and interpretability), the requirements, the burden for subjects, and the costs of the available instruments. As far as measurement properties are concerned, there are no sufficiently specific standards for the evaluation of measurement properties of instruments to measure health status, and also no explicit criteria for what constitutes good measurement properties. In this paper we describe the protocol for the COSMIN study, the objective of which is to develop a checklist that contains COnsensus-based Standards for the selection of health Measurement INstruments, including explicit criteria for satisfying these standards. We will focus on evaluative health-related patient-reported outcomes (HR-PROs), i.e. patient-reported health measurement instruments used in a longitudinal design as an outcome measure, excluding health care related PROs, such as satisfaction with care or adherence. The COSMIN standards will be made available in the form of an easily applicable checklist. Method: An international Delphi study will be performed to reach consensus on which measurement properties should be assessed and how, and on criteria for good measurement properties. Two sources of input will be used for the Delphi study: (1) a systematic review of properties, standards and criteria of measurement properties found in systematic reviews of measurement instruments, and (2) an additional literature search of methodological articles presenting a comprehensive checklist of standards and criteria. The Delphi study will consist of four (written) Delphi rounds, with approximately 30 expert panel members with different backgrounds in clinical medicine, biostatistics, psychology, and epidemiology. The final checklist will subsequently be field-tested by assessing the inter-rater reproducibility of the checklist. Discussion: Since the study will mainly be anonymous, problems that are commonly encountered in face-to-face group meetings, such as the dominance of certain persons in the communication process, will be avoided. By performing a Delphi study and involving many experts, the likelihood that the checklist will have sufficient credibility to be accepted and implemented will increase.
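The planned field test assesses the inter-rater reproducibility of the checklist. One common (but here merely assumed) way to quantify agreement between two raters on a categorical checklist item is Cohen's kappa; the sketch below is a generic illustration and is not prescribed by the COSMIN protocol.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: chance-corrected agreement between two raters."""
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        count_a, count_b = Counter(rater_a), Counter(rater_b)
        labels = set(rater_a) | set(rater_b)
        expected = sum(count_a[c] * count_b[c] for c in labels) / n ** 2
        return (observed - expected) / (1 - expected)

    # Hypothetical ratings ("+" = adequate, "-" = not adequate) of 10 items by two raters.
    print(round(cohens_kappa(list("++-++--+-+"), list("++-+---+++")), 2))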
Abstract:
BACKGROUND: The reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) is a widely used, highly sensitive laboratory technique to rapidly and easily detect, identify and quantify gene expression. Reliable RT-qPCR data necessitates accurate normalization with validated control genes (reference genes) whose expression is constant in all studied conditions. This stability has to be demonstrated. We performed a literature search for studies using quantitative or semi-quantitative PCR in the rat spared nerve injury (SNI) model of neuropathic pain to verify whether any reference genes had previously been validated. We then analyzed the stability over time of 7 commonly used reference genes in the nervous system - specifically in the spinal cord dorsal horn and the dorsal root ganglion (DRG). These were: Actin beta (Actb), Glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We compared the candidate genes and established a stability ranking using the geNorm algorithm. Finally, we assessed the number of reference genes necessary for accurate normalization in this neuropathic pain model. RESULTS: We found GAPDH, HMBS, Actb, HPRT1 and 18S cited as reference genes in the literature on studies using the SNI model. Only HPRT1 and 18S had previously been demonstrated to be stable in RT-qPCR arrays. All the genes tested in this study, using the geNorm algorithm, presented gene stability values (M-values) acceptable enough to qualify as potential reference genes in both DRG and spinal cord. Using the coefficient of variation, 18S failed the 50% cut-off with a value of 61% in the DRG. The two most stable genes in the dorsal horn were RPL29 and RPL13a; in the DRG they were HPRT1 and Actb. Using a 0.15 cut-off for pairwise variations, we found that any pair of stable reference genes was sufficient for the normalization process. CONCLUSIONS: In the rat SNI model, we validated and ranked Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as good reference genes in the spinal cord. In the DRG, 18S did not fulfill stability criteria. The combination of any two stable reference genes was sufficient to provide an accurate normalization.
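The geNorm stability measure referred to above has a compact definition: for each candidate reference gene, the M-value is the mean standard deviation of the log2 expression ratios between that gene and every other candidate, and genes with lower M are more stable. The sketch below is a minimal reimplementation of that idea on made-up expression data, not the analysis pipeline used in the study.

    import numpy as np

    def genorm_m_values(expression):
        """geNorm-style stability measure M for candidate reference genes.

        expression : (n_genes, n_samples) array of relative expression values
            (linear scale, strictly positive). Lower M means more stable expression.
        """
        log_expr = np.log2(expression)
        m = np.empty(log_expr.shape[0])
        for j in range(log_expr.shape[0]):
            # log2 ratios of gene j against every other candidate, per sample
            ratios = log_expr[j] - np.delete(log_expr, j, axis=0)
            # M = mean of the pairwise standard deviations
            m[j] = ratios.std(axis=1, ddof=1).mean()
        return m

    # Made-up expression matrix: 4 candidate genes x 6 samples (illustration only);
    # the last gene is given the largest noise and should rank least stable.
    rng = np.random.default_rng(1)
    expr = 2.0 ** rng.normal(loc=10, scale=[[0.1], [0.2], [0.4], [0.8]], size=(4, 6))
    print(genorm_m_values(expr))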
Abstract:
Intraoperative examination of sentinel axillary lymph nodes can be done by imprint cytology, frozen section, or, most recently, by PCR-based amplification of a cytokeratin signal. Using this technique, benign epithelial inclusions, representing mammary tissue displaced along the milk line, will likely generate a positive PCR signal and lead to a false-positive diagnosis of metastatic disease. To better appreciate the incidence of ectopic epithelial inclusions in axillary lymph nodes, we performed an autopsy study, examining 3,904 lymph nodes, obtained from 160 axillary dissections in 80 patients, on 100 μm step sections. The median number of lymph nodes per axilla was 23 (15, 6, and 1 in levels 1, 2, and 3, respectively). A total of 30,450 hematoxylin-eosin-stained slides were examined, as well as 8,825 slides immunostained with pan-cytokeratin antibodies. Despite this meticulous work-up, not a single epithelial inclusion was found in this study, suggesting that the incidence of such inclusions is much lower than the assumed 5% reported in the literature.
Abstract:
BACKGROUND: Genes involved in arbuscular mycorrhizal (AM) symbiosis have been identified primarily by mutant screens, followed by identification of the mutated genes (forward genetics). In addition, a number of AM-related genes have been identified by their AM-related expression patterns, and their function has subsequently been elucidated by knock-down or knock-out approaches (reverse genetics). However, genes that are members of functionally redundant gene families, or genes that have a vital function and therefore result in lethal mutant phenotypes, are difficult to identify. If such genes are constitutively expressed and therefore escape differential expression analyses, they remain elusive. The goal of this study was to systematically search for AM-related genes with a bioinformatics strategy that is insensitive to these problems. The central element of our approach is based on the fact that many AM-related genes are conserved only among AM-competent species. RESULTS: Our approach involves genome-wide comparisons at the proteome level of AM-competent host species with non-mycorrhizal species. Using a clustering method, we first established orthologous/paralogous relationships and subsequently identified protein clusters that contain members only of the AM-competent species. Proteins of these clusters were then analyzed in an extended set of 16 plant species and ranked based on their relatedness among AM-competent monocot and dicot species, relative to non-mycorrhizal species. In addition, we combined the information on the protein-coding sequence with gene expression data and with promoter analysis. As a result, we present a list of yet uncharacterized proteins that show a strongly AM-related pattern of sequence conservation, indicating that the respective genes may have been under selection for a function in AM. Among the top candidates are three genes that encode a small family of similar receptor-like kinases that are related to the S-locus receptor kinases involved in sporophytic self-incompatibility. CONCLUSIONS: We present a new systematic strategy of gene discovery based on conservation of the protein-coding sequence that complements classical forward and reverse genetics. This strategy can be applied to diverse other biological phenomena if species with established genome sequences fall into distinct groups that differ in a defined functional trait of interest.
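The core filtering step of the strategy described above - keeping ortholog clusters whose members occur only in AM-competent species and are absent from non-mycorrhizal species - can be sketched in a few lines. The cluster contents and species assignments below are assumptions for illustration; the actual study worked at the proteome level across 16 plant genomes.

    # Hypothetical ortholog clusters: cluster id -> set of species with a member.
    clusters = {
        "c1": {"Medicago truncatula", "Oryza sativa", "Solanum lycopersicum"},
        "c2": {"Medicago truncatula", "Arabidopsis thaliana"},
        "c3": {"Oryza sativa", "Arabidopsis thaliana", "Brassica rapa"},
    }

    # Illustrative species sets (AM hosts vs. non-mycorrhizal species).
    am_competent = {"Medicago truncatula", "Oryza sativa", "Solanum lycopersicum"}
    non_mycorrhizal = {"Arabidopsis thaliana", "Brassica rapa"}

    def am_specific_clusters(clusters, am_competent, non_mycorrhizal, min_hosts=2):
        """Keep clusters with members in several AM hosts and none in non-hosts."""
        return [cid for cid, species in clusters.items()
                if len(species & am_competent) >= min_hosts
                and not species & non_mycorrhizal]

    print(am_specific_clusters(clusters, am_competent, non_mycorrhizal))  # -> ['c1']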
Abstract:
PRINCIPLES: Respiratory care is universally recognised as useful, but its indications and practice vary markedly. In order to improve the appropriateness of respiratory care in our hospital, we developed evidence-based local guidelines in a collaborative effort involving physiotherapists, physicians and health service researchers. METHODS: Recommendations were developed using the standardised RAND appropriateness method. A literature search was conducted based on terms associated with guidelines and with respiratory care. A working group prepared proposals for recommendations which were then independently rated by a multidisciplinary expert panel. All recommendations were then discussed in common and indications for procedures were rated confidentially a second time by the experts. The recommendations were then formulated on the basis of the level of evidence in the literature and on the consensus among these experts. RESULTS: Recommendations were formulated for the following procedures: non-invasive ventilation, continuous positive airway pressure, intermittent positive pressure breathing, intrapulmonary percussive ventilation, mechanical insufflation-exsufflation, incentive spirometry, positive expiratory pressure, nasotracheal suctioning and non-instrumental airway clearance techniques. Each recommendation referred to a particular medical condition and was assigned to a hierarchical category based on the quality of the evidence from the literature supporting the recommendation and on the consensus among the experts. CONCLUSION: Despite a marked heterogeneity of scientific evidence, the method used allowed us to develop commonly agreed local guidelines for respiratory care. In addition, this work fostered a closer relationship between physiotherapists and physicians in our institution.
Abstract:
A new method for measuring joint angle using a combination of accelerometers and gyroscopes is presented. The method proposes a minimal sensor configuration with one sensor module mounted on each segment. The model is based on estimating the acceleration of the joint center of rotation by placing a pair of virtual sensors on the adjacent segments at the center of rotation. In the proposed technique, joint angles are found without the need for integration, so absolute angles free from any source of drift can be obtained. The model considers anatomical aspects and is personalized for each subject prior to each measurement. The method was validated by measuring knee flexion-extension angles of eight subjects, walking at three different speeds, and comparing the results with a reference motion measurement system. The results were very close to those of the reference system, with very small errors (rms = 1.3 deg, mean = 0.2 deg, SD = 1.1 deg) and an excellent correlation coefficient (0.997). The algorithm is able to provide joint angles in real time and is ready for use in gait analysis. Technically, the system is portable, easily mountable, and can be used for long-term monitoring without hindrance to natural activities.
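The claim that absolute joint angles can be obtained without integration, and therefore without drift, can be illustrated with a much simpler scheme than the virtual-sensor model of the paper: during quasi-static phases each segment's inclination can be read directly from the gravity components measured by its accelerometer, and the joint angle is the difference of the two inclinations. The sketch below shows only that simplified idea; the sensor axes and units are assumptions.

    import numpy as np

    def segment_inclination(acc_xyz):
        """Sagittal-plane inclination of a segment from its accelerometer (radians).

        acc_xyz : (N, 3) accelerometer samples in the sensor frame. Assumes
        quasi-static motion, so the signal is dominated by gravity and the angle
        follows from arctan2 without any integration (hence no drift).
        """
        return np.arctan2(acc_xyz[:, 0], acc_xyz[:, 2])

    def knee_flexion(acc_thigh, acc_shank):
        """Knee flexion-extension angle (degrees) as the difference of inclinations."""
        return np.degrees(segment_inclination(acc_shank) - segment_inclination(acc_thigh))

    # Toy example: thigh tilted 10 deg, shank tilted 40 deg -> about 30 deg of flexion.
    g = 9.81
    thigh = np.array([[g * np.sin(np.radians(10)), 0.0, g * np.cos(np.radians(10))]])
    shank = np.array([[g * np.sin(np.radians(40)), 0.0, g * np.cos(np.radians(40))]])
    print(knee_flexion(thigh, shank))  # approximately [30.]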
Abstract:
Signal search analysis is a general method to discover and characterize sequence motifs that are positionally correlated with a functional site (e.g. a transcription or translation start site). The method has played an instrumental role in the analysis of eukaryotic promoter elements. The signal search analysis server provides access to four different computer programs as well as to a large number of precompiled functional site collections. The programs offered allow: (i) the identification of non-random sequence regions under evolutionary constraint; (ii) the detection of consensus sequence-based motifs that are over- or under-represented at a particular distance from a functional site; (iii) the analysis of the positional distribution of a consensus sequence- or weight matrix-based sequence motif around a functional site; and (iv) the optimization of a weight matrix description of a locally over-represented sequence motif. These programs can be accessed at: http://www.isrec.isb-sib.ch/ssa/.
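Program (iii) above analyses the positional distribution of a weight-matrix motif around a functional site. The computation behind it can be sketched as follows: slide a position weight matrix over sequences aligned at the functional site and count, for each offset, how many sequences score above a cut-off. The matrix values, cut-off and sequences below are assumptions for illustration only.

    import numpy as np

    BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

    # Hypothetical log-odds weight matrix for a 4-bp motif (rows = A, C, G, T).
    pwm = np.array([[ 1.2, -0.5, -1.0,  0.8],
                    [-0.7,  1.0, -0.5, -1.2],
                    [-0.9, -0.8,  1.3, -0.4],
                    [ 0.2, -0.6, -1.1,  0.9]])

    def positional_hits(sequences, pwm, cutoff=2.0):
        """Count weight-matrix hits at each offset in sequences aligned at a site."""
        motif_len = pwm.shape[1]
        counts = np.zeros(len(sequences[0]) - motif_len + 1, dtype=int)
        for seq in sequences:
            for offset in range(len(counts)):
                window = seq[offset:offset + motif_len]
                score = sum(pwm[BASES[base], i] for i, base in enumerate(window))
                if score >= cutoff:
                    counts[offset] += 1
        return counts  # counts[k] = sequences with a hit k bp downstream of the alignment point

    # Toy sequences aligned at a hypothetical transcription start site.
    print(positional_hits(["ACGTACGAAT", "ACGAACGTAC", "TTACGTACGT"], pwm))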
Abstract:
Following the introduction of single-metal deposition (SMD), a simplified fingermark detection technique based on multi-metal deposition, optimization studies were conducted. The different parameters of the original formula were tested and the results were evaluated based on the contrast and overall aspect of the enhanced fingermarks. The new SMD formula was established from the most optimized parameters. Interestingly, it was found that substantial variations from the base parameters did not significantly affect the outcome of the enhancement, demonstrating that SMD is a very robust technique. Finally, a comparison of the optimized SMD with multi-metal deposition (MMD) was carried out on different surfaces. It was demonstrated that SMD produces results comparable to MMD, thus validating the technique.
Abstract:
BACKGROUND AND PURPOSE: Multi-phase postmortem CT angiography (MPMCTA) is increasingly being recognized as a valuable adjunct medicolegal tool to explore the vascular system. Adequate interpretation, however, requires knowledge about the most common technique-related artefacts. The purpose of this study was to identify and index the possible artefacts related to MPMCTA. MATERIAL AND METHODS: An experienced radiologist blinded to all clinical and forensic data retrospectively reviewed 49 MPMCTAs. Each angiographic phase, i.e. arterial, venous and dynamic, was analysed separately to identify phase-specific artefacts based on location and aspect. RESULTS: Incomplete contrast filling of the cerebral venous system was the most commonly encountered artefact, followed by contrast agent layering in the lumen of the thoracic aorta. Enhancement or so-called oedematization of the digestive system mucosa was also frequently observed. CONCLUSION: All MPMCTA artefacts observed and described here are reproducible and easily identifiable. Knowledge about these artefacts is important to avoid misinterpreting them as pathological findings.