961 results for "Searching, bibliographical"
Abstract:
1. A more general contingency model of optimal diet choice is developed, allowing for simultaneous searching and handling, which extends the theory to include grazing and browsing by large herbivores.
2. Foraging resolves into three modes: purely encounter-limited, purely handling-limited and mixed-process, in which either a handling-limited prey type is added to an encounter-limited diet, or the diet becomes handling-limited as it expands.
3. The purely encounter-limited diet is, in general, broader than that predicted by the conventional contingency model.
4. As the degree of simultaneity of searching and handling increases, the optimal diet expands to the point where it is handling-limited, at which point all inferior prey types are rejected.
5. Inclusion of a less profitable prey species is not necessarily independent of its encounter rate, and the zero-one rule does not necessarily hold: some of the less profitable prey may be included in the optimal diet. This gives an optimal foraging explanation for herbivores' mixed diets.
6. Rules are shown for calculating the boundary between encounter-limited and handling-limited diets and for predicting the proportion of inferior prey to be included in a two-species diet.
7. The digestive rate model is modified to include simultaneous searching and handling, showing that the more they overlap, the more the predicted diet breadth is likely to be reduced.
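The mode structure described above can be illustrated with a toy version of the underlying rate equation. The sketch below is not the paper's model: it uses the classical contingency (Holling disc) intake rate with a hypothetical overlap parameter `s` that scales how much handling time runs concurrently with searching.

```python
# Illustrative sketch (not the paper's exact formulation): long-term energy
# intake rate under the conventional contingency (Holling disc) model, with
# a hypothetical overlap parameter s in [0, 1]. s = 0 recovers the classical
# model; s = 1 means handling fully overlaps searching, so handling time no
# longer reduces search time and the diet becomes handling-limited.

def intake_rate(prey, s=0.0):
    """prey: list of (encounter_rate, energy_gain, handling_time) tuples."""
    gain = sum(lam * e for lam, e, h in prey)        # energy per unit search time
    cost = sum(lam * h for lam, e, h in prey)        # handling load per unit search time
    return gain / (1.0 + (1.0 - s) * cost)

prey = [(0.2, 10.0, 1.0), (0.5, 3.0, 0.5)]           # hypothetical prey types
r_classic = intake_rate(prey, s=0.0)
r_overlap = intake_rate(prey, s=0.8)
assert r_overlap > r_classic  # more overlap raises the achievable intake rate
```

Under this toy parameterisation, increasing `s` always raises the intake rate, which mirrors the abstract's point 4: greater simultaneity expands the optimal diet until handling, not encounter, becomes the binding constraint.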
Abstract:
In this paper, a novel configurable content addressable memory (CCAM) cell is proposed to increase the flexibility of embedded CAMs for SoC applications. It can easily be configured as a binary CAM (BiCAM) or ternary CAM (TCAM) without a significant penalty in power consumption or search speed. A 64×128 CCAM array has been built and verified through simulation. ©2007 IEEE.
Abstract:
Context: The development of a consolidated knowledge base for social work requires rigorous approaches to identifying relevant research. Method: The quality of 10 databases and a web search engine was appraised by systematically searching for research articles on resilience and burnout in child protection social workers. Results: Applied Social Sciences Index and Abstracts, Social Services Abstracts and Social Sciences Citation Index (SSCI) had the greatest sensitivity, each retrieving more than double the number of articles found by any other database. PsycINFO and Cumulative Index to Nursing and Allied Health (CINAHL) had the highest precision. Google Scholar had modest sensitivity and good precision within the first 100 items. SSCI, Google Scholar, Medline, and CINAHL retrieved the highest numbers of hits not retrieved by any other database. Conclusion: A range of databases is required for even modestly comprehensive searching. Advanced database searching methods are being developed, but the profession requires greater standardization of terminology to assist in information retrieval.
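The sensitivity and precision measures used to compare the databases reduce to two simple ratios over a gold-standard set of relevant articles; the counts below are hypothetical, for illustration only.

```python
# Retrieval quality measures as used in database-comparison studies.
# All counts here are hypothetical, purely to illustrate the two ratios.

def sensitivity(retrieved_relevant, total_relevant):
    # Fraction of all known relevant articles that the database retrieved.
    return retrieved_relevant / total_relevant

def precision(retrieved_relevant, total_retrieved):
    # Fraction of the records retrieved that were actually relevant.
    return retrieved_relevant / total_retrieved

# e.g. a database returning 300 hits, 45 of which are among 60 known
# relevant articles:
s = sensitivity(45, 60)
p = precision(45, 300)
```

A database can therefore score well on one measure and poorly on the other, which is why the study reports both: high-sensitivity databases maximise coverage for systematic searching, while high-precision databases minimise screening effort.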
Abstract:
Background: Popular approaches in human tissue-based biomarker discovery include tissue microarrays (TMAs) and DNA microarrays (DMAs) for protein and gene expression profiling, respectively. The data generated by these analytic platforms, together with associated image, clinical and pathological data, currently reside on widely different information platforms, making searching and cross-platform analysis difficult. Consequently, there is a strong need to develop a single coherent database capable of correlating all available data types.
Method: This study presents TMAX, a database system designed to facilitate biomarker discovery tasks. TMAX organises a variety of biomarker discovery-related data into a single database. Both TMA and DMA experimental data are integrated in TMAX and connected through common DNA/protein biomarkers. Patient clinical data (including tissue pathological data), computer-assisted tissue images and associated analytic data are also included in TMAX to enable truly high-throughput processing of ultra-large digital slides, for both TMAs and whole-slide tissue digital slides. A comprehensive web front-end was built with embedded XML parser software and predefined SQL queries to enable rapid data exchange in the form of standard XML files.
Results & Conclusion: TMAX represents one of the first attempts to integrate TMA data with public gene expression experiment data. Experiments suggest that TMAX is robust in managing large quantities of data from different sources (clinical, TMA, DMA and image analysis). Its web front-end is user-friendly and, most importantly, allows rapid and easy exchange of biomarker discovery-related data. In conclusion, TMAX is a robust biomarker discovery data repository and research tool, which opens up opportunities for biomarker discovery and further integromics research.
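The data-exchange path the abstract describes, a predefined SQL query whose result set is serialized as standard XML, can be sketched as follows. The table, column and element names here are hypothetical, not TMAX's actual schema.

```python
# Minimal sketch of a predefined-SQL-query-to-XML export, the kind of data
# exchange the TMAX front-end describes. Schema and element names are
# hypothetical, for illustration only.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE biomarker (gene TEXT, tma_score REAL, dma_fold REAL)")
conn.execute("INSERT INTO biomarker VALUES ('TP53', 2.1, 0.4)")

# Predefined query; its rows become <record> elements under a common root.
root = ET.Element("biomarkers")
for gene, tma, dma in conn.execute(
        "SELECT gene, tma_score, dma_fold FROM biomarker"):
    rec = ET.SubElement(root, "record", gene=gene)
    ET.SubElement(rec, "tma_score").text = str(tma)
    ET.SubElement(rec, "dma_fold").text = str(dma)

xml_bytes = ET.tostring(root)   # standard XML, ready for exchange
```

Serializing query results to a fixed XML vocabulary like this is what lets heterogeneous consumers (TMA viewers, expression-analysis tools) share one exchange format without touching the underlying database.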
Abstract:
BACKGROUND: Overuse of unnecessary medications in frail older adults with limited life expectancy remains an understudied challenge. OBJECTIVE: To identify intervention studies that reduced use of unnecessary medications in frail older adults. A secondary goal was to identify and review studies focusing on patients approaching end of life. We examined criteria for identifying unnecessary medications, intervention processes for medication reduction, and intervention effectiveness. METHODS: A systematic review of English articles using MEDLINE, EMBASE, and International Pharmaceutical Abstracts from January 1966 to September 2012. Additional studies were identified by searching bibliographies. Search terms included prescription drugs, drug utilization, hospice or palliative care, and appropriate or inappropriate. A manual review of 971 identified abstracts for the inclusion criteria (study included an intervention to reduce chronic medication use; at least 5 participants; population included patients aged at least 65 years, hospice enrollment, or indication of frailty or risk of functional decline, including assisted living or nursing home residence, or inpatient hospitalization) yielded 60 articles for full review by 3 investigators. After exclusion of review articles, interventions targeting acute medications, or studies exclusively in the intensive care unit, 36 articles were retained (including 13 identified by bibliography review). Articles were extracted for study design, study setting, intervention description, criteria for identifying unnecessary medication use, and intervention outcomes. RESULTS: The studies included 15 randomized controlled trials, 4 non-randomized trials, 6 pre-post studies, and 11 case series. Control groups were used in over half of the studies (n = 20).
Study populations varied and included residents of nursing homes and assisted living facilities (n = 16), hospitalized patients (n = 14), hospice/palliative care patients (n = 3), home care patients (n = 2), and frail or disabled community-dwelling patients (n = 1). The majority of studies (n = 21) used implicit criteria to identify unnecessary medications (including drugs without indication, unnecessary duplication, and lack of effectiveness); only one study incorporated patient preference into prescribing criteria. Most interventions (n = 25) were led by or involved pharmacists, 4 used academic detailing, 2 used audit and feedback reports targeting prescribers, and 5 involved physician-led medication reviews. Overall intervention effect sizes could not be determined due to heterogeneity of study designs, samples, and measures. CONCLUSIONS: Very little rigorous research has been conducted on reducing unnecessary medications in frail older adults or patients approaching end of life.
Abstract:
Introduction: Amplicon deep-sequencing using second-generation sequencing technology is an innovative molecular diagnostic technique that enables highly sensitive detection of mutations. As an international consortium, we previously investigated the robustness, precision, and reproducibility of 454 amplicon next-generation sequencing (NGS) across 10 laboratories from 8 countries (Leukemia, 2011;25:1840-8).
Aims: In Phase II of the study, we established distinct working groups for various hematological malignancies, i.e. acute myeloid leukemia (AML), acute lymphoblastic leukemia (ALL), chronic lymphocytic leukemia (CLL), chronic myelogenous leukemia (CML), myelodysplastic syndromes (MDS), myeloproliferative neoplasms (MPN), and multiple myeloma. Currently, 27 laboratories from 13 countries are part of this research consortium. In total, 74 gene targets were selected by the working groups and amplicons were developed for a NGS deep-sequencing assay (454 Life Sciences, Branford, CT). A data analysis pipeline was developed to standardize mutation interpretation both for accessing raw data (Roche Amplicon Variant Analyzer, 454 Life Sciences) and variant interpretation (Sequence Pilot, JSI Medical Systems, Kippenheim, Germany).
Results: We will report on the design, standardization, quality control aspects, landscape of mutations, as well as the prognostic and predictive utility of this assay in a cohort of 8,867 cases. Overall, 1,146 primer sequences were designed and tested. In detail, for example in AML, 924 cases had been screened for CEBPA mutations. RUNX1 mutations were analyzed in 1,888 cases, applying the deep-sequencing read counts to study the stability of such mutations at relapse and their utility as a biomarker to detect residual disease. Analyses of DNMT3A (n=1,041) focused on landscape investigations and on prognostic relevance. Additionally, this working group is focusing on TET2, ASXL1, and TP53 analyses. A novel prognostic model is being developed allowing stratification of AML into prognostic subgroups based on molecular markers only. In ALL, 1,124 pediatric and adult cases have been screened, including 763 assays for TP53 mutations both at diagnosis and relapse of ALL. Pediatric and adult leukemia expert labs developed additional content to study the mutation incidence of other B and T lineage markers such as IKZF1, JAK2, IL7R, PAX5, EP300, LEF1, CRLF2, PHF6, WT1, JAK1, PTEN, AKT1, NOTCH1, CREBBP, or FBXW7. Further, the molecular landscape of CLL is changing rapidly. As such, a separate working group focused on analyses including NOTCH1, SF3B1, MYD88, XPO1, FBXW7 and BIRC3. Currently, 922 cases have been screened to investigate the range of mutational burden of NOTCH1 mutations for their prognostic relevance. In MDS, RUNX1 mutation analyses were performed in 977 cases. The prognostic relevance of TP53 mutations in MDS was assessed in an additional 327 cases, including isolated deletions of chromosome 5q. Next, content was developed targeting genes of the cellular splicing machinery, e.g. SF3B1, SRSF2, U2AF1, and ZRSR2.
In BCR-ABL1-negative MPN, nine genes of interest (JAK2, MPL, TET2, CBL, KRAS, EZH2, IDH1, IDH2, ASXL1) have been analyzed in a cohort of 155 primary myelofibrosis cases searching for novel somatic mutations and addressing their relevance for disease progression and leukemia transformation. Moreover, an assay was developed and applied to CMML cases allowing the simultaneous analysis of 25 leukemia-associated target genes in a single sequencing run using just 20 ng of starting DNA. Finally, nine laboratories are studying CML, applying ultra-deep sequencing of the BCR-ABL1 tyrosine kinase domain. Analyses were performed on 615 cases investigating the dynamics of expansion of mutated clones under various tyrosine kinase inhibitor therapies.
Conclusion: Molecular characterization of hematological malignancies today requires high diagnostic sensitivity and specificity. As part of the IRON-II study, a network of laboratories analyzed a variety of disease entities applying amplicon-based NGS assays. Importantly, the consortium not only standardized assay design for disease-specific panels, but also achieved consensus on a common data analysis pipeline for mutation interpretation. Distinct working groups have been formed to address specific scientific tasks, and in total 8,867 cases have been analyzed thus far.
Abstract:
Conflicting results have been reported on the detection of paramyxovirus transcripts in Paget's disease, and a possible explanation is differences in the sensitivity of RT-PCR methods for detecting virus. In a blinded study, we found no evidence to suggest that laboratories that failed to detect viral transcripts had less sensitive RT-PCR assays, and we did not detect measles or distemper transcripts in Paget's samples using the most sensitive assays evaluated.
Introduction: There is conflicting evidence on the possible role of persistent paramyxovirus infection in Paget's disease of bone (PDB). Some workers have detected measles virus (MV) or canine distemper virus (CDV) transcripts in cells and tissues from patients with PDB, but others have failed to confirm this finding. A possible explanation might be differences in the sensitivity of RT-PCR methods for detecting virus. Here we performed a blinded comparison of the sensitivity of different RT-PCR-based techniques for MV and CDV detection in different laboratories and used the most sensitive assays to screen for evidence of viral transcripts in bone and blood samples derived from patients with PDB.
Materials and Methods: Participating laboratories analyzed samples spiked with known amounts of MV and CDV transcripts and control samples that did not contain viral nucleic acids. All analyses were performed on a blinded basis.
Results: The limit of detection for CDV was 1000 viral transcripts in three laboratories (Aberdeen, Belfast, and Liverpool) and 10,000 transcripts in another laboratory (Manchester). The limit of detection for MV was 16 transcripts in one laboratory (NIBSC), 1000 transcripts in two laboratories (Aberdeen and Belfast), and 10,000 transcripts in two laboratories (Liverpool and Manchester). An assay previously used by a U.S.-based group to detect MV transcripts in PDB had a sensitivity of 1000 transcripts. One laboratory (Manchester) detected CDV transcripts in a negative control and in two samples that had been spiked with MV. None of the other laboratories had false-positive results for MV or CDV, and no evidence of viral transcripts was found on analysis of 12 PDB samples using the most sensitive RT-PCR assays for MV and CDV.
Conclusions: We found that RT-PCR assays used by different laboratories differed in their sensitivity to detect CDV and MV transcripts but found no evidence to suggest that laboratories that previously failed to detect viral transcripts had less sensitive RT-PCR assays than those that detected viral transcripts. False-positive results were observed in one laboratory, and we failed to detect paramyxovirus transcripts in PDB samples using the most sensitive assays evaluated. Our results show that failure of some laboratories to detect viral transcripts is unlikely to be caused by problems with assay sensitivity and highlight the fact that contamination can be an issue when searching for pathogens by sensitive RT-PCR-based techniques.
Abstract:
Aim: To evaluate the quality of reporting of all diagnostic studies published in five major ophthalmic journals in the year 2002 using the Standards for Reporting of Diagnostic Accuracy (STARD) initiative parameters. Methods: Manual searching was used to identify diagnostic studies published in 2002 in five leading ophthalmic journals: the American Journal of Ophthalmology (AJO), Archives of Ophthalmology (Archives), British Journal of Ophthalmology (BJO), Investigative Ophthalmology and Visual Science (IOVS), and Ophthalmology. The STARD checklist of 25 items and flow chart was used to evaluate the quality of each publication. Results: A total of 16 publications were included (AJO = 5, Archives = 1, BJO = 2, IOVS = 2, and Ophthalmology = 6). More than half of the studies (n = 9) were related to glaucoma diagnosis. Other specialties included retina (n = 4), cornea (n = 2), and neuro-ophthalmology (n = 1). The most common description of diagnostic accuracy was sensitivity and specificity values, published in 13 articles. The number of fully reported items in evaluated studies ranged from eight to 19. Seven studies reported more than 50% of the STARD items. Conclusions: The current standards of reporting of diagnostic accuracy tests are highly variable. The STARD initiative may be a useful tool for appraising the strengths and weaknesses of diagnostic accuracy studies.
Abstract:
Background: Mechanical ventilation is a critical component of paediatric intensive care therapy. It is indicated when the patient's spontaneous ventilation is inadequate to sustain life. Weaning is the gradual reduction of ventilatory support and the transfer of respiratory control back to the patient. Weaning may represent a large proportion of the ventilatory period. Prolonged ventilation is associated with significant morbidity, hospital cost, psychosocial and physical risks to the child and even death. Timely and effective weaning may reduce the duration of mechanical ventilation and may reduce the morbidity and mortality associated with prolonged ventilation. However, no consensus has been reached on criteria that can be used to identify when patients are ready to wean or the best way to achieve it. Objectives: To assess the effects of weaning by protocol on invasively ventilated critically ill children. To compare the total duration of invasive mechanical ventilation of critically ill children who are weaned using protocols versus those weaned through usual (non-protocolized) practice. To ascertain any differences between protocolized weaning and usual care in terms of mortality, adverse events, intensive care unit length of stay and quality of life. Search methods: We searched the Cochrane Central Register of Controlled Trials (CENTRAL; The Cochrane Library, Issue 10, 2012), MEDLINE (1966 to October 2012), EMBASE (1988 to October 2012), CINAHL (1982 to October 2012), ISI Web of Science and LILACS. We identified unpublished data in the Web of Science (1990 to October 2012), ISI Conference Proceedings (1990 to October 2012) and Cambridge Scientific Abstracts (earliest to October 2012). We contacted first authors of studies included in the review to obtain further information on unpublished studies or work in progress. We searched reference lists of all identified studies and review papers for further relevant studies.
We applied no language or publication restrictions. Selection criteria: We included randomized controlled trials comparing protocolized weaning (professional-led or computer-driven) versus non-protocolized weaning practice conducted in children older than 28 days and younger than 18 years. Data collection and analysis: Two review authors independently scanned titles and abstracts identified by electronic searching. Three review authors retrieved and evaluated full-text versions of potentially relevant studies, independently extracted data and assessed risk of bias. Main results: We included three trials at low risk of bias with 321 children in the analysis. Protocolized weaning significantly reduced total ventilation time in the largest trial (260 children) by a mean of 32 hours (95% confidence interval (CI) 8 to 56; P = 0.01). Two other trials (30 and 31 children, respectively) reported non-significant reductions, with a mean difference of -88 hours (95% CI -228 to 52; P = 0.2) and -24 hours (95% CI -10 to 58; P = 0.06). Protocolized weaning significantly reduced weaning time in these two smaller trials, with mean reductions of 106 hours (95% CI 28 to 184; P = 0.007) and 21 hours (95% CI 9 to 32; P < 0.001). These studies reported no significant effects for duration of mechanical ventilation before weaning, paediatric intensive care unit (PICU) and hospital length of stay, PICU mortality or adverse events. Authors' conclusions: Limited evidence suggests that weaning protocols reduce the duration of mechanical ventilation, but the evidence is inadequate to show whether shorter ventilation achieved through protocolized weaning benefits or harms children.
Abstract:
Stellar activity, such as starspots, can induce radial velocity (RV) variations that can mask or even mimic the RV signature of orbiting exoplanets. For this reason RV exoplanet surveys have been unsuccessful when searching for planets around young, active stars and are therefore failing to explore an important regime which can help to reveal how planets form and migrate. This paper describes a new technique to remove spot signatures from the stellar line-profiles of moderately rotating, active stars (v sin i ranging from 10 to 50 km s⁻¹). By doing so, it allows planetary RV signals to be uncovered. We used simulated models of a G5V type star with differing dark spots on its surface along with archive data of the known active star HD 49933 to validate our method. The results showed that starspots could be effectively cleaned from the line-profiles so that the stellar RV jitter was reduced by more than 80 per cent. Applying this procedure to the same models and HD 49933 data, but with fake planets injected, enabled the effective removal of starspots so that Jupiter-mass planets on short orbital periods were successfully recovered. These results show that this approach can be useful in the search for hot-Jupiter planets that orbit young, active stars with a v sin i of ~10-50 km s⁻¹.
Abstract:
A practical machine-vision-based system is developed for fast detection of defects occurring on the surface of bottle caps. The system extracts the circular region of the cap surface as the region of interest (ROI) and then uses the circular region projection histogram (CRPH) as the matching feature. We establish two dictionaries, for the template and for possible defects, respectively. To meet the requirements of high-speed production as well as detection quality, a fast algorithm based on sparse representation is proposed to speed up the search. In the sparse representation, non-zero elements in the sparse factors indicate the defect's size and position. Experimental results in industrial trials show that the proposed method outperforms the orientation code method (OCM) and produces promising results for detecting defects on the surface of bottle caps.
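A circular region projection histogram of the kind the abstract names can be sketched as a radial binning of ROI intensities; the simplified version below (the centre, bin count and radius are assumed parameters) omits the paper's dictionary and sparse-matching steps and may differ from its exact formulation.

```python
# Illustrative sketch of a circular region projection histogram (CRPH):
# pixel intensities inside the ROI are accumulated into bins by their radial
# distance from the cap centre, giving a rotation-invariant 1-D feature.
# Simplified; not the paper's exact formulation or matching pipeline.
import math

def crph(image, cx, cy, n_bins, r_max):
    hist = [0.0] * n_bins
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            r = math.hypot(x - cx, y - cy)
            if r < r_max:                       # pixel lies inside the ROI
                hist[int(r / r_max * n_bins)] += v
    return hist

# Tiny synthetic "cap" image: a bright ring around the centre pixel.
img = [[1 if 1 <= math.hypot(x - 2, y - 2) < 2 else 0 for x in range(5)]
       for y in range(5)]
feature = crph(img, cx=2, cy=2, n_bins=4, r_max=2.5)
```

Because the histogram depends only on radial distance, rotating the cap leaves the feature unchanged, which is what makes it usable for fast template matching on a production line.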
Abstract:
African coastal regions are expected to experience the highest rates of population growth in coming decades. Fresh groundwater resources in the coastal zone of East Africa (EA) are highly vulnerable to seawater intrusion. Increasing water demand is leading to unsustainable and ill-planned well drilling and abstraction. Wells supplying domestic, industrial and agricultural needs are, or have become, in many areas too saline for use. Climate change, including weather changes and sea level rise, is expected to exacerbate this problem. The multiplicity of physical, demographic and socio-economic driving factors makes this a very challenging issue for management. At present the state and probable evolution of coastal aquifers in EA are not well documented. The UPGro project 'Towards groundwater security in coastal East Africa' brings together teams from Kenya, Tanzania, the Comoros Islands and Europe to address this knowledge gap. An integrative multidisciplinary approach, combining the expertise of hydrogeologists, hydrologists and social scientists, is investigating selected sites along the coastal zone in each country. Hydrogeologic observatories have been established in different geologic and climatic settings representative of the coastal EA region, where focussed research will establish the current status of groundwater and identify future threats based on projected demographic and climate change scenarios. Researchers are also engaging with end users as well as local community and stakeholder groups in each area in order to understand the issues most affecting the communities and to search for sustainable strategies for addressing them.
Abstract:
Supported decision making (SDM) refers to the process of supporting people, whose decision making ability may be impaired, to make decisions and so promote autonomy and prevent the need for substitute decision making. There have been developments in SDM but mainly in the areas of intellectual disabilities and end-of-life care rather than in mental health. The main aim of this review was to provide an overview of the available evidence relevant to SDM and so facilitate discussion of how this aspect of law, policy and practice may be further developed in mental health services. The method used for this review was a Rapid Evidence Assessment which involved: developing appropriate search strategies; searching relevant databases and grey literature; then assessing, including and reviewing relevant studies. Included studies were grouped into four main themes: studies reporting stakeholders’ views on SDM; studies identifying barriers to the implementation of SDM; studies highlighting ways to improve implementation; and studies on the impact of SDM. The available evidence on implementation and impact, identified by this review, is limited but there are important rights-based, effectiveness and pragmatic arguments for further developing and researching SDM for people with mental health problems.
Abstract:
Geomorphology plays a critical role in two areas of geoforensics: searching the land for surface or buried objects, and sampling or imaging rural crime scenes and control locations as evidence. Most of the associated geoscience disciplines have substantial bodies of work dedicated to their relevance in forensic investigations, yet geomorphology (specifically landforms, their mapping and evolution, soils, and their relationship to geology and biogeography) has had no such exposure. This is strange considering how fundamental to legal enquiries the location of a crime and its evolution are, as this article will demonstrate. This work aims to redress the balance by showing how geomorphology featured in one of the earliest works on forensic science methods and has continued to play a role in the sociology, archaeology, criminalistics and geoforensics of crime. The application of geomorphology in military/humanitarian geography and environmental/engineering forensics is briefly discussed, as these are also regularly reviewed in courts of law.
Abstract:
We describe the Pan-STARRS Moving Object Processing System (MOPS), a modern software package that produces automatic asteroid discoveries and identifications from catalogs of transient detections from next-generation astronomical survey telescopes. MOPS achieves >99.5% efficiency in producing orbits from a synthetic but realistic population of asteroids whose measurements were simulated for a Pan-STARRS4-class telescope. Additionally, using a nonphysical grid population, we demonstrate that MOPS can detect populations of currently unknown objects such as interstellar asteroids. MOPS has been adapted successfully to the prototype Pan-STARRS1 telescope despite differences in expected false detection rates, fill-factor loss, and relatively sparse observing cadence compared to a hypothetical Pan-STARRS4 telescope and survey. MOPS remains highly efficient at detecting objects but drops to 80% efficiency at producing orbits. This loss is primarily due to configurable MOPS processing limits that are not yet tuned for the Pan-STARRS1 mission. The core MOPS software package is the product of more than 15 person-years of software development and incorporates countless additional years of effort in third-party software to perform lower-level functions such as spatial searching or orbit determination. We describe the high-level design of MOPS and essential subcomponents, the suitability of MOPS for other survey programs, and suggest a road map for future MOPS development.
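The "spatial searching" that MOPS delegates to third-party software is, at its core, a nearby-pair query over sky detections. The sketch below is not MOPS code: it shows a minimal grid-bucket approach (with the assumption that the cell size is at least the search radius, so a 3×3 cell neighbourhood suffices) that avoids an all-pairs scan.

```python
# Not MOPS code: a minimal illustration of grid-bucket spatial searching,
# the kind of lower-level function a moving-object pipeline relies on to
# find detections near a given position without an O(n^2) all-pairs scan.
from collections import defaultdict

def build_grid(points, cell):
    """Bucket (x, y) points into square cells of side `cell`."""
    grid = defaultdict(list)
    for p in points:
        grid[(int(p[0] // cell), int(p[1] // cell))].append(p)
    return grid

def neighbors(grid, p, cell, radius):
    """All points within `radius` of p (requires radius <= cell)."""
    cx, cy = int(p[0] // cell), int(p[1] // cell)
    out = []
    for gx in (cx - 1, cx, cx + 1):          # 3x3 cell neighbourhood
        for gy in (cy - 1, cy, cy + 1):
            for q in grid.get((gx, gy), []):
                if q != p and (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2 <= radius ** 2:
                    out.append(q)
    return out

dets = [(10.0, 20.0), (10.2, 20.1), (50.0, 50.0)]   # hypothetical detections
grid = build_grid(dets, cell=1.0)
close = neighbors(grid, dets[0], cell=1.0, radius=0.5)   # finds the nearby pair
```

Real survey pipelines use kd-trees or sky-indexing schemes on spherical coordinates rather than a flat grid, but the principle, restricting candidate pairs to a small neighbourhood, is the same.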