83 results for searching
Abstract:
We consider the local order estimation of nonlinear autoregressive systems with exogenous inputs (NARX), which may have different local dimensions at different points. By minimizing the kernel-based local information criterion introduced in this paper, strongly consistent estimates of the local orders of the NARX system at the points of interest are obtained. A modification of the criterion and a simple procedure for searching for the minimum of the criterion are also discussed. The theoretical results derived here are tested by simulation examples.
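A minimal sketch of how such an estimate could be computed, assuming a Gaussian kernel, a local linear NARX fit and a generic BIC-style complexity penalty (the paper's exact criterion is not reproduced here). `y` and `u` are numpy arrays of outputs and inputs, and `x_star` is the regressor at the point of interest, padded to the maximum candidate dimension:

```python
import numpy as np

def local_criterion(y, u, p, q, x_star, h=0.5, c=2.0):
    """Kernel-weighted information criterion for candidate NARX orders (p, q):
    a local linear model y[t] ~ (y[t-1..t-p], u[t-1..t-q]) is fitted with
    Gaussian weights centred at x_star, and the log local residual variance
    is penalised by model complexity. Illustrative only."""
    m = max(p, q)
    X = np.array([np.r_[y[t - p:t][::-1], u[t - q:t][::-1]]
                  for t in range(m, len(y))])
    Y = y[m:]
    w = np.exp(-np.sum((X - x_star[:p + q]) ** 2, axis=1) / (2 * h ** 2))
    sw = np.sqrt(w)[:, None]
    A = np.c_[np.ones(len(X)), X]
    beta, *_ = np.linalg.lstsq(sw * A, sw.ravel() * Y, rcond=None)
    rss = np.sum(w * (Y - A @ beta) ** 2) / w.sum()   # local residual variance
    return np.log(rss) + c * (p + q) * np.log(w.sum()) / w.sum()

def estimate_local_orders(y, u, x_star, max_p=4, max_q=4):
    """Grid search: the local order estimate minimises the criterion."""
    grid = [(p, q) for p in range(1, max_p + 1) for q in range(1, max_q + 1)]
    return min(grid, key=lambda pq: local_criterion(y, u, *pq, x_star))
```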
Abstract:
This study combined high resolution mass spectrometry (HRMS), advanced chemometrics and pathway enrichment analysis to analyse the blood metabolome of patients attending the memory clinic: cases of mild cognitive impairment (MCI; n = 16), cases of MCI who upon subsequent follow-up developed Alzheimer's disease (MCI_AD; n = 19), and healthy age-matched controls (Ctrl; n = 37). Plasma was extracted in acetonitrile and applied to an Acquity UPLC HILIC column (1.7 μm, 2.1 × 100 mm) coupled to a Xevo G2 QTof mass spectrometer using a previously optimised method. Data comprising 6751 spectral features were used to build an OPLS-DA statistical model capable of accurately distinguishing Ctrl, MCI and MCI_AD. The model accurately distinguished (R2 = 99.1%; Q2 = 97%) those MCI patients who later went on to develop AD. S-plots were used to shortlist ions of interest which were responsible for explaining the maximum amount of variation between patient groups. Metabolite database searching and pathway enrichment analysis indicated disturbances in 22 biochemical pathways and, notably, revealed that two interlinked areas of metabolism (polyamine metabolism and L-arginine metabolism) were differentially disrupted in this well-defined clinical cohort. The optimised untargeted HRMS methods described herein not only demonstrate that it is possible to distinguish these pathologies in human blood but also that MCI patients 'at risk' of AD could be identified up to 2 years earlier than by conventional clinical diagnosis. Blood-based metabolite profiling of plasma from memory clinic patients is a novel and feasible approach to improving MCI and AD diagnosis and refining clinical trials through better patient stratification.
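As a hedged illustration of the R2/Q2 figures quoted above: scikit-learn has no OPLS-DA, so the sketch below uses plain PLS-DA on one-hot class labels to show how a cross-validated Q2 is typically computed.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold
from sklearn.preprocessing import LabelBinarizer

def plsda_q2(X, labels, n_components=2, n_splits=7):
    """Cross-validated Q2 = 1 - PRESS/TSS for a PLS-DA model fitted on
    one-hot encoded class labels (Ctrl / MCI / MCI_AD in the study)."""
    Y = LabelBinarizer().fit_transform(labels).astype(float)
    tss = np.sum((Y - Y.mean(axis=0)) ** 2)   # total sum of squares
    press = 0.0                               # predictive residual sum of squares
    for train, test in KFold(n_splits, shuffle=True, random_state=0).split(X):
        model = PLSRegression(n_components=n_components).fit(X[train], Y[train])
        press += np.sum((Y[test] - model.predict(X[test])) ** 2)
    return 1.0 - press / tss
```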
Abstract:
Inspired by the commercial application of the Exechon machine, this paper proposes a novel parallel kinematic machine (PKM) named the Exe-Variant. By exchanging the sequence of kinematic pairs in each limb of the Exechon machine, the Exe-Variant PKM adopts a 2UPR/1SPR topology and consists of two identical UPR limbs and one SPR limb. The inverse kinematics of the 2UPR/1SPR parallel mechanism is first analyzed, based on which a conceptual design of the Exe-Variant is carried out. An algorithm for searching the reachable workspaces of the Exe-Variant and the Exechon is then proposed. Finally, the workspaces of two example systems of the Exechon and the Exe-Variant with approximate dimensions are numerically simulated and compared. The comparison shows that the Exe-Variant possesses a workspace competitive with that of the Exechon machine, indicating that it can be used as a promising reconfigurable module in a hybrid 5-DOF machine tool system.
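The abstract does not detail the workspace-searching algorithm; the sketch below shows the generic grid-sampling form such a search often takes, where `ik_solve` is a hypothetical inverse-kinematics routine that returns a joint solution satisfying limb stroke and joint-angle limits, or None for unreachable poses.

```python
import numpy as np

def reachable_workspace(ik_solve, x_range, y_range, z_range, step=5.0):
    """Collect every sampled tool position for which the inverse kinematics
    yields a feasible joint solution; the result approximates the
    reachable workspace of the mechanism."""
    reachable = []
    for x in np.arange(*x_range, step):
        for y in np.arange(*y_range, step):
            for z in np.arange(*z_range, step):
                if ik_solve(np.array([x, y, z])) is not None:
                    reachable.append((x, y, z))
    return np.array(reachable)
```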
Abstract:
Background: The COMET (Core Outcome Measures in Effectiveness Trials) Initiative is developing a publicly accessible online resource to collate the knowledge base for core outcome set (COS) development and the applied work from different health conditions. Ensuring that the database is as comprehensive as possible and keeping it up to date are key to its value for users. This requires the development and application of an optimal, multi-faceted search strategy to identify relevant material. This paper describes the challenges of designing and implementing such a search, outlining the development of the search strategy for studies of COS development and, in turn, the process for establishing a database of COS.
Methods: We investigated the performance characteristics of this strategy, including sensitivity, precision and number needed to read. We compared the contribution of each database to the identification of included studies in order to determine the best combination of methods for retrieving all of them.
Results: Recall of the search strategies ranged from 4% to 87%, and precision from 0.77% to 1.13%. MEDLINE performed best in terms of recall, retrieving 216 (87%) of the 250 included records, followed by Scopus (44%). The Cochrane Methodology Register found just 4% of the included records. MEDLINE was also the database with the highest precision. The number needed to read varied between 89 (MEDLINE) and 130 (Scopus).
Conclusions: We found that two databases and hand searching were required to locate all of the studies in this review. MEDLINE alone retrieved 87% of the included studies, although 97% of the included studies were actually indexed on MEDLINE. The Cochrane Methodology Register did not contribute any records that were not found in the other databases, and will not be included in our future searches to identify studies developing COS. Scopus had the lowest precision (0.77%) and the highest number needed to read (130). In future COMET searches for COS, a balance needs to be struck between the work involved in screening large numbers of records, the frequency of the searching and the likelihood that eligible studies will be identified by means other than the database searches.
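As a quick check of the figures above (recall is the fraction of the 250 included records a database retrieved; the number needed to read, NNR, is the reciprocal of precision):

```python
included_total = 250

medline_recall = 216 / included_total   # 0.864, reported as 87% in the abstract
medline_nnr = 1 / 0.0113                # precision 1.13% -> NNR ~ 89
scopus_nnr = 1 / 0.0077                 # precision 0.77% -> NNR ~ 130

print(f"MEDLINE recall {medline_recall:.0%}, NNR {medline_nnr:.0f}")
print(f"Scopus NNR {scopus_nnr:.0f}")
```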
Abstract:
PURPOSE: Understanding the experience of late effects from the perspective of cancer survivors is essential to inform patient-centred care. This study investigated the nature and onset of late effects experienced by survivors and the manner in which late effects have affected their lives.
METHODS: Sixteen purposively selected cancer survivors participated in a qualitative interview study. The data were analysed inductively using a narrative schema in order to derive the main themes that characterised patients' accounts of late effects.
RESULTS: Individual survivors tended to experience more than one late effect spanning a range of physical and psychological effects. Late effects impacted on relationships, working life, finances and the ability to undertake daily activities. Survivors reported experiencing psychological late effects from around the end of treatment whereas the onset of physical effects occurred later during the post-treatment period. Late effects were managed using formal health services, informal social support and use of 'wellbeing strategies'. Survivors engaged in a process of searching for reasons for experiencing late effects and struggled to make sense of their situation. In particular, a process of 'peer-patient comparison' was used by survivors to help them make sense of, or cope with, their late effects. There appeared to be an association between personal disposition and adaptation and adjustment to the impact of late effects.
CONCLUSIONS: Cancer survivors identified potential components for supported self-management or intervention programmes, as well as important considerations in terms of peer comparisons, personal disposition and making sense of experienced late effects.
Abstract:
Mathematical models are useful tools for the simulation, evaluation, optimal operation and control of solar cells and proton exchange membrane fuel cells (PEMFCs). To identify the model parameters of these two types of cells efficiently, a biogeography-based optimization algorithm with mutation strategies (BBO-M) is proposed. BBO-M uses the structure of the biogeography-based optimization (BBO) algorithm, and both a mutation operator motivated by the differential evolution (DE) algorithm and chaos theory are incorporated into the BBO structure to improve the global search capability of the algorithm. Numerical experiments have been conducted on ten benchmark functions with 50 dimensions, and the results show that BBO-M can produce solutions of high quality and has a fast convergence rate. The proposed BBO-M is then applied to the model parameter estimation of the two types of cells. The experimental results clearly demonstrate the power of the proposed BBO-M in estimating the model parameters of both solar cells and fuel cells.
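A compact sketch of this flavour of algorithm, assuming a linear emigration/immigration model, a DE/rand/1-style mutation and a logistic chaotic map; the parameter choices below are illustrative rather than the paper's:

```python
import numpy as np

def bbo_m(f, dim, pop=50, iters=200, lo=-5.0, hi=5.0, F=0.5, p_mut=0.1):
    """Biogeography-based optimisation with DE-style mutation and a
    chaotic perturbation (minimisation). Illustrative sketch only."""
    X = np.random.uniform(lo, hi, (pop, dim))
    fit = np.apply_along_axis(f, 1, X)
    z = 0.7                                  # chaotic state for the logistic map
    for _ in range(iters):
        order = np.argsort(fit)              # best habitats first
        X, fit = X[order], fit[order]
        mu = np.linspace(1, 0, pop)          # emigration rates (best emigrate most)
        lam = 1 - mu                         # immigration rates
        newX = X.copy()
        for i in range(pop):
            for d in range(dim):
                if np.random.rand() < lam[i]:
                    # migrate a feature from a habitat chosen by emigration rate
                    j = np.random.choice(pop, p=mu / mu.sum())
                    newX[i, d] = X[j, d]
            if np.random.rand() < p_mut:
                # DE/rand/1-style mutation blended with a chaotic perturbation
                a, b, c = np.random.choice(pop, 3, replace=False)
                z = 4.0 * z * (1.0 - z)      # logistic map in (0, 1)
                newX[i] = np.clip(X[a] + F * (X[b] - X[c]) + (z - 0.5) * 0.1, lo, hi)
        newfit = np.apply_along_axis(f, 1, newX)
        keep = newfit < fit                  # greedy replacement
        X[keep], fit[keep] = newX[keep], newfit[keep]
    best = np.argmin(fit)
    return X[best], fit[best]
```

On the 50-dimensional sphere benchmark, for instance, `bbo_m(lambda x: np.sum(x**2), dim=50)` should return a point near the origin.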
Abstract:
Clean and renewable energy generation and supply has drawn much attention worldwide in recent years; proton exchange membrane (PEM) fuel cells and solar cells are among the most popular technologies. Accurately modeling PEM fuel cells as well as solar cells is critical in their applications, and this involves the identification and optimization of model parameters. This is, however, challenging due to the highly nonlinear and complex nature of the models. In particular, for PEM fuel cells the model has to be optimized under different operation conditions, making the solution space extremely complex. In this paper, an improved and simplified teaching-learning-based optimization algorithm (STLBO) is proposed to identify and optimize parameters for these two types of cell models. This is achieved by introducing an elite strategy to improve the quality of the population, and a local search is employed to further enhance the performance of the global best solution. To improve the diversity of the local search, a chaotic map is also introduced. Compared with the basic TLBO, the structure of the proposed algorithm is much simplified and the search ability is significantly enhanced. The performance of the proposed STLBO is first tested and verified on two low-dimension decomposable problems and twelve large-scale benchmark functions, and then on the parameter identification of PEM fuel cell and solar cell models. Intensive experimental simulations show that the proposed STLBO exhibits excellent performance in terms of accuracy and speed, in comparison with results reported in the literature.
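A simplified sketch of a TLBO loop with an elite retention step and a chaotic local search around the best solution; the paper's exact update rules are not reproduced, and the logistic map and step sizes here are assumptions:

```python
import numpy as np

def stlbo(f, dim, pop=30, iters=300, lo=-10.0, hi=10.0):
    """Teaching-learning-based optimisation with elite retention and a
    chaotic local search (minimisation). Illustrative sketch only."""
    X = np.random.uniform(lo, hi, (pop, dim))
    fit = np.apply_along_axis(f, 1, X)
    z = 0.31                                 # logistic-map state for local search
    for _ in range(iters):
        teacher = X[np.argmin(fit)]
        # Teacher phase: move towards the teacher, away from the class mean
        TF = np.random.randint(1, 3)         # teaching factor, 1 or 2
        cand = np.clip(X + np.random.rand(pop, 1) * (teacher - TF * X.mean(axis=0)), lo, hi)
        cfit = np.apply_along_axis(f, 1, cand)
        better = cfit < fit
        X[better], fit[better] = cand[better], cfit[better]
        # Learner phase: learn from a randomly paired peer
        j = np.random.permutation(pop)
        step = np.where((fit < fit[j])[:, None], X - X[j], X[j] - X)
        cand = np.clip(X + np.random.rand(pop, 1) * step, lo, hi)
        cfit = np.apply_along_axis(f, 1, cand)
        better = cfit < fit
        X[better], fit[better] = cand[better], cfit[better]
        # Elite strategy + chaotic local search around the current best
        b = np.argmin(fit)
        z = 4.0 * z * (1.0 - z)
        probe = np.clip(X[b] + (z - 0.5) * (hi - lo) * 0.01, lo, hi)
        if f(probe) < fit[b]:
            X[b], fit[b] = probe, f(probe)
    b = np.argmin(fit)
    return X[b], fit[b]
```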
Abstract:
Some reasons for registering trials might be considered as self-serving, such as satisfying the requirements of a journal in which the researchers wish to publish their eventual findings or publicising the trial to boost recruitment. Registry entries also help others, including systematic reviewers, to know about ongoing or unpublished studies and contribute to reducing research waste by making it clear what studies are ongoing. Other sources of research waste include inconsistency in outcome measurement across trials in the same area, missing data on important outcomes from some trials, and selective reporting of outcomes. One way to reduce this waste is through the use of core outcome sets: standardised sets of outcomes for research in specific areas of health and social care. These do not restrict the outcomes that will be measured, but provide the minimum to include if a trial is to be of the most use to potential users. We propose that trial registries, such as ISRCTN, encourage researchers to note their use of a core outcome set in their entry. This will help people searching for trials and those worried about selective reporting in closed trials. Trial registries can facilitate these efforts to make new trials as useful as possible and reduce waste. The outcomes section in the entry could prompt the researcher to consider using a core outcome set and facilitate the specification of that core outcome set and its component outcomes through linking to the original core outcome set. In doing this, registries will contribute to the global effort to ensure that trials answer important uncertainties, can be brought together in systematic reviews, and better serve their ultimate aim of improving health and well-being through improving health and social care.
Abstract:
OBJECTIVES: To identify the words and phrases that authors used to describe time-to-event outcomes of dental treatments in patients.
MATERIALS AND METHODS: A systematic handsearch of the 50 dental journals with the highest Citation Index for 2008 identified articles reporting dental treatment with time-to-event statistics (included "case" articles, n = 95), without time-to-event statistics (active "control" articles, n = 91), and all other articles (passive "control" articles, n = 6796). The included and active control articles were read, and 43 English words indicating that outcomes were studied over time were identified across the title, aim and abstract. Once identified, these words were sought within the 6796 passive controls. Words were divided into six groups. Differences in use of words were analyzed with Pearson's chi-square across these six groups and the three locations (title, aim, and abstract).
RESULTS: In the abstracts, included articles used group 1 (statistical technique) and group 2 (statistical terms) words more frequently than the active and passive controls (group 1: 35%, 2%, 0.37%, P < 0.001; group 2: 31%, 1%, 0.06%, P < 0.001). The included and active controls used group 3 (quasi-statistical) words equally, but significantly more often than the passive controls (82%, 78%, 3.21%, P < 0.001). In the aims, use of target words was similar for included and active controls, but less frequent for groups 1-4 in the passive controls (P < 0.001). In the title, use of group 2 and groups 3-5 (outcomes) words was similar for included and active controls, but groups 2 and 3 were less frequent in the passive controls (P < 0.001). Significantly more included articles used group 6 words, stating the study duration (54% vs 30%, P = 0.001).
CONCLUSION: All included articles used time-to-event analyses, but two-thirds did not include words to highlight this in the abstract. There is great variation in the words authors used to describe dental time-to-event outcomes. Electronic identification of such articles would be inconsistent, with low sensitivity and specificity. Authors should improve the reporting quality. Journals should allow sufficient space in abstracts to summarize research, and not impose unrealistic word limits. Readers should be mindful of these problems when searching for relevant articles. Additional research is required in this field.
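As a hedged illustration of the comparison above, the counts below are reconstructed approximately from the reported percentages for group 1 words in abstracts (35% of 95 included, 2% of 91 active controls, 0.37% of 6796 passive controls):

```python
import numpy as np
from scipy.stats import chi2_contingency

used  = np.array([33, 2, 25])        # approximate article counts using group 1 words
total = np.array([95, 91, 6796])     # included, active control, passive control
table = np.vstack([used, total - used])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")  # p far below 0.001, as reported
```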
Abstract:
Pancreatic adenocarcinoma is the fourth leading cause of cancer death and has an extremely poor prognosis: the 5-year survival probability is less than 5% for all stages. The only chance for cure or longer survival is surgical resection; however, only 10% to 20% of patients have resectable disease. Although surgical techniques have improved, most patients who undergo complete resection experience a recurrence. Adjuvant systemic therapy reduces the recurrence rate and improves outcomes. There is a potential role for radiation therapy as part of treatment for locally advanced disease, although its use in both the adjuvant and neoadjuvant settings remains controversial. Palliative systemic treatment is the only option for patients with metastatic disease. To date, however, only the gemcitabine plus erlotinib combination, and recently the FOLFIRINOX regimen, have been associated with relatively small but statistically significant improvements in overall survival (OS) when compared directly with gemcitabine alone. Although several meta-analyses have suggested a benefit associated with combination chemotherapy, whether this benefit is clinically meaningful remains unclear, particularly in light of the enhanced toxicity associated with combination regimens. There is growing evidence that the exceptionally poor prognosis in pancreatic cancer is caused by the tumor's characteristically abundant desmoplastic stroma, which plays a critical role in tumor cell growth, invasion, metastasis, and chemoresistance. Carefully designed clinical trials that include translational analysis will provide a better understanding of the tumor biology and its relation to the host stromal cells. Future directions will involve testing new targeted agents, understanding the pharmacodynamics of our current targeted agents, searching for predictive and prognostic biomarkers, and exploring the efficacy of different combination strategies.
Abstract:
Environmental problems, especially climate change, have become a serious global issue. In the construction industry, the concept of sustainable building is developing to reduce greenhouse gas emissions. In this study, a building information modeling (BIM) based building design optimization method is proposed to help designers optimize their designs and improve buildings' sustainability. A revised particle swarm optimization (PSO) algorithm is applied to search for the trade-off between the life cycle costs (LCC) and life cycle carbon emissions (LCCE) of building designs. In order to validate the effectiveness and efficiency of this method, a case study of an office building is conducted in Hong Kong. The result of the case study shows that this method can enlarge the search space for optimal design solutions and shorten the processing time for optimal design results, which is helpful for designers seeking to deliver an economical and environmentally friendly design scheme.
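A plain PSO sketch of the search described (the paper's revisions to PSO are not reproduced); the objective is assumed to be a weighted sum of LCC and LCCE computed from the BIM model, with `alpha`, `lcc` and `lcce` as hypothetical names:

```python
import numpy as np

def pso(cost, dim, pop=40, iters=200, lo=0.0, hi=1.0, w=0.7, c1=1.5, c2=1.5):
    """Standard particle swarm optimisation (minimisation); `cost` scores a
    candidate design vector. Illustrative sketch only."""
    X = np.random.uniform(lo, hi, (pop, dim))       # candidate design variables
    V = np.zeros((pop, dim))
    P, pbest = X.copy(), np.apply_along_axis(cost, 1, X)
    g = P[np.argmin(pbest)]                         # global best position
    for _ in range(iters):
        r1, r2 = np.random.rand(pop, dim), np.random.rand(pop, dim)
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)
        f = np.apply_along_axis(cost, 1, X)
        better = f < pbest
        P[better], pbest[better] = X[better], f[better]
        g = P[np.argmin(pbest)]
    return g, pbest.min()

# Hypothetical trade-off objective, with lcc/lcce supplied by the BIM model:
# best, score = pso(lambda x: alpha * lcc(x) + (1 - alpha) * lcce(x), dim=6)
```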
Abstract:
Objective: To summarise the findings of an updated Cochrane review of interventions aimed at improving the appropriate use of polypharmacy in older people. Design: Cochrane systematic review. Multiple electronic databases were searched, including MEDLINE, EMBASE and the Cochrane Central Register of Controlled Trials (from inception to November 2013). Hand searching of references was also performed. Randomised controlled trials (RCTs), controlled clinical trials, controlled before-and-after studies and interrupted time series analyses reporting on interventions targeting appropriate polypharmacy in older people in any healthcare setting were included if they used a validated measure of prescribing appropriateness. Evidence quality was assessed using the Cochrane risk of bias tool and GRADE (Grades of Recommendation, Assessment, Development and Evaluation).
Setting: All healthcare settings.
Participants: Older people (≥65 years) with ≥1 long-term condition who were receiving polypharmacy (≥4 regular medicines).
Primary and secondary outcome measures: Primary outcomes were the change in prevalence of appropriate polypharmacy and hospital admissions. Medication-related problems (eg, adverse drug reactions), medication adherence and quality of life were included as secondary outcomes.
Results: 12 studies were included: 8 RCTs, 2 cluster RCTs and 2 controlled before-and-after studies. 1 study involved computerised decision support and 11 comprised pharmaceutical care approaches across various settings. Appropriateness was measured using validated tools, including the Medication Appropriateness Index, Beers' criteria and the Screening Tool of Older Persons' Prescriptions (STOPP)/Screening Tool to Alert doctors to Right Treatment (START). The interventions demonstrated a reduction in inappropriate prescribing. Evidence of effect on hospital admissions and medication-related problems was conflicting. No differences in health-related quality of life were reported.
Conclusions: The included interventions demonstrated improvements in appropriate polypharmacy based on reductions in inappropriate prescribing. However, it remains unclear if interventions resulted in clinically significant improvements (eg, in terms of hospital admissions). Future intervention studies would benefit from available guidance on intervention development, evaluation and reporting to facilitate replication in clinical practice.
Abstract:
We investigate the cell coverage optimization problem for the massive multiple-input multiple-output (MIMO) uplink. By deploying tilt-adjustable antenna arrays at the base stations, cell coverage optimization becomes a promising technique that is able to strike a compromise between covering cell-edge users and suppressing pilot contamination. We formulate a detailed description of this optimization problem by maximizing the cell throughput, which is shown to be mainly determined by the user distribution within several key geometrical regions. The formulated problem is then applied to different example scenarios: for a network with hexagonal cells and uniformly distributed users, we derive an analytical lower bound on the ergodic throughput in the objective cell, based on which it is shown that the optimal choice of cell coverage should ensure that the coverage of different cells does not overlap; for a more generic network with sectoral cells and non-uniformly distributed users, we propose an analytical approximation of the ergodic throughput. A practical coverage optimization algorithm is then proposed, in which the optimal solution can be easily obtained through a simple one-dimensional line search within a confined search region. Our numerical results show that the proposed coverage optimization method greatly increases the system throughput in macrocells for massive MIMO uplink transmission, compared with traditional schemes in which the cell coverage is fixed.
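The one-dimensional line search mentioned above can be as simple as a golden-section search over the confined region; in this sketch the objective `f` is a placeholder for the paper's throughput bound or approximation, assumed unimodal on the interval:

```python
import numpy as np

def golden_section_max(f, a, b, tol=1e-4):
    """Golden-section line search maximising a unimodal function f on [a, b],
    e.g. cell throughput as a function of coverage radius or downtilt."""
    phi = (np.sqrt(5) - 1) / 2
    x1, x2 = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if f(x1) < f(x2):            # maximum lies in [x1, b]
            a, x1 = x1, x2
            x2 = a + phi * (b - a)
        else:                        # maximum lies in [a, x2]
            b, x2 = x2, x1
            x1 = b - phi * (b - a)
    return (a + b) / 2
```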
Abstract:
Single component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant sum problem (closure) and the inherently multivariate, relative information conveyed by compositional data. A well-known example is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles). This is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method that can answer the required geological or geochemical question whilst maintaining the integrity of the compositional nature of the data. The required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
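As a concrete illustration, the centred log-ratio (clr) transform is one standard way to open compositional data before classical statistics are applied, and a knowledge-driven log-ratio can be as simple as ratioing a metal against SiO2 to filter quartz dilution (the column names below are hypothetical):

```python
import numpy as np

def clr(comp):
    """Centred log-ratio transform: log of each part divided by the
    geometric mean of the composition, applied row-wise."""
    comp = np.asarray(comp, dtype=float)
    g = np.exp(np.mean(np.log(comp), axis=-1, keepdims=True))  # geometric mean
    return np.log(comp / g)

# Knowledge-driven log-ratio, e.g. a heavy metal ratioed against SiO2:
# soil["log_Pb_SiO2"] = np.log(soil["Pb"] / soil["SiO2"])
```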
Abstract:
Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more of the computing cores on which they execute fail. This imposes not only a cost for maintaining the job, but also the cost of the time taken to reinstate it and the risk of losing the data and execution the job accomplished before it failed. Approaches that can proactively detect computing core failures and take action to relocate the affected job onto reliable cores can make a significant step towards automating fault tolerance. Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core levels. Experiments are pursued in the context of genome searching, a popular computational biology application. Result: The key conclusion is that the approaches proposed are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical experiment in which fault tolerance is studied, centralised and decentralised checkpointing approaches on average add 90% to the actual time for executing the job. In the same experiment, the multi-agent approaches add only 10% to the overall execution time.
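To put those overheads in perspective, for a job with base execution time T:

```python
T = 1.0                          # base job execution time (arbitrary units)
checkpointing = T * (1 + 0.90)   # checkpointing adds ~90% on average
multi_agent   = T * (1 + 0.10)   # multi-agent approaches add ~10%
print(f"checkpointed run is {checkpointing / multi_agent:.2f}x longer")  # ~1.73x
```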