840 results for hybrid prediction method
Abstract:
Introduction
Mild cognitive impairment (MCI) has clinical value in its ability to predict later dementia. A better understanding of cognitive profiles can further help delineate who is most at risk of conversion to dementia. We aimed to (1) examine to what extent the usual MCI subtyping using core criteria corresponds to empirically defined clusters of patients, derived from latent profile analysis (LPA) of continuous neuropsychological data, and (2) compare the two methods of subtyping memory clinic participants in their prediction of conversion to dementia.
Methods
Memory clinic participants (MCI, n = 139) and age-matched controls (n = 98) were recruited. Participants had a full cognitive assessment, and results were grouped (1) according to traditional MCI subtypes and (2) using LPA. MCI participants were followed over approximately 2 years after their initial assessment to monitor for conversion to dementia.
Results
Groups were well matched for age and education. Controls performed significantly better than MCI participants on all cognitive measures. With the traditional analysis, most MCI participants were in the amnestic multidomain subgroup (46.8%), and this group was the most at risk of conversion to dementia (63%). From the LPA, a three-profile solution fit the data best. Profile 3 was the largest group (40.3%), the most cognitively impaired, and the most at risk of conversion to dementia (68% of the group).
Discussion
LPA provides a useful adjunct in delineating MCI participants most at risk of conversion to dementia and adds confidence to standard categories of clinical inference.
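As a rough illustration of the clustering step, latent profile analysis of continuous test scores can be approximated with a Gaussian mixture model; the sketch below uses scikit-learn and synthetic z-scores, with the number of profiles chosen by BIC. The four-test battery and the data themselves are placeholders, not the study's actual measures.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
scores = rng.normal(size=(139, 4))  # placeholder z-scores: 139 participants, 4 tests

# Fit 1- to 5-profile models and keep the lowest-BIC solution,
# mirroring the usual LPA model-selection procedure.
models = [GaussianMixture(n_components=k, covariance_type="diag",
                          random_state=0).fit(scores) for k in range(1, 6)]
best = min(models, key=lambda m: m.bic(scores))
profiles = best.predict(scores)  # profile membership per participant
print(best.n_components, np.bincount(profiles))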
Abstract:
The Arc-Length Method is a solution procedure that enables a generic non-linear problem to pass limit points. Some examples are provided of mode-jumping problem solutions using a commercial finite element package, and further investigations are carried out on a simple structure whose numerical solution can be compared with an analytical one. It is shown that the Arc-Length Method is not reliable when bifurcations are present in the primary equilibrium path; the presence of very sharp snap-backs or special boundary conditions may also cause convergence difficulty at limit points. An improvement to the predictor used in the incremental procedure is suggested, together with a reliable criterion for selecting either solution of the quadratic arc-length constraint. The gap that is sometimes observed between the experimental load level of mode-jumping and its arc-length prediction is explained through an example.
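The root-selection issue the abstract raises can be made concrete with a small sketch of the cylindrical (Crisfield-type) arc-length constraint, which yields a quadratic in the load-factor correction; the alignment criterion shown below is the textbook one, not the improved criterion this work proposes, and all variable names are illustrative.

import numpy as np

def pick_root(du, du_bar, du_t, dl2):
    """Cylindrical arc-length constraint: the corrected increment
    du + du_bar + dlam * du_t must have squared norm dl2, giving a
    quadratic in dlam. Of the two roots, keep the one whose
    increment best aligns with the current increment du (textbook
    criterion; the paper argues for a more reliable one)."""
    a = du_t @ du_t
    b = 2.0 * du_t @ (du + du_bar)
    c = (du + du_bar) @ (du + du_bar) - dl2
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        raise RuntimeError("complex roots: reduce the arc-length radius")
    roots = [(-b + s * np.sqrt(disc)) / (2.0 * a) for s in (1.0, -1.0)]
    return max(roots, key=lambda r: du @ (du + du_bar + r * du_t))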
Abstract:
This paper is concerned with the application of an automated hybrid approach in addressing the university timetabling problem. The approach described is based on the nature-inspired artificial bee colony (ABC) algorithm. An ABC algorithm is a biologically-inspired optimization approach, which has been widely implemented in solving a range of optimization problems in recent years, such as job shop scheduling and machine timetabling problems. Although the approach has proven to be robust across a range of problems, it is acknowledged within the literature that there currently exist a number of inefficiencies regarding its exploration and exploitation abilities. These inefficiencies can often lead to a slow convergence speed within the search process. Hence, this paper introduces a variant of the algorithm which utilizes a global best model inspired by particle swarm optimization to enhance the global exploration ability, while hybridizing with the great deluge (GD) algorithm in order to improve the local exploitation ability. Using this approach, an effective balance between exploration and exploitation is attained. In addition, a traditional local search approach is incorporated within the GD algorithm with the aim of further enhancing the performance of the overall hybrid method. To evaluate the performance of the proposed approach, two diverse university timetabling datasets are investigated, i.e., Carter's examination timetabling and Socha course timetabling datasets. It should be noted that both problems have differing complexity and different solution landscapes. Experimental results demonstrate that the proposed method is capable of producing high quality solutions across both these benchmark problems, showing a good degree of generality in the approach. Moreover, the proposed method produces the best results on some instances when compared with other approaches presented in the literature.
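The great deluge component that provides the local exploitation can be sketched as below; the linear water-level decay, the cost and neighbour callables, and minimisation are assumptions for illustration, and the gbest-guided bee updates are omitted.

def great_deluge(initial, neighbour, cost, iters, decay):
    """Minimal great deluge loop (minimisation): a candidate move is
    accepted if its cost is not worse than the current solution or
    falls below a slowly receding 'water level'."""
    current, current_cost = initial, cost(initial)
    level = current_cost
    for _ in range(iters):
        cand = neighbour(current)
        cand_cost = cost(cand)
        if cand_cost <= current_cost or cand_cost <= level:
            current, current_cost = cand, cand_cost
        level -= decay  # linear lowering of the water level
    return current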
Abstract:
Generating timetables for an institution is a challenging and time consuming task due to different demands on the overall structure of the timetable. In this paper, a new hybrid method combining the great deluge and artificial bee colony algorithms (INMGD-ABC) is proposed to address the university timetabling problem. The artificial bee colony algorithm (ABC) is a population based method that has been introduced in recent years and has proven successful in solving various optimization problems effectively. However, as with many search based approaches, there exist weaknesses in its exploration and exploitation abilities which tend to induce slow convergence of the overall search process. Therefore, hybridization is proposed to compensate for the identified weaknesses of the ABC. Also, inspired by imperialist competitive algorithms, an assimilation policy is implemented in order to improve the global exploration ability of the ABC algorithm. In addition, the Nelder–Mead simplex search method is incorporated within the great deluge algorithm (NMGD) with the aim of enhancing the exploitation ability of the hybrid method in fine-tuning the problem search region. The proposed method is tested on two differing benchmark datasets, i.e., examination and course timetabling datasets. A statistical t-test shows that the proposed approach performs significantly better than the basic ABC algorithm. Finally, the experimental results are compared against state-of-the-art methods in the literature; the results obtained are competitive and in certain cases achieve some of the current best results in the literature.
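The assimilation policy borrowed from imperialist competitive algorithms is, in its usual continuous form, a pull toward the current best solution; the vector sketch below assumes real-valued encodings, whereas mapping it onto discrete timetable moves is problem-specific and not detailed in the abstract.

import numpy as np

def assimilate(colony, imperialist, beta=2.0, seed=None):
    """Standard ICA assimilation move: shift a solution ('colony')
    part of the way toward the best solution ('imperialist'), with
    a random step length in [0, beta) per coordinate."""
    rng = np.random.default_rng(seed)
    return colony + beta * rng.random(colony.shape) * (imperialist - colony)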
Abstract:
In this paper, we present a hybrid mixed cost-function adaptive initialization algorithm for the time domain equalizer (TEQ) in a discrete multitone (DMT)-based asymmetric digital subscriber loop. Using our approach, a higher convergence rate than that of the commonly used least-mean-square (LMS) algorithm is obtained, whilst attaining bit rates close to the optimum maximum shortening SNR and the upper bound SNR. Moreover, our proposed method outperforms the minimum mean-squared error design for a range of TEQ filter lengths.
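For reference, the LMS baseline that the proposed initialisation is compared against can be sketched as follows; the tap count, step size and training signals are placeholders.

import numpy as np

def lms_teq(x, d, taps, mu):
    """Baseline LMS adaptation of a time-domain equalizer: the
    weight vector w is nudged along the instantaneous gradient of
    |d[n] - w.x[n]|^2. This is the commonly used reference
    algorithm, not the paper's hybrid mixed cost-function scheme."""
    w = np.zeros(taps)
    for n in range(taps, len(x)):
        xn = x[n - taps:n][::-1]  # most recent samples first
        e = d[n] - w @ xn         # instantaneous error
        w = w + mu * e * xn       # stochastic-gradient step
    return w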
Abstract:
The separation of enantiomers and confirmation of their absolute configurations are significant in the development of chiral drugs. The interactions between the enantiomers of a chiral pyrazole derivative and the polysaccharide-based chiral stationary phase cellulose tris(4-methylbenzoate) (Chiralcel OJ) in seven solvents and at different temperatures were studied using molecular dynamics simulations. The results show that the solvent effect has a remarkable influence on the interactions. Structure analysis discloses that the different interactions between the two isomers and the chiral stationary phase depend on the nature of the solvent, which may invert the elution order. The computational method in the present study can be used to predict the elution order and the absolute configurations of enantiomers in HPLC separations and would therefore be valuable in the development of chiral drugs.
Abstract:
Natural mineral-water interface reactions drive ecosystem and global fluoride (F−) cycling. These small-scale processes are challenging to monitor because mobilization is highly localized and variable, influenced by changing climate, hydrology, dissolution chemistry and pedogenesis. These release events could be captured in situ by the passive sampling technique diffusive gradients in thin-films (DGT), providing a cost-effective and time-integrated measurement of F− mobilization. However, attempts to develop the method for F− have been unsuccessful due to the very restrictive operational ranges within which most F− adsorbents function. A new hybrid-DGT technique for F− quantification containing a three-phase fine-particle composite (Fe-Al-Ce, FAC) adsorbent was developed and evaluated. Sampler response was validated in laboratory and field deployments, passing solution-chemistry QC within ionic strength and pH ranges of 0–200 mmol L−1 and 4.3–9.1, respectively, and exhibiting a high sorption capacity (98 ± 8 μg cm−2). FAC-DGT measurements adequately predicted up to week-long averaged in situ F− fluvial fluxes in a freshwater river and F− concentrations in a wastewater treatment flume determined by high-frequency active sampling. In addition, millimetre-scale diffusive fluxes across the sediment-water interface were modeled for three contrasting lake-bed sediments from a F−-enriched lake using the new FAC-DGT platform.
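The time-integrated measurement rests on the standard DGT equation, C = M·Δg/(D·A·t); the sketch below applies it with illustrative numbers, since the paper's calibration values (diffusion coefficient, gel thickness) are not given in the abstract.

def dgt_concentration(M_ng, dg_cm, D_cm2_s, A_cm2, t_s):
    """Standard DGT equation: time-averaged solution concentration
    (ng mL^-1) from accumulated mass M (ng), diffusive-layer
    thickness dg (cm), diffusion coefficient D (cm^2 s^-1),
    exposure area A (cm^2) and deployment time t (s)."""
    return M_ng * dg_cm / (D_cm2_s * A_cm2 * t_s)

# Illustrative values only: 500 ng F- accumulated over a 7-day deployment.
print(dgt_concentration(M_ng=500, dg_cm=0.094, D_cm2_s=1.3e-5,
                        A_cm2=3.14, t_s=7 * 24 * 3600))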
Abstract:
Laser transmission joining (LTJ) is growing in importance and has the potential to become a niche technique for the fabrication of hybrid plastic-metal joints for medical device applications. The possibility of directly joining plastics to metals by LTJ has been demonstrated by a number of recent studies. However, a reliable and quantitative method for defining the contact area between the plastic and the metal, which would facilitate calculation of the mechanical shear stress of the hybrid joints, is still lacking. A new method, based on image analysis using ImageJ, is proposed here to quantify the contact area at the joint interface. The effect of discolouration on the mechanical performance of the hybrid joints is also reported for the first time. Biocompatible polyethylene terephthalate (PET) and commercially pure titanium (Ti) were selected as materials for laser joining using a 200 W CW fibre laser system. The effects of laser power, scanning speed and stand-off distance between the nozzle tip and the top surface of the plastic were studied and analysed using a Taguchi L9 orthogonal array and ANOVA, respectively. The surface morphology, structure and elemental composition of the PET and Ti surfaces after shearing/peeling apart were characterized by SEM, EDX, XRD and XPS.
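Because the proposed method reduces to counting bonded pixels at the interface, the shear-stress calculation it enables can be sketched as below; the threshold, pixel size and the assumption that bright pixels mark bonded area are all illustrative stand-ins for the ImageJ workflow.

import numpy as np

def shear_strength(gray, pixel_mm, load_N, threshold):
    """Estimate joint shear strength from an interface image:
    threshold the image to segment bonded area (as one might in
    ImageJ), convert the pixel count to mm^2, and divide the
    failure load by that contact area (N/mm^2 = MPa)."""
    bonded = gray > threshold             # assumed: bright = bonded
    area_mm2 = bonded.sum() * pixel_mm ** 2
    return load_N / area_mm2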
Abstract:
In this thesis, 2,2’-bipyridine (bipy), di-tert-butyl-2,2’-bipyridine (di-t-Bu-bipy), 2,2’-bipyridine-5,5’-dicarboxylic acid (H2bpdc), 2-[3(5)-pyrazolyl]pyridine (pzpy) and 2-(1-pentyl-3-pyrazolyl)pyridine (pent-pp) ligands were used as the N,N-chelate ligands in the formation of discrete [MoO2Cl2L]-type complexes. These complexes were employed as precursors for the preparation in aqueous media of oxomolybdenum(VI) products with a wide range of structural diversity. Three distinct heating methods were studied: hydrothermal, reflux or microwave-assisted synthesis. An alternative reaction with inorganic molybdenum(VI) trioxide (MoO3) and the ligands di-t-Bu-bipy, H2bpdc and pzpy was also investigated under hydrothermal conditions. The distinct nature of the N,N-chelate ligands and/or the heating method employed promoted the isolation of a series of new oxomolybdenum(VI) hybrid materials that clearly reflected the strong structure-directing influence of these ligands. Thus, this thesis describes the synthesis and characterization of the discrete mononuclear [MoO2Cl2(pent-pp)], the dinuclear [Mo2O6(di-t-Bu-bipy)2] and the octanuclear [Mo8O22(OH)4(di-t-Bu-bipy)4] complexes, as well as the highly unusual polymeric materials {[MoO3(bipy)][MoO3(H2O)]}n, (DMA)[MoO3(Hbpdc)]·nH2O, [Mo3O9(pzpy)]n and [Mo2O6(pent-pp)]n (fine structural details of compound [Mo2O6(pent-pp)]n are presently unknown; however, characterization data strongly point toward a polymeric oxide hybrid compound). The catalytic behaviour of the discrete complexes and the polymeric compounds was tested in olefin epoxidation reactions. Compounds [Mo3O9(pzpy)]n and [Mo2O6(pent-pp)]n acted as sources of soluble active species that were identified as the oxodiperoxido complexes [MoO(O2)2(pzpy)] and [MoO(O2)2(pent-pp)], respectively. Most of the compounds presented here were fully characterized using solid-state techniques, namely elemental analysis, thermogravimetry, FT-IR, solid-state NMR, electron microscopy and powder X-ray diffraction (using laboratory and/or synchrotron sources).
Abstract:
Doctoral thesis, Electronic Engineering and Telecommunications (Signal Processing), Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014
Abstract:
Doctoral thesis, Informatics (Bioinformatics), Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
Adequate user authentication is a persistent problem, particularly with mobile devices, which tend to be highly personal and at the fringes of an organisation's influence. Yet these devices are being used increasingly in various business settings, where they pose a risk to security and privacy, not only from the sensitive information they may contain, but also from the means they typically offer to access such information over wireless networks. User authentication is the first line of defence for a mobile device that falls into the hands of an unauthorised user. However, motivating users to enable simple password mechanisms and to update their authentication information periodically is difficult at best. This paper examines some of the issues relating to the use of biometrics as a viable method of authentication on mobile wireless devices. It also critically analyses some of the techniques currently employed and, where appropriate, suggests novel hybrid ways in which they could be improved or modified. Both the biometric technology and the wireless-setting constraints that determine the feasibility and performance of the authentication feature are specified. Some well-known biometric technologies are briefly reviewed and their feasibility for wireless and mobile use is assessed. Furthermore, a number of quantitative and qualitative parameters for evaluation are presented. Biometric technologies are continuously advancing toward commercial implementation in wireless devices. When carefully designed and implemented, the advantage of biometric authentication arises mainly from increased convenience and coexistent improved security.
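Among the quantitative parameters typically used for such evaluations are the false acceptance rate (FAR), false rejection rate (FRR) and equal error rate (EER); a generic sketch of their computation from match scores is given below, assuming higher scores mean better matches.

import numpy as np

def far_frr_eer(genuine, impostor):
    """Sweep candidate thresholds over all observed match scores and
    return the equal error rate, i.e. the operating point where the
    false acceptance rate (impostors accepted) and false rejection
    rate (genuine users rejected) are closest."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    i = np.argmin(np.abs(far - frr))
    return (far[i] + frr[i]) / 2.0, thresholds[i]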
Abstract:
This paper introduces a novel method of estimating the Fourier transform of deterministic continuous-time signals from a finite number N of their nonuniformly spaced measurements. These samples, located at a mixture of deterministic and random time instants, are collected at sub-Nyquist rates, since no constraints are imposed on either the bandwidth or the spectral support of the processed signal. It is shown that the proposed estimation approach converges uniformly for all frequencies at the rate N^−5 or faster. This implies that it significantly outperforms its alias-free-sampling-based predecessors, namely stratified and antithetical stratified estimates, which are shown to converge uniformly at a rate of N^−1. Simulations are presented to demonstrate the superior performance and low complexity of the introduced technique.
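A generic direct estimator of the Fourier transform from nonuniform samples, shown below for orientation, weights each sample by its local spacing; the paper's hybrid deterministic/random scheme and its N^−5 convergence machinery are not reproduced here.

import numpy as np

def ft_estimate(t, x, f):
    """Riemann-type estimate of X(f) = integral of x(t) e^{-j 2 pi f t} dt
    from nonuniformly spaced samples (t_k, x_k), weighting each sample
    by its approximate local spacing. Generic illustration only, not
    the paper's estimator."""
    w = np.gradient(t)  # approximate local sample spacing
    return np.sum(x * w * np.exp(-2j * np.pi * f * t))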
Abstract:
This paper proposes a new methodology to reduce the probability of occurrence of states that cause load curtailment, while minimizing the costs involved in achieving that reduction. The methodology is supported by a hybrid method based on fuzzy sets and Monte Carlo simulation to capture both the randomness and the fuzziness of component outage parameters of the transmission power system. The novelty of this research work consists in proposing two fundamental approaches: 1) a global steady approach, which builds the model of a faulted transmission power system aiming at minimizing the unavailability corresponding to each faulted component; this results in the minimal global investment cost for the faulted components in a sample of system states of the transmission network; 2) a dynamic iterative approach, which checks individually the effect of each investment on the transmission network. A case study using the 1996 IEEE 24-bus Reliability Test System (RTS) is presented to illustrate in detail the application of the proposed methodology.
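One way to read the fuzzy/Monte Carlo hybrid is that an alpha-cut resolves the fuzziness of each outage parameter and a Bernoulli draw resolves its randomness; the sketch below assumes triangular fuzzy forced-outage rates, an illustrative assumption rather than the paper's stated model.

import numpy as np

def sample_states(fuzzy_rates, n, alpha, seed=None):
    """Hybrid fuzzy/Monte Carlo state sampling sketch: each
    component's outage rate is a triangular fuzzy number
    (lo, mode, hi); the alpha-cut gives a crisp interval, a uniform
    draw picks a rate inside it, and a Bernoulli draw decides
    whether the component is failed (True) in each sampled state."""
    rng = np.random.default_rng(seed)
    states = []
    for lo, mode, hi in fuzzy_rates:
        a = lo + alpha * (mode - lo)   # alpha-cut lower bound
        b = hi - alpha * (hi - mode)   # alpha-cut upper bound
        u = rng.uniform(a, b, n)       # crisp rate per sampled state
        states.append(rng.random(n) < u)
    return np.array(states)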
Abstract:
Many current e-commerce systems provide personalization when their content is shown to users. In this sense, recommender systems make personalized suggestions and provide information about items available in the system. Nowadays, there is a vast number of methods, including data mining techniques, that can be employed for personalization in recommender systems. However, these methods are still quite vulnerable to some limitations and shortcomings related to the recommender environment. In order to deal with some of them, in this work we implement a recommendation methodology in a recommender system for tourism, where classification based on association is applied. Classification-based-on-association methods, also named associative classification methods, constitute an alternative data mining technique which combines concepts from classification and association in order to allow association rules to be employed in a prediction context. The proposed methodology was evaluated in some case studies, where we could verify that it is able to mitigate limitations present in recommender systems and to enhance recommendation quality.
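In associative classification, mined class-association rules are reused as a predictor; a CBA-style sketch is given below, where the rule list, its sort order and the tourism items are hypothetical examples rather than the system's actual data.

def predict(items, rules, default):
    """CBA-style prediction: rules are (antecedent_set, label,
    confidence) tuples, assumed already mined and sorted by
    confidence; the first rule whose antecedent is contained in
    the user's item set fires, else a default label is returned."""
    for antecedent, label, confidence in rules:
        if antecedent <= items:  # set containment
            return label
    return default

# Hypothetical tourism example.
rules = [({"beach", "summer"}, "coastal_tour", 0.9),
         ({"museum"}, "city_tour", 0.8)]
print(predict({"beach", "summer", "family"}, rules, "generic_tour"))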