292 results for RM extended algorithm
Abstract:
BACKGROUND: Donor retention is vital to blood collection agencies. Past research has highlighted the importance of early career behavior for long-term donor retention, yet research investigating the determinants of early donor behavior is scarce. Using an extended Theory of Planned Behavior (TPB), this study sought to identify the predictors of first-time blood donors' early career retention. STUDY DESIGN AND METHODS: First-time donors (n = 256) completed three surveys on blood donation. The standard TPB predictors and self-identity as a donor were assessed at 3 weeks (Time 1) and 4 months (Time 2) after an initial donation. Path analyses examined the utility of the extended TPB to predict redonation at 4 and 8 months after the initial donation. RESULTS: The extended TPB provided a good fit to the data. Post-Time 1 and post-Time 2 behavior was consistently predicted by intention to redonate. Further, intention was predicted by attitudes, perceived control, and self-identity (Times 1 and 2). Donors' intention to redonate at Time 1 was the strongest predictor of intention to donate at Time 2, while donors' behavior at Time 1 strengthened self-identity as a blood donor at Time 2. CONCLUSION: An extended TPB framework proved efficacious in revealing the determinants of first-time donor retention over an initial 8-month period. The results suggest that collection agencies should intervene to bolster donors' attitudes, perceived control, and identity as a donor during this crucial post–first donation period.
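As a concrete illustration of the path-analytic setup described above, the sketch below specifies a simplified extended-TPB model with the Python SEM package semopy. The variable names and the data frame are illustrative assumptions; the study's actual model also links the Time 1 and Time 2 measurements.

```python
# A hedged sketch of an extended-TPB path model in semopy; variable names and
# the pandas DataFrame `df` (one row per donor) are illustrative assumptions,
# not the study's dataset or its full cross-lagged specification.
from semopy import Model

desc = """
intention ~ attitude + subjective_norm + perceived_control + self_identity
redonated ~ intention
"""
model = Model(desc)
# df: DataFrame with columns matching the names above
# model.fit(df)
# print(model.inspect())  # path coefficients and significance tests
```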
Abstract:
In this paper, a polynomial time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot be used in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, together with a detailed complexity and accuracy analysis. It is shown that the average-case time complexity of the algorithm is Θ(n²), while the best and worst cases are Ω(n) and O(n³) respectively. This represents a vast improvement in the upper bound over current methods, without compromising average-case performance.
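The core of the approach can be illustrated on a simplified 1D analogue. The sketch below decides, by eliminating locally inconsistent neighborhoods, whether a configuration of an elementary cellular automaton on a ring has a pre-image; this reduction is mine, and the paper's graph-CA algorithm is more general.

```python
# A simplified 1D analogue of the neighborhood-elimination idea: discard local
# neighborhoods that cannot produce the target cell, then check whether a
# consistent cyclic chain of surviving neighborhoods remains.
import numpy as np

def has_preimage(rule_table, target):
    """rule_table maps neighborhood triples (l, c, r) to the successor bit;
    target is a list of bits with periodic boundary. True iff a pre-image exists."""
    n = len(target)
    M = np.identity(4, dtype=int)      # overlap pair (a, b) encoded as 2*a + b
    for i in range(n):
        T = np.zeros((4, 4), dtype=int)
        for a in (0, 1):
            for b in (0, 1):
                for c in (0, 1):
                    # Neighborhood (a, b, c) survives elimination at cell i
                    # only if it actually yields target[i].
                    if rule_table[(a, b, c)] == target[i]:
                        T[2 * a + b, 2 * b + c] = 1
        M = np.clip(M @ T, 0, 1)       # chains of mutually consistent neighborhoods
    return bool(np.trace(M))           # a consistent cyclic chain = a pre-image

# Rule 90: each cell becomes the XOR of its two neighbors.
rule90 = {(a, b, c): a ^ c for a in (0, 1) for b in (0, 1) for c in (0, 1)}
print(has_preimage(rule90, [1, 0, 0, 0]))  # False: a Garden-of-Eden configuration
```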
Abstract:
This paper elaborates the approach used by the Applied Data Mining Research Group (ADMRG) for the Social Event Detection (SED) tasks of the 2013 MediaEval Benchmark. We extended a constrained clustering algorithm for the first, semi-supervised clustering task, and we compared several classifiers using Latent Dirichlet Allocation as a feature selector in the second, event classification task. The proposed approach focuses on scalability and efficient memory allocation when applied to high-dimensional data with large clusters. Results of the first task show the effectiveness of the proposed method. Results from task 2 indicate that attention to imbalanced category distributions is needed.
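For the second task, a pipeline of the kind described (LDA topic proportions feeding a classifier) can be sketched with scikit-learn as below; the vectorizer settings, topic count, and choice of logistic regression are illustrative assumptions, not ADMRG's actual configuration.

```python
# A hedged sketch: LDA as feature selector for event classification.
# Hyperparameters are assumptions; class_weight="balanced" addresses the
# imbalanced category distributions the abstract mentions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

pipeline = make_pipeline(
    CountVectorizer(max_features=10_000),        # bag-of-words from post text
    LatentDirichletAllocation(n_components=50),  # doc-topic proportions as features
    LogisticRegression(max_iter=1000, class_weight="balanced"),
)
# docs: list of raw texts; labels: event categories
# pipeline.fit(docs, labels); pipeline.predict(new_docs)
```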
Abstract:
An Application Specific Instruction-set Processor (ASIP) is a specialized processor tailored to run a particular application, or set of applications, efficiently. However, when there are multiple candidate applications in the application's domain, it is difficult and time consuming to find the optimum set of applications to implement. Existing ASIP design approaches perform this selection manually, based on a designer's knowledge. We help cut down the number of candidate applications by devising a classification method that clusters similar applications based on the special-purpose operations they share. This provides a significant reduction in the comparison overhead while resulting in customized ASIP instruction sets that can benefit a whole family of related applications. Our method gives users the ability to quantify the degree of similarity between the sets of shared operations, in order to control the size of clusters. A case study involving twelve algorithms confirms that our approach can successfully cluster similar algorithms together based on the similarity of their component operations.
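A minimal sketch of the clustering idea follows; the operation sets, the Jaccard similarity measure, and the 0.5 threshold are stand-in assumptions, since the paper's actual similarity measure is not given here.

```python
# Cluster candidate applications by the similarity of the special-purpose
# operations they share (single-link merging over a tunable threshold).
from itertools import combinations

apps = {  # hypothetical operation sets per application
    "fft": {"mac", "butterfly", "bitrev"},
    "fir": {"mac", "shift"},
    "iir": {"mac", "shift", "saturate"},
    "aes": {"sbox", "xor", "rotate"},
}

def jaccard(a, b):
    return len(a & b) / len(a | b)

threshold = 0.5  # user-tunable: controls cluster size, as the abstract notes
clusters = {name: {name} for name in apps}
for x, y in combinations(apps, 2):
    if jaccard(apps[x], apps[y]) >= threshold:
        merged = clusters[x] | clusters[y]
        for member in merged:
            clusters[member] = merged
print({frozenset(c) for c in clusters.values()})  # {fir, iir} merge; fft, aes stay
```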
Abstract:
The paper introduces the design of robust current and voltage control algorithms for a grid-connected three-phase inverter that is interfaced to the grid through a high-bandwidth three-phase LCL filter. The algorithms are based on state feedback control, designed in a systematic approach and improved by using oversampling to deal with the issues arising from the high-bandwidth filter. An adaptive loop delay compensation method has also been adopted to minimize the adverse effects of loop delay in the digital controller and to increase the robustness of the control algorithm in the presence of parameter variations. Simulation results are presented to validate the effectiveness of the proposed algorithm.
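One common form of loop delay compensation can be sketched as a one-sample-ahead state prediction, so the feedback gain acts on the state the plant will have when the computed command is actually applied. The matrices below are placeholders, not the paper's inverter/LCL model, and the adaptive part of the paper's compensation is omitted.

```python
# A hedged sketch of one-sample delay compensation in discrete state feedback.
# A, B, K are placeholder values, not a real inverter design.
import numpy as np

A = np.array([[0.9, 0.1], [0.0, 0.8]])  # discrete plant model (placeholder)
B = np.array([[0.0], [0.5]])
K = np.array([[1.2, 0.6]])              # state feedback gain (placeholder)

x = np.array([[1.0], [0.0]])            # measured state at sample k
u_prev = np.array([[0.0]])              # command computed at k-1, applied over k
x_pred = A @ x + B @ u_prev             # predict state at k+1, compensating the
                                        # one-sample computation/PWM loop delay
u = -K @ x_pred                         # command that will act at k+1
```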
Abstract:
Multi-objective optimization has been performed for the design of a benchmark cogeneration system, known as the CGAM cogeneration system. The optimization considers thermoeconomic and environmental aspects simultaneously. The environmental objective function has been defined and expressed in cost terms. One of the most suitable optimization techniques for this problem, the Multi-Objective Particle Swarm Optimization (MOPSO) algorithm, has been used here. This approach has been applied to find the set of Pareto optimal solutions with respect to the aforementioned objective functions. An example of fuzzy decision-making with the aid of the Bellman-Zadeh approach is presented, and a final optimal solution is introduced.
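The Bellman-Zadeh selection step can be sketched as a max-min choice over fuzzy memberships on the Pareto front; the two-objective values below are made-up illustration data, not CGAM results.

```python
# Bellman-Zadeh fuzzy selection from a Pareto front: map each minimized
# objective to a membership in [0, 1], then pick the solution with the
# largest minimum membership. Front values are illustrative only.
import numpy as np

pareto = np.array([   # columns: thermoeconomic cost, environmental cost
    [0.360, 0.020],
    [0.340, 0.028],
    [0.325, 0.041],
])
f_min = pareto.min(axis=0)
f_max = pareto.max(axis=0)
membership = (f_max - pareto) / (f_max - f_min)  # 1 at best value, 0 at worst
best = membership.min(axis=1).argmax()           # max-min (Bellman-Zadeh) choice
print("final compromise solution:", pareto[best])  # the middle trade-off point
```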
Abstract:
An extended theory of planned behavior (TPB) was used to understand the factors informing younger people's intentions to join a bone marrow registry, particularly control perceptions and affective reactions, given conflicting findings in previous research. Participants (N = 174) completed attitude, subjective norm, perceived behavioral control (PBC), moral norm, anticipated regret, self-identity, and intention items for registering. The extended TPB (except PBC) explained 67.2% of the variance in intention. Further testing is needed as to the volitional nature of registering. Moral norm, anticipated regret, and self-identity are likely intervention targets for increasing younger people's bone marrow registry participation.
Abstract:
Due to the critical shortage of, and continued need for, blood and organ donations (OD), research exploring similarities and differences in the motivational determinants of these behaviors is needed. In a sample of 258 university students, we used a cross-sectional design to test the utility of an extended theory of planned behavior (TPB), including moral norm, self-identity, and in-group altruism (family/close friends and ethnic group), to predict people's blood and OD intentions. Overall, the extended TPB explained 77.0% and 74.6% of the variance in blood and OD intentions, respectively. In regression analyses, common contributors to intentions across donation contexts were attitude, self-efficacy, and self-identity. Normative influences varied, with subjective norm a significant predictor of OD intentions but not blood donation intentions at the final step of the regression analyses. Moral norm did not contribute significantly to blood or OD intentions. In-group altruism (family/close friends) was significantly related to OD intentions only in the regression analyses. Future donation strategies should increase confidence to donate, foster a perception of self as the type of person who donates blood and/or organs, and address preferences to donate organs to in-group members only.
Abstract:
Type 2 diabetes remains an escalating worldwide problem, despite a range of treatments. The revelation that insulin secretion is under the control of a gut hormone, glucagon-like peptide 1 (GLP-1), led to a new paradigm in the management of type 2 diabetes: medicines that directly stimulate GLP-1 receptors, or that prolong the actions of endogenous GLP-1 at those receptors. Exenatide is an agonist at the GLP-1 receptors, and was initially developed as a subcutaneous twice-daily medication, ExBID. The clinical trials with ExBID established a role for exenatide in the treatment of type 2 diabetes. Subsequently, once-weekly exenatide (ExQW) was shown to have advantages over ExBID, and there is now more emphasis on the development of ExQW. ExQW alone reduces glycosylated haemoglobin (HbA1c) and body weight, and is well tolerated. ExQW has been compared to sitagliptin, pioglitazone and metformin, and shown to have a greater ability to reduce HbA1c than these other medicines. The only insulin preparation to which ExQW has been compared is insulin glargine, and ExQW has some favourable properties in this comparison, notably causing weight loss rather than the weight gain seen with insulin glargine. ExQW has also been compared to another GLP-1 receptor agonist, liraglutide, and is non-inferior to liraglutide in reducing HbA1c. The small amount of evidence available shows that subjects with type 2 diabetes prefer ExQW to ExBID, and that adherence to both was high in the clinical trial setting. Healthcare and economic modelling suggests that, with long-term use, ExQW will reduce diabetic complications and be cost-effective compared to other medications. Little is known about whether subjects with type 2 diabetes prefer ExQW to other medicines, or whether adherence to ExQW is good in practice; these important topics require further study.
Abstract:
Motivation: Gene silencing, also called RNA interference, requires reliable assessment of silencer impacts. A critical task is to find matches between silencer oligomers and sites in the genome, in accordance with one-to-many matching rules (G-U matching, with provision for mismatches). Fast search algorithms are required to support silencer impact assessments in procedures for designing effective silencer sequences. Results: The article presents a matching algorithm and data structures specialized for matching searches, including a kernel procedure that addresses a Boolean version of the database task called the skyline search. Besides exact matches, the algorithm is extended to allow for the location-specific mismatches applicable in plants. Computational tests show that the algorithm is significantly faster than suffix-tree alternatives.
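The one-to-many matching rule can be sketched as follows; the uniform mismatch budget is a simplification of the location-specific plant rules the article actually handles.

```python
# A hedged sketch of wobble-aware matching: G-U pairs count as matches and a
# small number of mismatches is tolerated. The uniform budget is my
# simplification; the article's mismatch rules are location-specific.
def wobble_match(silencer, site, max_mismatches=2):
    """silencer: RNA guide (5'->3'); site: target region (3'->5'),
    both strings over A/C/G/U of equal length."""
    watson_crick = {("A", "U"), ("U", "A"), ("C", "G"), ("G", "C")}
    wobble = {("G", "U"), ("U", "G")}
    mismatches = sum(
        (a, b) not in watson_crick and (a, b) not in wobble
        for a, b in zip(silencer, site)
    )
    return mismatches <= max_mismatches

print(wobble_match("GACUU", "CUGGA"))  # U-G wobble at position 4 -> True
```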
Abstract:
In the field of rolling element bearing diagnostics, envelope analysis, and in particular the squared envelope spectrum, has in recent years gained a leading role among the different digital signal processing techniques. The original constraint of constant operating speed has been relaxed thanks to the combination of this technique with computed order tracking, which is able to resample signals at constant angular increments. In this way, the field of application of the squared envelope spectrum has been extended to cases in which small speed fluctuations occur, maintaining the effectiveness and efficiency that characterize this successful technique. However, to implement an algorithm valid for all industrial applications, the constraint on speed has to be removed completely, making envelope analysis suitable also for speed and load transients. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases represents a further incentive in this direction. This paper is aimed at providing and testing a procedure for the application of envelope analysis to speed transients. The effect of load variation on the proposed technique is also qualitatively addressed.
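Under the constant-speed assumption the abstract starts from, the squared envelope spectrum reduces to a few lines; the toy signal and sampling values below are assumptions for illustration.

```python
# A minimal squared envelope spectrum: analytic signal -> squared envelope ->
# spectrum. A 3 kHz resonance amplitude-modulated at a 107 Hz fault rate
# stands in for a band-pass-filtered bearing signal.
import numpy as np
from scipy.signal import hilbert

fs = 20_000                                 # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
x = (1 + np.cos(2 * np.pi * 107 * t)) * np.cos(2 * np.pi * 3000 * t)

env_sq = np.abs(hilbert(x)) ** 2            # squared envelope
env_sq -= env_sq.mean()                     # drop the DC component
ses = np.abs(np.fft.rfft(env_sq)) / len(x)  # squared envelope spectrum
freqs = np.fft.rfftfreq(len(x), 1 / fs)
print(freqs[ses.argmax()])                  # ~107 Hz: the fault repetition rate
```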
Abstract:
In the field of diagnostics of rolling element bearings, the development of sophisticated techniques, such as Spectral Kurtosis and 2nd Order Cyclostationarity, has extended the capability of expert users to identify not only the presence, but also the location of the damage in the bearing. Most signal-analysis methods, such as those previously mentioned, result in a spectrum-like diagram that, in the case of damage, presents line frequencies or peaks in the neighbourhood of certain theoretical characteristic frequencies. These frequencies depend only on the damage position, bearing geometry and rotational speed. The major improvement in this field would be the development of algorithms with a high degree of automation. This paper aims at this important objective by discussing, for the first time, how these peaks can deviate from the theoretically expected frequencies as a function of different working conditions, i.e. speed, torque and lubrication. After providing a brief description of the peak patterns associated with each type of damage, the paper shows the typical magnitudes of the deviations from the theoretically expected frequencies. The last part of the study presents some remarks about increasing the reliability of the automatic algorithm. The research is based on experimental data obtained using artificially damaged bearings installed in a gearbox.
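The theoretical characteristic frequencies referred to above follow from standard bearing kinematics; the sketch below computes them from shaft speed and geometry (the geometry numbers are illustrative). The paper's contribution is quantifying how far measured peaks deviate from these values.

```python
# Standard kinematic formulas for bearing fault frequencies; the geometry
# values in the example call are illustrative, not from the paper's test rig.
import math

def bearing_frequencies(fr, n, d, D, phi_deg=0.0):
    """fr: shaft speed [Hz], n: number of rolling elements,
    d: element diameter, D: pitch diameter, phi_deg: contact angle."""
    r = (d / D) * math.cos(math.radians(phi_deg))
    return {
        "BPFO": fr * n / 2 * (1 - r),             # outer-race defect
        "BPFI": fr * n / 2 * (1 + r),             # inner-race defect
        "FTF":  fr / 2 * (1 - r),                 # cage
        "BSF":  fr * D / (2 * d) * (1 - r ** 2),  # rolling-element defect
    }

print(bearing_frequencies(fr=25.0, n=9, d=7.9, D=34.5))
```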
Abstract:
In the field of rolling element bearing diagnostics, envelope analysis has in recent years gained a leading role among the different digital signal processing techniques. The original constraint of constant operating speed has been relaxed thanks to the combination of this technique with computed order tracking, which is able to resample signals at constant angular increments. In this way, the field of application of this technique has been extended to cases in which small speed fluctuations occur, maintaining high effectiveness and efficiency. In order to make this algorithm suitable for all industrial applications, the constraint on speed has to be removed completely. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases represents a further incentive in this direction. This chapter presents a procedure for the application of envelope analysis to speed transients. The effect of load variation on the proposed technique is also qualitatively addressed.
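The computed order tracking step the chapter builds on can be sketched as angular resampling against a known speed profile; the synthetic run-up below is an illustration, not the chapter's data.

```python
# Resample a time-domain signal at constant shaft-angle increments so that
# order components stay sharp through a speed transient. Rates are assumed.
import numpy as np

fs = 10_000
t = np.arange(0, 2.0, 1 / fs)
speed_hz = 10 + 15 * t                        # linear run-up: 10 -> 40 rev/s
angle = 2 * np.pi * np.cumsum(speed_hz) / fs  # shaft angle vs. time (rad)
x = np.sin(8 * angle)                         # order-8 component (fault-like)

samples_per_rev = 64                          # angular sampling rate (assumed)
uniform_angle = np.arange(angle[0], angle[-1], 2 * np.pi / samples_per_rev)
x_angular = np.interp(uniform_angle, angle, x)  # signal on a uniform angle grid
# An FFT of x_angular now shows a sharp line at order 8 despite the sweep.
```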
Abstract:
A multi-resource multi-stage scheduling methodology is developed to solve short-term open-pit mine production scheduling problems as generic multi-resource multi-stage scheduling problems. The problem is modelled using essential characteristics of short-term mining production operations, such as drilling, sampling, blasting and excavating, under the capacity constraints of the mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible short-term open-pit mine production schedules and near-optimal solutions. The proposed methodology and its solution quality are verified and validated using a real mining case study.
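The disjunctive-graph machinery underneath the shifting bottleneck procedure can be sketched as follows: once the disjunctive arcs on each resource are fixed into a sequence, earliest start times and the makespan follow from a longest-path pass over the resulting DAG. The operations and durations below are made up.

```python
# Longest-path evaluation of a fixed disjunctive graph (the core subroutine
# the shifting bottleneck procedure calls repeatedly). Two blocks pass through
# drill -> sample -> blast stages; a single drill rig forces d1 before d2.
from graphlib import TopologicalSorter

duration = {"d1": 3, "s1": 1, "b1": 2, "d2": 4, "s2": 1, "b2": 2}
preds = {                     # conjunctive arcs: stage order within each block
    "s1": {"d1"}, "b1": {"s1"},
    "s2": {"d2"}, "b2": {"s2"},
    "d2": {"d1"},             # disjunctive arc fixed by sequencing the drill rig
}
start = {}
for op in TopologicalSorter(preds).static_order():
    start[op] = max((start[p] + duration[p] for p in preds.get(op, ())), default=0)
makespan = max(start[op] + duration[op] for op in duration)
print(start, makespan)        # earliest start times and schedule length
```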
Abstract:
As the all-atom molecular dynamics method is limited by its enormous computational cost, various coarse-grained strategies have been developed to extend the length scale of soft matter in the modeling of mechanical behaviors. However, the classical thermostat algorithms used in highly coarse-grained molecular dynamics underestimate the thermodynamic behaviors of soft matter (e.g. microfilaments in cells), which can weaken the ability of materials to overcome local energy traps in granular modeling. Based on all-atom molecular dynamics modeling of microfilament fragments (G-actin clusters), a new stochastic thermostat algorithm is developed that retains the representation of the thermodynamic properties of microfilaments at an extra coarse-grained level. The accuracy of this stochastic thermostat algorithm is validated against all-atom MD simulation. The new algorithm provides an efficient way to investigate the thermomechanical properties of large-scale soft matter.
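As a sketch of the family the new algorithm belongs to, the following implements a generic stochastic (Langevin-type) thermostat step, in which friction and matched random kicks hold the coarse-grained beads at temperature T. This is the textbook scheme, not the paper's specific algorithm, and the parameter values are illustrative.

```python
# Generic Langevin thermostat step: deterministic force + friction + thermal
# noise whose amplitude satisfies the fluctuation-dissipation relation.
import numpy as np

kB, T, gamma, m, dt = 1.380649e-23, 300.0, 1e12, 1e-21, 1e-15  # illustrative SI values
rng = np.random.default_rng(0)

def langevin_step(v, force):
    """One velocity update for a coarse-grained bead."""
    sigma = np.sqrt(2 * gamma * kB * T / m * dt)  # noise amplitude from FDT
    return v + (force / m - gamma * v) * dt + sigma * rng.standard_normal(v.shape)

v = np.zeros(3)
for _ in range(10_000):
    v = langevin_step(v, force=np.zeros(3))
# After equilibration, 0.5*m*<v^2> per degree of freedom fluctuates about 0.5*kB*T.
```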