91 results for "optimize"


Relevance: 10.00%

Abstract:

Optimal fault ride-through (FRT) conditions for a doubly-fed induction generator (DFIG) during a transient grid fault are analyzed, with special emphasis on improving the active power generation profile. The transition states due to crowbar activation during transient faults are investigated to exploit the maximum power during the fault and post-fault periods. It is shown that the operating slip, fault severity, and crowbar resistance have a direct impact on the power capability of a DFIG, and that the crowbar resistance can be chosen to optimize that capability. It is further shown that an extended crowbar period can deliver an enhanced inertial response following the transient fault. Converter protection and drive-train dynamics are also analyzed when choosing the optimum crowbar resistance and delivering enhanced inertial support over an extended crowbar period.
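The trade-off the abstract describes, where too small a crowbar resistance gives poor damping and too large a one gives excessive voltage stress, can be sketched numerically. The toy `power_capability()` model below is entirely hypothetical (the abstract gives no equations); it only illustrates how a resistance sweep would locate an optimum for each operating slip.

```python
# Illustrative sketch: sweep crowbar resistance to maximize an assumed
# DFIG power-capability proxy during a fault. All model terms below are
# hypothetical; a real study would use a full DFIG transient model.
import numpy as np

def power_capability(r_crowbar, slip, fault_depth):
    """Hypothetical proxy: capability falls with fault depth and |slip|,
    and peaks at an intermediate crowbar resistance (too small -> poor
    damping of rotor currents; too large -> high rotor voltage stress)."""
    current_headroom = 1.0 / (1.0 + 5.0 * fault_depth)
    damping = r_crowbar / (0.1 + r_crowbar)
    voltage_stress = 1.0 / (1.0 + (r_crowbar / 0.5) ** 2)
    return current_headroom * damping * voltage_stress * (1.0 - abs(slip))

resistances = np.linspace(0.01, 1.0, 200)   # assumed per-unit range
for slip in (-0.2, 0.0, 0.2):
    cap = [power_capability(r, slip, fault_depth=0.8) for r in resistances]
    best = resistances[int(np.argmax(cap))]
    print(f"slip={slip:+.1f}: best crowbar resistance ~ {best:.2f} p.u.")
```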

Relevance: 10.00%

Abstract:

OBJECTIVES: Risk stratification of Barrett's esophagus (BE) patients based on clinical and endoscopic features may help to optimize surveillance practice for esophageal adenocarcinoma (EAC) development. The aim of this study was to investigate patient symptoms and endoscopic features at index endoscopy and risk of neoplastic progression in a large population-based cohort of BE patients.

METHODS: A retrospective review of hospital records relating to incident BE diagnosis was conducted in a subset of patients with specialized intestinal metaplasia from the Northern Ireland BE register. Patients were matched to the Northern Ireland Cancer Registry to identify progressors to EAC or esophageal high-grade dysplasia (HGD). Cox proportional hazards models were applied to evaluate the association between endoscopic features, symptoms, and neoplastic progression risk.

RESULTS: During 27,997 person-years of follow-up, 128 of 3,148 BE patients progressed to develop HGD/EAC. Ulceration within the Barrett's segment, but not elsewhere in the esophagus, was associated with an increased risk of progression (hazard ratio (HR) 1.72; 95% confidence interval (CI): 1.08–2.76). Long-segment BE carried a significant sevenfold increased risk of progression compared with short-segment BE; none of the latter group developed EAC during the study period. Conversely, the absence of reflux symptoms was associated with an increased risk of cancer progression (HR 1.61; 95% CI: 1.05–2.46).

CONCLUSIONS: BE patients presenting with a long-segment BE or Barrett's ulcer have an increased risk of progressing to HGD/EAC and should be considered for more intensive surveillance. The absence of reflux symptoms at BE diagnosis is not associated with a reduced risk of malignant progression, and may carry an increased risk of progression.
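The Cox proportional hazards modeling described in the Methods can be sketched with the lifelines library. The toy data and column names below are invented stand-ins for the register variables the study used (segment length, Barrett's ulcer, reflux symptoms); this is a minimal sketch, not the study's analysis.

```python
# Minimal Cox proportional hazards sketch with lifelines; the DataFrame
# columns and values are hypothetical toy data, not the NI BE register.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_followup":  [4.2, 7.1, 2.5, 9.8, 6.3, 3.0, 8.4, 5.6],
    "progressed":      [0, 1, 0, 1, 0, 0, 1, 0],   # 1 = HGD/EAC event
    "long_segment":    [0, 1, 0, 1, 1, 0, 0, 1],   # long- vs short-segment BE
    "barretts_ulcer":  [0, 1, 1, 0, 0, 0, 1, 0],
    "reflux_symptoms": [1, 0, 1, 1, 1, 1, 0, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followup", event_col="progressed")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```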

Relevance: 10.00%

Abstract:

This study presents the use of a stepped ground plane as a means to increase the gain and front-to-back ratio of an Archimedean spiral which operates in the frequency range 3–10 GHz. The backing structure is designed to optimize the antenna performance in discrete 1 GHz bands by placing each of the eight metal steps one quarter wavelength below the corresponding active regions of the spiral. Simulated and experimental results show that this type of ground plane can be designed to enhance the antenna performance over the entire 105% operating bandwidth of the spiral.
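The quarter-wavelength placement rule translates directly into step depths. The short computation below assumes eight design frequencies spaced 1 GHz apart across the 3-10 GHz band (the abstract does not list the exact frequencies).

```python
# Step depth arithmetic for the stepped ground plane: each of the eight
# steps sits a quarter of a free-space wavelength below the spiral's
# active region for its band. Design frequencies here are assumed.
C = 299_792_458.0  # free-space speed of light, m/s

for f_ghz in range(3, 11):   # eight assumed design frequencies, GHz
    quarter_wave_mm = C / (f_ghz * 1e9) / 4 * 1e3
    print(f"{f_ghz:2d} GHz: lambda/4 = {quarter_wave_mm:5.1f} mm")
```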

Relevance: 10.00%

Abstract:

Background: Immediate breast reconstruction after mastectomy has increased over the past decade following the unequivocal demonstration of its oncological safety and the availability of reliable methods of reconstruction. Broadly, it is undertaken in the treatment of breast cancer, after prophylactic mastectomy in high-risk patients, and in the management of treatment failure after breast-conserving surgery and radiotherapy. Immediate breast reconstruction can be achieved reliably with a variety of autogenous tissue techniques or prosthetic devices. Careful discussion and evaluation remain vital in choosing the correct technique for the individual patient.

Methods: This review is based primarily on an English language Medline search with secondary references obtained from key articles.

Results and conclusion: Immediate breast reconstruction is a safe and acceptable procedure after mastectomy for cancer; there is no evidence that it has untoward oncological consequences. In the appropriate patient it can be achieved effectively with either prosthetic or autogenous tissue reconstruction. Patient selection is important in order to optimize results, minimize complications and improve quality of life, while simultaneously treating the malignancy. Close cooperation and collaboration between the oncological breast and reconstructive surgical teams are needed to achieve these objectives.

Relevance: 10.00%

Abstract:

Background: One strategy to improve pain management in long-term care (LTC) is to optimize the emerging role of the nurse practitioner (NP) in LTC. The purpose of this sub-study was to learn about the NP role in implementing an onsite, interdisciplinary Pain Team in the LTC home setting.

Methods: We used a case study design that included two NPs who worked at separate LTC homes. Over a one-year implementation period, each NP completed a weekly questionnaire of the pain-related activities they engaged in, and kept a critically reflective diary about their experiences and the strategies used to implement the Pain Team. Descriptive statistics and thematic content analysis were used to analyze the case study data.

Findings: NPs tended to be most engaged in pain assessment, and collaborated more with licensed nurses and personal support workers than with pharmacists. NPs were also involved in organizational-level activities, such as participating in committee work and assisting with the development of policies and procedures about pain. They created palliative care and pain service protocols; engaged in policy development, in-servicing, quality assurance and advocacy; and encouraged best practices. NPs were challenged by time constraints for pain management and by balancing other role priorities, and felt that an increased scope of practice was needed.

Conclusions: The results of this study highlight how NPs implemented a Pain Team in LTC, which may be helpful to others interested in implementing a similar strategy to reduce residents’ pain.

Relevance: 10.00%

Abstract:

In this paper, the impact of hardware impairments on the secrecy performance of cognitive MIMO schemes is investigated. The relay, which helps the source forward its signal to the destination, can operate in either half-duplex or full-duplex mode. For performance evaluation, we present expressions for the average secrecy rate over Rayleigh fading channels. Monte Carlo simulations are presented to compare and optimize the performance of the proposed schemes.
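The Monte Carlo evaluation mentioned above can be sketched in a stripped-down form: a single-hop wiretap channel (no relay or hardware impairments, both of which the paper adds) with Rayleigh fading, where the instantaneous secrecy rate is [log2(1 + SNR_D) - log2(1 + SNR_E)]^+. The average SNR values below are assumptions for illustration.

```python
# Monte Carlo estimate of the average secrecy rate over Rayleigh fading.
# Simplified single-hop setup; average SNRs are assumed values.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
snr_d, snr_e = 10.0, 3.0   # assumed average SNRs at destination/eavesdropper

# Rayleigh fading -> exponentially distributed channel power gains.
g_d = rng.exponential(1.0, N)
g_e = rng.exponential(1.0, N)

c_s = np.maximum(np.log2(1 + snr_d * g_d) - np.log2(1 + snr_e * g_e), 0.0)
print(f"average secrecy rate ~ {c_s.mean():.3f} bits/s/Hz")
```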

Relevance: 10.00%

Abstract:

Mathematical modelling has become an essential tool in the design of modern catalytic systems. Emissions legislation is becoming increasingly stringent, and so mathematical models of aftertreatment systems must become more accurate in order to provide confidence that a catalyst will convert pollutants over the required range of conditions. 
Automotive catalytic converter models contain several sub-models that represent processes such as mass and heat transfer, and the rates at which the reactions proceed on the surface of the precious metal. Of these sub-models, the prediction of the surface reaction rates is by far the most challenging, due to the complexity of the reaction system and the large number of gas species involved. The reaction rate sub-model uses global reaction kinetics to describe the surface reaction rate of the gas species, and is based on the Langmuir-Hinshelwood equation as further developed by Voltz et al. [1]. The reactions can be modelled using the pre-exponential factors and activation energies of the Arrhenius equations, together with the inhibition terms.
The reaction kinetic parameters of aftertreatment models are found from experimental data: a measured light-off curve is compared against a predicted curve produced by the mathematical model, and the kinetic parameters are usually tuned manually to minimize the error between the measured and predicted data. This process is typically long, laborious and prone to misinterpretation, owing to the large number of parameters and the risk of multiple parameter sets giving acceptable fits. Moreover, the number of coefficients grows quickly with the number of reactions, so the task of manually tuning them is becoming increasingly challenging.
In the presented work, the authors have developed and implemented a multi-objective genetic algorithm to automatically optimize reaction parameters in AxiSuite® [2], a commercial aftertreatment model. The genetic algorithm was developed and expanded from the code presented by Michalewicz et al. [3] and was linked to AxiSuite using the Simulink add-on for Matlab.
The default kinetic values stored within the AxiSuite model were used to generate a series of light-off curves under rich conditions for a number of gas species, including CO, NO, C3H8 and C3H6. 
These light-off curves were used to construct an objective function providing a measure of fit for the kinetic parameters. The multi-objective genetic algorithm then searched within specified limits to minimize this objective function. In total, the pre-exponential factors and activation energies of ten reactions were optimized simultaneously. 
The results reported here demonstrate that, given accurate experimental data, the optimization algorithm is successful and robust in defining the correct kinetic parameters of a global kinetic model describing aftertreatment processes.
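The tuning loop described above can be illustrated with a deliberately reduced sketch: a real-coded genetic algorithm recovering the pre-exponential factor and activation energy of a single first-order reaction from a synthetic light-off curve, mirroring the paper's use of model-generated targets. The reactor model, residence time, search limits and GA settings below are all assumptions, and this is single-objective rather than the paper's multi-objective, ten-reaction AxiSuite setup.

```python
# Sketch: GA fitting of Arrhenius parameters (A, Ea) to a light-off curve.
import numpy as np

rng = np.random.default_rng(1)
R = 8.314                              # gas constant, J/(mol K)
T = np.linspace(400.0, 800.0, 60)      # temperature sweep, K
TAU = 0.05                             # assumed residence time, s

def light_off(A, Ea):
    """Fractional conversion vs temperature, first-order kinetics."""
    return 1.0 - np.exp(-A * np.exp(-Ea / (R * T)) * TAU)

target = light_off(A=1e8, Ea=90e3)     # stands in for the measured curve

def fitness(p):
    return -np.sum((light_off(*p) - target) ** 2)   # higher is better

LO = np.array([1e6, 50e3])             # assumed search limits: A, Ea
HI = np.array([1e10, 150e3])

pop = rng.uniform(LO, HI, size=(60, 2))
best = max(pop, key=fitness).copy()
for _ in range(200):
    scores = np.array([fitness(p) for p in pop])
    gen_best = pop[int(np.argmax(scores))]
    if fitness(gen_best) > fitness(best):
        best = gen_best.copy()
    # Tournament selection of parents
    i, j = rng.integers(0, len(pop), size=(2, len(pop)))
    parents = np.where((scores[i] > scores[j])[:, None], pop[i], pop[j])
    # Blend crossover between shuffled parent pairs, then Gaussian mutation
    mates = parents[rng.permutation(len(parents))]
    w = rng.uniform(0.0, 1.0, (len(parents), 1))
    children = w * parents + (1.0 - w) * mates
    children += rng.normal(0.0, 0.02, children.shape) * (HI - LO)
    pop = np.clip(children, LO, HI)
    pop[0] = best                      # elitism
print(f"recovered A ~ {best[0]:.2e}, Ea ~ {best[1] / 1e3:.1f} kJ/mol")
```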

Relevance: 10.00%

Abstract:

We propose a methodology for optimizing the execution of data parallel (sub-)tasks on CPU and GPU cores of the same heterogeneous architecture. The methodology is based on two main components: i) an analytical performance model for scheduling tasks among CPU and GPU cores, such that the global execution time of the overall data parallel pattern is optimized; and ii) an autonomic module which uses the analytical performance model to implement the data parallel computations in a completely autonomic way, requiring no programmer intervention to optimize the computation across CPU and GPU cores. The analytical performance model uses a small set of simple parameters to devise a partitioning, between CPU and GPU cores, of the tasks derived from structured data parallel patterns/algorithmic skeletons. The model takes into account both hardware-related and application-dependent parameters, and computes the percentage of tasks to be executed on CPU and GPU cores such that both kinds of cores are exploited and performance figures are optimized. The autonomic module, implemented in FastFlow, executes a generic map (reduce) data parallel pattern, scheduling part of the tasks to the GPU and part to the CPU cores so as to achieve optimal execution time. Experimental results on state-of-the-art CPU/GPU architectures assess both the performance model properties and the autonomic module effectiveness. © 2013 IEEE.
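One simple way to realize the kind of split such a model computes is to send the GPU a fraction of the tasks such that CPU and GPU cores finish at the same time. The rate model, function name and parameter values below are assumptions for illustration, not the paper's model.

```python
# Sketch of an analytical CPU/GPU task split: balance completion times.
def gpu_fraction(t_cpu, n_cpu, t_gpu, t_offload):
    """t_cpu: per-task time on one CPU core; n_cpu: number of CPU cores;
    t_gpu: per-task GPU time; t_offload: per-task transfer overhead."""
    cpu_rate = n_cpu / t_cpu                 # tasks/s across CPU cores
    gpu_rate = 1.0 / (t_gpu + t_offload)     # tasks/s on the GPU
    return gpu_rate / (cpu_rate + gpu_rate)  # equalizes completion times

f = gpu_fraction(t_cpu=2.0e-3, n_cpu=8, t_gpu=0.2e-3, t_offload=0.05e-3)
print(f"send {100 * f:.1f}% of tasks to the GPU")
```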

Relevance: 10.00%

Abstract:

Recent advances in hardware development, coupled with the rapid adoption and broad applicability of cloud computing, have introduced widespread heterogeneity in data centers, significantly complicating the management of cloud applications and data center resources. This paper presents the CACTOS approach to cloud infrastructure automation and optimization, which addresses heterogeneity by combining in-depth analysis of application behavior with insights from commercial cloud providers. The aim of the approach is threefold: to model applications and data center resources, to simulate applications and resources for planning and operation, and to optimize application deployment and resource use in an autonomic manner. The approach is based on case studies from the areas of business analytics, enterprise applications, and scientific computing.

Relevance: 10.00%

Abstract:

A technique for optimizing the efficiency of the sub-map method for large-scale simultaneous localization and mapping (SLAM) is proposed. It exploits the benefits of the sub-map technique to improve the accuracy and consistency of extended Kalman filter (EKF)-based SLAM. Error models were developed and employed to investigate some of the outstanding issues in using the sub-map technique in SLAM. These issues include the optimal size (distance) of a sub-map; the acceptable error effect caused by the process noise covariance on the predictions and estimations made within a sub-map; when to terminate an existing sub-map and start a new one; and the magnitude of process noise covariance that could produce such an effect. Numerical results obtained from the study, together with an error-correcting process, were used to optimize the accuracy and convergence of the previously proposed Invariant Information Local Sub-map Filter. Applying this technique to the EKF-based SLAM algorithm (a) reduces the computational burden of maintaining the global map estimates and (b) simplifies the transformation complexities and data association ambiguities usually experienced in fusing sub-maps together. A Monte Carlo analysis of the system is presented to demonstrate the consistency and efficacy of the proposed technique.
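One of the questions listed above, when to terminate a sub-map and start a new one, can be sketched as a simple policy: close the current sub-map once the local covariance growth (or the distance travelled) exceeds a tolerance. The thresholds and covariance values below are invented for illustration and are not the paper's criteria.

```python
# Sketch of a sub-map termination policy for EKF-based SLAM.
import numpy as np

def should_start_new_submap(P_local, distance_travelled,
                            max_trace=0.5, max_distance=50.0):
    """P_local: local EKF covariance of the vehicle pose within the
    current sub-map. Terminate on covariance growth or sub-map size;
    both thresholds are assumed values."""
    return np.trace(P_local) > max_trace or distance_travelled > max_distance

P = np.diag([0.12, 0.12, 0.01])   # x, y, heading variances (toy values)
print(should_start_new_submap(P, distance_travelled=42.0))
```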

Relevance: 10.00%

Abstract:

Background: Failure to recruit sufficient numbers of participants to randomized controlled trials is a common and serious problem. This problem may be additionally acute in music therapy research.

Objective: To use the experience of conducting a large randomized controlled trial of music therapy for young people with emotional and behavioral difficulties to illustrate the strategies that can be used to optimize recruitment; to report on the success or otherwise of those strategies; and to draw general conclusions about the most effective approaches.

Methods: Review of the methodological literature, and a narrative account and realist analysis of the recruitment process.

Results: The strategies adopted led to the achievement of the recruitment target of 250 subjects, but only with an extension to the recruitment period. In the pre-protocol stage of the research, these strategies included the engagement of non-music therapy clinical investigators, and extensive consultation with clinical stakeholders. In the protocol development and initial recruitment stages, they involved a search of systematic reviews of factors leading to under-recruitment and of interventions to promote recruitment, and the incorporation of their insights into the research protocol and practices. In the latter stages of recruitment, various stakeholders including clinicians, senior managers and participant representatives were consulted in an attempt to uncover the reasons for the low recruitment levels that the research was experiencing.

Conclusions: The primary mechanisms to promote recruitment are education, facilitation, audit and feedback, and time allowed. The primary contextual factors affecting the effectiveness of these mechanisms are professional culture and organizational support.

Relevance: 10.00%

Abstract:

The motivation for this study was to reduce the physics workload relating to patient-specific quality assurance (QA). VMAT plan delivery accuracy was determined from analysis of pre-treatment and on-treatment trajectory log files and phantom-based ionization chamber array measurements, and the correlation within this combination of measurements for patient-specific QA was investigated. The relationship between delivery errors and plan complexity was investigated as a potential method to further reduce the patient-specific QA workload. Thirty VMAT plans from three treatment sites (prostate only, prostate and pelvic node (PPN), and head and neck (H&N)) were retrospectively analyzed in this work. The 2D fluence delivery reconstructed from pre-treatment and on-treatment trajectory log files was compared with the planned fluence using gamma analysis. Pre-treatment dose delivery verification was also carried out using gamma analysis of ionization chamber array measurements compared with calculated doses. Pearson correlations were used to explore any relationship between trajectory log file (pre-treatment and on-treatment) and ionization chamber array (pre-treatment) gamma results. Plan complexity was assessed using the MU/arc and the modulation complexity score (MCS), with Pearson correlations used to examine any relationships between complexity metrics and plan delivery accuracy. Trajectory log files were also used to further explore the accuracy of MLC and gantry positions. Pre-treatment 1%/1 mm gamma passing rates for trajectory log file analysis were 99.1% (98.7%-99.2%), 99.3% (99.1%-99.5%), and 98.4% (97.3%-98.8%) (median (IQR)) for prostate, PPN, and H&N, respectively, and were significantly correlated with on-treatment trajectory log file gamma results (R = 0.989, p < 0.001). Pre-treatment ionization chamber array (2%/2 mm) gamma results were also significantly correlated with on-treatment trajectory log file gamma results (R = 0.623, p < 0.001). Furthermore, all gamma results displayed a significant correlation with MCS (R > 0.57, p < 0.001), but not with MU/arc. Average MLC position and gantry angle errors were 0.001 ± 0.002 mm and 0.025° ± 0.008° over all treatment sites and were not found to affect delivery accuracy; however, variability in MLC speed was found to be directly related to MLC position accuracy. The accuracy of VMAT plan delivery assessed using pre-treatment trajectory log file fluence delivery and ionization chamber array measurements was strongly correlated with on-treatment trajectory log file fluence delivery. The strong correlation between trajectory log file and phantom-based gamma results demonstrates the potential to reduce our current patient-specific QA workload. Additionally, the insight into MLC and gantry position accuracy afforded by trajectory log file analysis, and the strong correlation between gamma analysis results and the MCS, could provide further methodologies to optimize both the VMAT planning and QA processes.
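The gamma analysis underpinning these comparisons can be sketched in one dimension for brevity (the study compared 2D fluence and dose maps). The profiles below are toy data; the criteria shown are the 2%/2 mm pair with global dose normalization.

```python
# 1D gamma analysis sketch: fraction of reference points with gamma <= 1.
import numpy as np

def gamma_pass_rate(dose_eval, dose_ref, x, dose_crit=0.02, dist_mm=2.0):
    """Global-normalization gamma passing rate for 1D dose profiles."""
    dd = dose_crit * dose_ref.max()       # absolute dose criterion
    passed = 0
    for i, d_ref in enumerate(dose_ref):
        # gamma^2 over all evaluated points; gamma <= 1 iff gamma^2 <= 1
        g2 = ((x - x[i]) / dist_mm) ** 2 + ((dose_eval - d_ref) / dd) ** 2
        passed += g2.min() <= 1.0
    return passed / len(dose_ref)

x = np.linspace(0, 100, 201)                    # positions, mm
ref = np.exp(-((x - 50) / 20) ** 2)             # toy reference profile
meas = np.exp(-((x - 50.4) / 20) ** 2) * 1.01   # slightly shifted/scaled
print(f"2%/2 mm gamma passing rate: {100 * gamma_pass_rate(meas, ref, x):.1f}%")
```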

Relevance: 10.00%

Abstract:

A procedure was developed to extract polyols and trehalose (protectants against stress) from fungal conidia. Conidia were sonicated (120 s) and immersed in a boiling water bath (5.5 min) to optimize the extraction of polyols and trehalose, respectively. A rapid method was developed to separate and detect low-molecular-weight polyols and trehalose using high-performance liquid chromatography (HPLC). An ion exchange column designed for standard carbohydrate analysis was used in preference to one designed for sugar alcohol separation; this resulted in rapid elution (less than 5 min) without sacrificing peak resolution. The use of a pulsed electrochemical detector (gold electrode) gave limits of reliable quantification as low as 1.6 μg ml⁻¹ for polyols and 2.8 μg ml⁻¹ for trehalose. This is a very sensitive and rapid method by which these protectants can be analysed. It avoids the polyol derivatization that characterizes analysis by gas chromatography and the long run times (up to 45 min) that typify HPLC analysis using sugar alcohol columns.

Relevance: 10.00%

Abstract:

This paper describes middleware-level support for agent mobility, targeted at hierarchically structured wireless sensor and actuator network applications. Agent mobility enables dynamic deployment and adaptation of the application on top of the wireless network at runtime, while allowing the middleware to optimize the placement of agents, e.g., to reduce wireless network traffic, transparently to the application programmer. The paper presents the design of the mechanisms and protocols employed to instantiate agents on nodes and to move agents between nodes. It also gives an evaluation of a middleware prototype running on Imote2 nodes that communicate over ZigBee. The results show that our implementation is reasonably efficient and fast enough to support the envisioned functionality on top of a commodity multi-hop wireless technology. Our work is to a large extent platform-neutral, and can thus inform the design of other systems that adopt a hierarchical structuring of mobile components. © 2012 ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering.
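A move-agent handshake of the kind such middleware needs might look like the following in-process sketch: serialize the paused agent's state, ship it to the target node, and release the original only after an acknowledgement. The class and message names are assumptions; the paper's actual protocol runs over ZigBee between Imote2 nodes rather than between Python objects.

```python
# Sketch of an agent instantiation/migration handshake between nodes.
import pickle

class Agent:
    def __init__(self, name, state):
        self.name, self.state = name, state

class Node:
    def __init__(self, node_id):
        self.node_id, self.agents = node_id, {}
    def instantiate(self, blob):
        agent = pickle.loads(blob)        # rebuild the agent from its state
        self.agents[agent.name] = agent
        return True                       # ACK back to the sender
    def move_agent(self, name, target):
        agent = self.agents[name]         # agent implicitly paused here
        blob = pickle.dumps(agent)        # serialize full agent state
        if target.instantiate(blob):      # transfer and wait for the ACK
            del self.agents[name]         # release only after the ACK

a, b = Node("n1"), Node("n2")
a.instantiate(pickle.dumps(Agent("sampler", {"period_ms": 500})))
a.move_agent("sampler", b)
print(b.agents["sampler"].state)
```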

Relevance: 10.00%

Abstract:

Continuing research efforts on hard turning (HT), addressing both machine tools and cutting tools, have made the previously reported daunting limits readily attainable in the modern scenario. This presents an opportunity for a systematic investigation to establish the currently attainable limits of hard turning on a CNC turret lathe. Accordingly, this study contributes to the existing literature by providing the latest experimental results on the hard turning of AISI 4340 steel (69 HRC) using a CBN cutting tool. An orthogonal array was developed using a set of judiciously chosen cutting parameters, and longitudinal turning trials were carried out in accordance with a well-designed full factorial-based Taguchi matrix. The speculation indeed proved correct, as a mirror-finished, optical-quality machined surface (an average surface roughness of 45 nm) was achieved by this conventional cutting method. Furthermore, signal-to-noise (S/N) ratio analysis, analysis of variance (ANOVA) and multiple regression analysis were carried out on the experimental datasets to establish the dominance of each machining variable in dictating the machined surface roughness and to optimize the machining parameters. A key finding was that when the feed rate during hard turning becomes very low (about 0.02 mm/rev), it alone can be the most significant parameter (99.16% contribution) influencing the machined surface roughness (Ra). It was also shown, however, that a low feed rate results in high tool wear, so the selection of machining parameters for hard turning must be governed by a trade-off between cost and quality considerations.
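For a minimize-Ra objective, Taguchi analysis uses the smaller-the-better signal-to-noise ratio, S/N = -10 log10(mean(Ra^2)), with a larger S/N indicating a better (lower, more consistent) roughness. A short sketch follows; the replicate Ra values are invented for illustration, not the paper's data.

```python
# Taguchi smaller-the-better S/N ratio for surface roughness replicates.
import numpy as np

def sn_smaller_is_better(y):
    """S/N = -10 * log10(mean(y^2)); larger S/N means better (lower) Ra."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

for label, ra_um in [("feed 0.02 mm/rev", [0.045, 0.050, 0.048]),
                     ("feed 0.10 mm/rev", [0.210, 0.230, 0.225])]:
    print(f"{label}: S/N = {sn_smaller_is_better(ra_um):.1f} dB")
```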