946 results for minimum order observers


Relevance:

100.00%

Publisher:

Abstract:

In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipeline implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound on the highest order of stable filters is given. The minimum-order stable linear predictor is then obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
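
As a rough illustration of the minimum-order idea (not the authors' genetic-algorithm implementation), the sketch below searches ascending predictor orders, fits each by least squares, and accepts the smallest order whose recursive part is stable (all poles inside the unit circle) and whose error is close to the best attainable; the test signal, tolerances and the ascending search itself are assumptions made for demonstration.

```python
# Illustrative sketch only: the paper uses a genetic algorithm, whereas this toy
# replaces it with a plain ascending-order search; data and thresholds are made up.
import numpy as np

def is_stable(a):
    """1/(1 + a1*z^-1 + ... + ap*z^-p) is stable iff all poles lie inside the unit circle."""
    return np.all(np.abs(np.roots(np.r_[1.0, a])) < 1.0)

def design_matrix(x, p):
    # Row n holds [x[n-1], ..., x[n-p]] for n = p .. len(x)-1.
    return np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])

def fit_predictor(x, p):
    """Least-squares order-p linear predictor: x[n] ~ -(a1*x[n-1] + ... + ap*x[n-p])."""
    X = design_matrix(x, p)
    a, *_ = np.linalg.lstsq(X, -x[p:], rcond=None)
    return a, np.mean((x[p:] + X @ a) ** 2)

def minimum_stable_order(x, p_max=20, slack=1.05):
    """Smallest order whose stable predictor comes within `slack` of the error at p_max."""
    _, best = fit_predictor(x, p_max)
    for p in range(1, p_max + 1):
        a, err = fit_predictor(x, p)
        if is_stable(a) and err <= slack * best:
            return p, a
    return None

# Synthetic AR(2) test signal with poles at radius 0.8 (so the true minimum order is 2).
rng = np.random.default_rng(0)
e = rng.standard_normal(2000)
x = np.zeros(2000)
a_true = np.array([-2 * 0.8 * np.cos(0.3 * np.pi), 0.64])
for n in range(2, len(x)):
    x[n] = -a_true[0] * x[n - 1] - a_true[1] * x[n - 2] + e[n]
print(minimum_stable_order(x))
```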

Relevance:

80.00%

Publisher:

Abstract:

Nonlinear stability theorems are presented for axisymmetric vortices under the restriction that the disturbance is independent of either the azimuthal or the axial coordinate. These stability theorems are then used, in both cases, to derive rigorous upper bounds on the saturation amplitudes of instabilities. Explicit examples of such bounds are worked out for some canonical profiles. The results establish a minimum order for the dependence of saturation amplitude on supercriticality, and are thereby suggestive as to the nature of the bifurcation at the stability threshold.

Relevance:

80.00%

Publisher:

Abstract:

Effective supplier evaluation and purchasing processes are of vital importance to business organizations, making the supplier selection problem a key issue for their success. We consider a complex supplier selection problem with multiple products, where minimum package quantities, minimum order values related to delivery costs, and discounted pricing schemes are taken into account. Our main contribution is a mixed integer linear programming (MILP) model for this supplier selection problem. The model is used to solve several examples, including three real case studies from an electronic equipment assembly company.
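
For readers unfamiliar with this model class, here is a deliberately small MILP sketch in the same spirit; PuLP is used for illustration, the data are invented, and the paper's discounted pricing schemes are omitted. It covers minimum package quantities, minimum order values and per-supplier delivery costs.

```python
# Toy MILP in the spirit of the supplier-selection model described above; the numbers
# are invented and the paper's discount schemes are left out for brevity.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, LpInteger, value

suppliers = ["S1", "S2"]
products = ["P1", "P2"]
demand = {"P1": 120, "P2": 80}
price = {("S1", "P1"): 2.0, ("S1", "P2"): 3.5, ("S2", "P1"): 2.2, ("S2", "P2"): 3.1}
mpq = {("S1", "P1"): 25, ("S1", "P2"): 10, ("S2", "P1"): 20, ("S2", "P2"): 10}
min_order_value = {"S1": 150.0, "S2": 100.0}    # minimum order value per used supplier
delivery_cost = {"S1": 30.0, "S2": 45.0}
BIG_M = 10_000                                   # safely large bound on any supplier's order

prob = LpProblem("supplier_selection", LpMinimize)
packs = {(s, p): LpVariable(f"packs_{s}_{p}", lowBound=0, cat=LpInteger)
         for s in suppliers for p in products}
use = {s: LpVariable(f"use_{s}", cat=LpBinary) for s in suppliers}
qty = {(s, p): mpq[s, p] * packs[s, p] for s in suppliers for p in products}  # package multiples

# Objective: purchase cost plus a fixed delivery cost for every supplier actually used.
prob += lpSum(price[s, p] * qty[s, p] for s in suppliers for p in products) \
      + lpSum(delivery_cost[s] * use[s] for s in suppliers)

for p in products:                               # satisfy demand for every product
    prob += lpSum(qty[s, p] for s in suppliers) >= demand[p]
for s in suppliers:                              # link ordering to supplier usage, enforce MOV
    prob += lpSum(qty[s, p] for p in products) <= BIG_M * use[s]
    prob += lpSum(price[s, p] * qty[s, p] for p in products) >= min_order_value[s] * use[s]

prob.solve()
print({(s, p): mpq[s, p] * value(packs[s, p]) for s in suppliers for p in products},
      value(prob.objective))
```

Extending this toward the quantity-discount schemes considered in the paper would typically add extra binary variables that select the applicable price bracket for each supplier and product.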

Relevance:

30.00%

Publisher:

Abstract:

Objective-To evaluate the effects of increasing doses of remifentanil hydrochloride administered via constant rate infusion (CRI) on the minimum alveolar concentration (MAC) of isoflurane in cats. Animals-6 healthy adult cats. Procedures-For each cat, 2 experiments were performed (2-week interval). On each study day, anesthesia was induced and maintained with isoflurane; a catheter was placed in a cephalic vein for the administration of lactated Ringer's solution or remifentanil CRIs, and a catheter was placed in the jugular vein for collection of blood samples for blood gas analyses. On the first study day, individual basal MAC (MAC(Basal)) was determined for each cat. On the second study day, 3 remifentanil CRIs (0.25, 0.5, and 1.0 μg/kg/min) were administered (in ascending order); for each infusion, at least 30 minutes elapsed before determination of MAC (designated as MAC(R0.25), MAC(R0.5), and MAC(R1.0), respectively). A 15-minute washout period was allowed between CRIs. A control MAC (MAC(Control)) was determined after the last remifentanil infusion. Results-Mean +/- SD MAC(Basal) and MAC(Control) values at sea level did not differ significantly (1.66 +/- 0.08% and 1.52 +/- 0.21%, respectively). The MAC values determined for each remifentanil CRI did not differ significantly. However, MAC(R0.25), MAC(R0.5), and MAC(R1.0) were significantly decreased, compared with MAC(Basal), by 23.4 +/- 7.9%, 29.8 +/- 8.3%, and 26.0 +/- 9.4%, respectively. Conclusions and Clinical Relevance-The 3 doses of remifentanil administered via CRI resulted in a similar degree of isoflurane MAC reduction in adult cats, indicating that a ceiling effect was achieved following administration of the lowest dose. (Am J Vet Res 2009;70:581-588)

Relevance:

30.00%

Publisher:

Abstract:

If the Internet could be used as a method of transmitting ultrasound images taken in the field quickly and effectively, it would bring tertiary consultation to even extremely remote centres. The aim of the study was to evaluate the maximum degree of compression of fetal ultrasound video-recordings that would not compromise signal quality. A digital fetal ultrasound videorecording of 90 s was produced, resulting in a file size of 512 MByte. The file was compressed to 2, 5 and 10 MByte. The recordings were viewed by a panel of four experienced observers who were blinded to the compression ratio used. Using a simple seven-point scoring system, the observers rated the quality of the clip on 17 items. The maximum compression ratio that was considered clinically acceptable was found to be 1:50-1:100. This produced final file sizes of 5-10 MByte, corresponding to a screen size of 320 x 240 pixels, running at 15 frames/s. This study expands the possibilities for providing tertiary perinatal services to the wider community.
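
As a quick arithmetic check of the figures quoted above (illustrative only), the compression ratios map onto the stated file sizes and an average bitrate as follows:

```python
# Compression ratio -> file size and average bitrate for the 90 s, 512 MByte recording.
original_mb, duration_s = 512, 90
for ratio in (50, 100):
    size_mb = original_mb / ratio
    mbit_per_s = size_mb * 8 / duration_s
    print(f"1:{ratio} -> {size_mb:.1f} MByte, ~{mbit_per_s:.2f} Mbit/s average")
```

This reproduces the 5-10 MByte range reported for the clinically acceptable compression.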

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose an extension of the firefly algorithm (FA) to multi-objective optimization. FA is a swarm intelligence optimization algorithm, inspired by the flashing behavior of fireflies at night, that is capable of computing global solutions to continuous optimization problems. Our proposal relies on a fitness assignment scheme that gives lower fitness values to the positions of fireflies corresponding to non-dominated points with a smaller aggregation of objective-function distances to the minimum values. Furthermore, the FA randomness is based on the spread metric in order to reduce the gaps between consecutive non-dominated solutions. The results of preliminary computational experiments show that our proposal produces a dense and well-distributed approximation of the Pareto front with a large number of points.
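
A minimal sketch of one way to realize the fitness-assignment scheme described above (our reading, not the authors' code): non-dominated positions receive lower fitness, ordered by the aggregated distance of their objective values to the per-objective minima, while dominated positions are pushed behind the front.

```python
# Sketch of the fitness assignment described in the abstract; an interpretation only.
import numpy as np

def assign_fitness(objectives):
    """objectives: (n_fireflies, n_objectives) array of values to be minimized."""
    F = np.asarray(objectives, dtype=float)
    n = len(F)
    dominated = np.array([
        any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i]) for j in range(n) if j != i)
        for i in range(n)
    ])
    distance = np.sum(F - F.min(axis=0), axis=1)   # aggregated distance to the ideal point
    fitness = distance.copy()
    fitness[dominated] += distance.max() + 1.0     # rank dominated positions behind the front
    return fitness

# Toy usage: three mutually non-dominated points and one dominated point.
print(assign_fitness([[1.0, 4.0], [2.0, 2.0], [4.0, 1.0], [4.0, 4.0]]))
```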

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Revolutionary endovascular treatments are on the verge of becoming available for the management of ascending aortic diseases. Morphometric measurements of the ascending aorta have already been performed with ECG-gated MDCT to support such therapeutic development; however, the reliability of these measurements remains unknown. The objective of this work was to compare the intraobserver and interobserver variability of CAD (computer-aided diagnosis) versus manual measurements in the ascending aorta. Methods and materials: Twenty-six consecutive patients referred for ECG-gated CT thoracic angiography (64-row CT scanner) were evaluated. Measurements of the maximum and minimum ascending aorta diameters at mid-distance between the brachiocephalic artery and the aortic valve were obtained automatically with a commercially available CAD and manually by two observers separately. Both observers repeated the measurements during a different session at least one month after the first measurements. Intraclass correlation coefficients as well as the Bland-Altman method were used for comparison between measurements. Paired t-tests were used to determine the significance of intraobserver and interobserver differences (alpha = 0.05). Results: There was a significant difference between CAD and manual measurements of the maximum diameter (p = 0.004) for the first observer, whereas the difference was significant for the minimum diameter between the second observer and the CAD (p < 0.001). Interobserver variability showed weak agreement when measurements were done manually. Intraobserver variability was lower with the CAD than with the manual measurements (limits of variability: from -0.7 to 0.9 mm for the former and from -1.2 to 1.3 mm for the latter). Conclusion: In order to improve the reproducibility of measurements whenever needed, pre- and post-therapeutic management of the ascending aorta may benefit from follow-up performed by a single observer with the help of CAD.
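
For illustration, the agreement statistics named in the abstract can be computed as sketched below; the diameter values are invented, and scipy's `ttest_rel` stands in for the paired test.

```python
# Illustrative agreement statistics of the kind reported above (paired t-test and
# Bland-Altman limits of agreement); the diameter values are made up.
import numpy as np
from scipy import stats

cad = np.array([34.1, 36.8, 31.2, 40.5, 33.3, 37.9])      # maximum diameter, mm (invented)
manual = np.array([33.6, 37.4, 31.0, 41.2, 32.8, 38.5])

t_stat, p_value = stats.ttest_rel(cad, manual)             # paired comparison, CAD vs manual
diff = cad - manual
bias = diff.mean()
sd = diff.std(ddof=1)
print(f"p = {p_value:.3f}, bias = {bias:.2f} mm, "
      f"limits of agreement = ({bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f}) mm")
```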

Relevance:

30.00%

Publisher:

Abstract:

In 2009, the Sheffield Alcohol Research Group (SARG) at Sheffield University developed the Sheffield Alcohol Policy Model version 2.0 (SAPM) to appraise the potential impact of alcohol policies, including different levels of MUP, for the population of England. In 2013, SARG were commissioned by the DHSSPS and the Department for Social Development to adapt the Sheffield Model to NI in order to appraise the potential impact of a range of alcohol pricing policies. The present report presents the results of this work. Estimates from the Northern Ireland (NI) adaptation of the Sheffield Alcohol Policy Model - version 3 - (SAPM3) suggest: 1. Minimum Unit Pricing (MUP) policies would be effective in reducing alcohol consumption, alcohol-related harms (including alcohol-related deaths, hospitalisations, crimes and workplace absences) and the costs associated with those harms. 2. A ban on below-cost selling (implemented as a ban on selling alcohol for below the cost of duty plus the VAT payable on that duty) would have a negligible impact on alcohol consumption or related harms. 3. A ban on price-based promotions in the off-trade, either alone or in tandem with an MUP policy, would be effective in reducing alcohol consumption, related harms and associated costs. 4. MUP and promotion ban policies would have only a small impact on moderate drinkers at all levels of income. Somewhat larger impacts would be experienced by increasing-risk drinkers, with the most substantial effects being experienced by high-risk drinkers. 5. MUP and promotion ban policies would have larger impacts on those in poverty, particularly high-risk drinkers, than on those not in poverty. However, those in poverty also experience larger relative gains in health and are estimated to marginally reduce their spending due to their reduced drinking under the majority of policies.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Coronary magnetic resonance angiography (MRA) is a medical imaging technique that involves collecting data from consecutive heartbeats, always at the same time in the cardiac cycle, in order to minimize heart motion artifacts. This technique relies on the assumption that the coronary arteries always follow the same trajectory from heartbeat to heartbeat. Until now, the choice of the acquisition window in the cardiac cycle was based exclusively on the position of minimal coronary motion. The goal of this study was to test the hypothesis that there are time intervals during the cardiac cycle when coronary beat-to-beat repositioning is optimal. The repositioning uncertainty values in these time intervals were then compared with the intervals of low coronary motion in order to propose an optimal acquisition window for coronary MRA. Methods: Cine breath-hold x-ray angiograms with synchronous ECG were collected from 11 patients who underwent elective routine diagnostic coronary angiography. Twenty-three bifurcations of the left coronary artery were selected as markers to evaluate repositioning uncertainty and velocity during the cardiac cycle. Each bifurcation was tracked by two observers with the help of a user-assisted algorithm implemented in Matlab (The Mathworks, Natick, MA, USA) that compared the trajectories of the markers from consecutive heartbeats and computed the coronary repositioning uncertainty in steps of 50 ms up to 650 ms after the R-wave. Repositioning uncertainty was defined as the diameter of the smallest circle encompassing the points to be compared at the same time after the R-wave. Student's t-tests with a false discovery rate (FDR, q=0.1) correction for multiple comparisons were applied to test whether coronary repositioning and velocity vary significantly during the cardiac cycle. Bland-Altman plots and linear regression were used to assess intra- and inter-observer agreement. Results: The analysis of left coronary artery beat-to-beat repositioning uncertainty shows a tendency toward better repositioning in mid systole (less than 0.84±0.58 mm) and mid diastole (less than 0.89±0.6 mm) than in the rest of the cardiac cycle (highest value at 50 ms: 1.35±0.64 mm). According to Student's t-tests with FDR correction for multiple comparisons (q=0.1), two intervals, in mid systole (150-200 ms) and mid diastole (550-600 ms), provide statistically better repositioning than early systole and early diastole. Coronary velocity analysis reveals that the left coronary artery moves more slowly in end systole (14.35±11.35 mm/s at 225 ms) and mid diastole (11.78±11.62 mm/s at 625 ms) than in the rest of the cardiac cycle (highest value at 25 ms: 55.96±22.34 mm/s). This was confirmed by Student's t-tests with FDR correction for multiple comparisons (q=0.1, FDR-corrected p-value=0.054): coronary velocity values at 225, 575 and 625 ms do not differ much from one another but are statistically lower than all the others. Bland-Altman plots and linear regression show that intra-observer agreement (y=0.97x+0.02 with R²=0.93 at 150 ms) is better than inter-observer agreement (y=0.8x+0.11 with R²=0.67 at 150 ms). Discussion: The present study has demonstrated that there are two time intervals in the cardiac cycle, one in mid systole and one in mid diastole, where left coronary artery repositioning uncertainty reaches local minima. It has also been calculated that velocity is lowest in end systole and mid diastole.
Since systole is less influenced by heart rate variability than diastole, it was finally proposed to test an acquisition window between 150 and 200 ms after the R-wave.
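
The repositioning-uncertainty definition above (diameter of the smallest circle enclosing a marker's positions at a fixed delay after the R-wave) can be illustrated with a brute-force minimal-enclosing-circle computation; the coordinates below are invented, and this is not the authors' Matlab implementation.

```python
# Smallest enclosing circle by brute force; adequate for the handful of heartbeats
# per acquisition (an assumption). Returns the diameter in the units of the points.
from itertools import combinations
import math

def _circumcircle(a, b, c):
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None                      # collinear points: no finite circumcircle
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def repositioning_uncertainty(points):
    """Diameter of the smallest circle enclosing all marker positions."""
    def contains(center, r, eps=1e-9):
        return all(math.hypot(x - center[0], y - center[1]) <= r + eps for x, y in points)
    best = None
    for a, b in combinations(points, 2):                 # circles defined by two points
        center = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)
        r = math.hypot(a[0] - b[0], a[1] - b[1]) / 2.0
        if contains(center, r) and (best is None or r < best):
            best = r
    for a, b, c in combinations(points, 3):              # circles defined by three points
        circ = _circumcircle(a, b, c)
        if circ and contains(*circ) and (best is None or circ[1] < best):
            best = circ[1]
    return 2.0 * best

# Marker positions (mm) of one bifurcation at the same delay in consecutive heartbeats.
print(repositioning_uncertainty([(0.0, 0.0), (0.6, 0.1), (0.3, 0.5), (0.2, -0.2)]))
```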

Relevance:

30.00%

Publisher:

Abstract:

One of the most effective techniques for offering QoS routing is minimum interference routing. However, it is computationally complex and is not oriented toward improving the network protection level. In order to provide better levels of protection, new minimum interference routing algorithms are necessary. Minimizing the failure recovery time is also a complex process involving different failure recovery phases. Some of these phases depend entirely on correct route selection, such as minimizing the failure notification time. The level of protection also involves other aspects, such as the amount of resources used; in this case, shared backup techniques should be considered. Therefore, minimum interference techniques should also be modified to include resource sharing for protection among their objectives. These aspects are reviewed and analyzed in this article, and a new proposal combining minimum interference with fast protection using shared segment backups is introduced. Results show that the proposed method reduces both the request rejection ratio and the percentage of bandwidth allocated to backup paths in networks with low and medium protection requirements.
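
As background, the core of minimum-interference routing is to weight links by how much their use would reduce the residual max-flow of other ingress-egress pairs, then route on those weights. The sketch below is a simplified, MIRA-style illustration of that idea (not the algorithm proposed in the article, and without its protection and shared-backup extensions); networkx, the toy topology and the one-unit demand are assumptions.

```python
# Simplified illustration: weight each link by the max-flow the other ingress-egress
# pairs would lose if one unit of capacity were consumed on it, then route the new
# demand over the lightest path.
import networkx as nx

def min_interference_path(G, demand, other_pairs):
    src, dst = demand
    weights = {}
    for u, v in G.edges():
        loss = 0.0
        for s, t in other_pairs:
            base = nx.maximum_flow_value(G, s, t, capacity="capacity")
            G[u][v]["capacity"] -= 1                 # tentatively consume one unit on (u, v)
            loss += base - nx.maximum_flow_value(G, s, t, capacity="capacity")
            G[u][v]["capacity"] += 1                 # restore the link
        weights[(u, v)] = loss
    nx.set_edge_attributes(G, {e: {"weight": w} for e, w in weights.items()})
    return nx.shortest_path(G, src, dst, weight="weight")

# Toy example: route A->D while interfering as little as possible with the B->D pair.
G = nx.DiGraph()
for u, v, c in [("A", "B", 2), ("B", "D", 2), ("A", "C", 2), ("C", "D", 2), ("B", "C", 1)]:
    G.add_edge(u, v, capacity=c)
print(min_interference_path(G, ("A", "D"), [("B", "D")]))   # avoids B's critical links
```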

Relevance:

30.00%

Publisher:

Abstract:

The behaviour of the harmonic infrared frequency of diatomic molecules subjected to moderate static uniform electric fields is analysed. The potential energy expression has been developed as a function of a static uniform electric field, which yields a formulation describing the frequency versus field strength curve. With the help of the first and second derivatives of the expressions obtained, which correspond to the first- and second-order Stark effects, it was possible to find the maxima of the frequency versus field strength curves for a series of molecules using a Newton-Raphson search. A method is proposed which requires only the calculation of a few energy derivatives at a particular value of the field strength. At the same time, the expression for the dependence of the interatomic distance on the electric field strength is derived, and the minimum of this curve is found for the same species. The derived expressions and numerical results are discussed and compared with other studies.
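
As a toy numerical illustration of that search (the quadratic field dependence and its coefficients below are assumptions, not results from the paper), Newton-Raphson applied to the first derivative locates the maximum of the frequency-versus-field curve:

```python
# Toy Newton-Raphson search for the maximum of a frequency-versus-field curve.
# The model omega(F) = w0 + a*F + b*F**2 (first- and second-order Stark terms)
# and its coefficients are illustrative assumptions only.
def newton_stationary_point(d1, d2, F0=0.0, tol=1e-10, max_iter=50):
    """Return F where the first derivative d1 vanishes, using the second derivative d2."""
    F = F0
    for _ in range(max_iter):
        step = d1(F) / d2(F)
        F -= step
        if abs(step) < tol:
            return F
    raise RuntimeError("Newton-Raphson did not converge")

w0, a, b = 2000.0, 15.0, -120.0                  # cm^-1 per field-strength unit (made up)
omega = lambda F: w0 + a * F + b * F**2
d_omega = lambda F: a + 2.0 * b * F              # first-order condition for the maximum
d2_omega = lambda F: 2.0 * b                     # constant and negative, so it is a maximum
F_max = newton_stationary_point(d_omega, d2_omega)
print(F_max, omega(F_max))                       # field giving the highest harmonic frequency
```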

Relevance:

30.00%

Publisher:

Abstract:

Computed tomography (CT) is an imaging technique in which interest has been growing steadily since it first began to be used in the early 1970s. In the clinical environment, this imaging system has emerged as a gold standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure.
If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases that require several follow-up examinations over their lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults. For this population, the risk of developing a radiation-induced cancer, whose latency period may exceed 20 years, is significantly higher than for adults. Assuming that each examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to ensure that the patient is not irradiated unnecessarily. CT technology is advancing at a rapid pace and, since 2009, new iterative image reconstruction techniques, known as statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality.
The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce as far as possible the doses delivered during CT examinations of children and young adults while maintaining diagnostic image quality, in order to propose optimized protocols.
Optimizing a CT protocol requires the evaluation of both the delivered dose and the image quality useful for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this work was to use two radically different approaches to evaluate image quality. The first, "physical" approach is based on the computation of physical metrics (SD, MTF, NPS, etc.) measured under well-defined conditions, most often on phantoms. Although this approach has limitations because it does not take the radiologists' perception into account, it characterizes certain image properties in a simple and rapid way. The second, "clinical" approach is based on the evaluation of anatomical structures (diagnostic criteria) present on patient images. Radiologists involved in the assessment step are asked to score the quality of these structures from a diagnostic point of view using a simple rating scale.
This approach is relatively complicated and time-consuming to implement. Nevertheless, it has the advantage of being very close to the radiologists' practice and can be considered a reference method.
Among the main results of this work, the statistical iterative reconstruction algorithms studied clinically (ASIR, VEO) were shown to have a strong potential to reduce CT dose (up to -90%). However, by their very mechanism, they modify the appearance of the image, introducing a change in texture that may affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise power spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This work also shows that integrating these new reconstruction techniques into clinical practice cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the image quality tools developed will also be able to guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
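
As an illustration of the "physical" metric that proved decisive here, a noise power spectrum (NPS) can be estimated from uniform-phantom ROIs as sketched below; the ROI data and pixel size are synthetic assumptions, not the thesis's actual processing chain.

```python
# Illustrative 2D noise power spectrum (NPS) estimate from homogeneous-phantom ROIs.
import numpy as np

def nps_2d(rois, pixel_mm):
    """rois: iterable of square 2D arrays (HU) taken from a uniform phantom region."""
    rois = [np.asarray(r, dtype=float) for r in rois]
    ny, nx = rois[0].shape
    spectra = []
    for roi in rois:
        noise = roi - roi.mean()                  # remove the mean (ideally also low-order trends)
        spectra.append(np.abs(np.fft.fft2(noise)) ** 2)
    return (pixel_mm ** 2 / (nx * ny)) * np.mean(spectra, axis=0)   # HU^2 mm^2

rng = np.random.default_rng(1)
rois = [50.0 + 10.0 * rng.standard_normal((64, 64)) for _ in range(8)]  # synthetic "uniform" ROIs
nps = nps_2d(rois, pixel_mm=0.5)
print(nps.shape, nps.mean())   # the mean NPS relates to the noise variance times pixel area
```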

Relevance:

30.00%

Publisher:

Abstract:

We present a method for analyzing the curvature (second derivatives) of the conical intersection hyperline at an optimized critical point. Our method uses the projected Hessians of the degenerate states after elimination of the two branching space coordinates, and is equivalent to a frequency calculation on a single Born-Oppenheimer potential-energy surface. Based on the projected Hessians, we develop an equation for the energy as a function of a set of curvilinear coordinates where the degeneracy is preserved to second order (i.e., the conical intersection hyperline). The curvature of the potential-energy surface in these coordinates is the curvature of the conical intersection hyperline itself, and thus determines whether one has a minimum or saddle point on the hyperline. The equation used to classify optimized conical intersection points depends in a simple way on the first- and second-order degeneracy splittings calculated at these points. As an example, for fulvene, we show that the two optimized conical intersection points of C2v symmetry are saddle points on the intersection hyperline. Accordingly, there are further intersection points of lower energy, and one of C2 symmetry - presented here for the first time - is found to be the global minimum in the intersection space.
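
Schematically (our notation and simplifications, not the paper's exact expressions): with the two branching-space directions $\mathbf{x}_1$ (gradient difference) and $\mathbf{x}_2$ (derivative coupling) removed by a projector, the seam energy in the remaining intersection-space coordinates can be written to second order as

$$E_{\mathrm{seam}}(\bar{\mathbf{x}}) \;\approx\; E_{\mathrm{CI}} \;+\; \bar{\mathbf{g}}^{\mathsf{T}}\bar{\mathbf{x}} \;+\; \tfrac{1}{2}\,\bar{\mathbf{x}}^{\mathsf{T}}\big(\mathbf{P}\mathbf{H}\mathbf{P}\big)\bar{\mathbf{x}}, \qquad \mathbf{P} \;=\; \mathbf{1} - \mathbf{x}_1\mathbf{x}_1^{\mathsf{T}} - \mathbf{x}_2\mathbf{x}_2^{\mathsf{T}},$$

where $\mathbf{H}$ stands for the Hessian of the degenerate states (the paper works with the projected Hessians of both). At an optimized intersection point the eigenvalues of $\mathbf{P}\mathbf{H}\mathbf{P}$ then play the role of force constants in an ordinary frequency calculation: all positive for a minimum on the hyperline, one or more negative for a saddle point.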

Relevance:

30.00%

Publisher:

Abstract:

We analyze the impact of a minimum price variation (tick) and time priority on the dynamics of quotes and the trading costs when competition for the order flow is dynamic. We find that convergence to competitive outcomes can take time and that the speed of convergence is influenced by the tick size, the priority rule and the characteristics of the order arrival process. We also show that a zero minimum price variation is never optimal when competition for the order flow is dynamic. We compare the trading outcomes with and without time priority. Time priority is shown to guarantee that uncompetitive spreads cannot be sustained over time. However, it can sometimes result in higher trading costs. Empirical implications are proposed. In particular, we relate the size of the trading costs to the frequency of new offers and the dynamics of the inside spread to the state of the book.