980 results for Problem Resolution


Relevance: 30.00%

Abstract:

The transmission-network planning problem is a non-linear mixed-integer programming problem. Most of the algorithms used to solve it rely on a linear programming (LP) subroutine to solve the LPs arising from the planning algorithm. Solving these LPs sometimes represents a major computational effort. A particularity of these LPs is that, at the optimal solution, only a few of the inequality constraints are binding. This work transforms the LP into an equivalent problem with only one equality constraint (the power-flow equation) and many inequality constraints, and uses a dual simplex algorithm together with a relaxation strategy to solve the LPs. The optimisation process starts with only the equality constraint and, at each step, the most violated inequality constraint is added. The logic is similar to a proposal for electric-system operation planning. The results show higher performance of the algorithm when compared with primal simplex methods.
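As a hedged illustration of the relaxation strategy just described (not the authors' code, and with hypothetical data; the dual simplex solver itself is not reproduced), the following Python sketch shows the core step: scanning the inequality constraints at the current candidate solution and selecting the most violated one to add next.

```python
def most_violated(x, constraints):
    """Return (index, violation) of the most violated constraint a.x <= b,
    or None if x satisfies all of them."""
    worst = None
    for i, (a, b) in enumerate(constraints):
        violation = sum(ai * xi for ai, xi in zip(a, x)) - b
        if violation > 1e-9 and (worst is None or violation > worst[1]):
            worst = (i, violation)
    return worst

# Hypothetical candidate solution and inequality constraints a.x <= b.
x = [2.0, 3.0]
constraints = [
    ([1.0, 0.0], 5.0),  # satisfied: 2 <= 5
    ([1.0, 1.0], 4.0),  # violated by 1.0
    ([0.0, 2.0], 3.0),  # violated by 3.0 -> added next
]
print(most_violated(x, constraints))  # (2, 3.0)
```

In the full method this step repeats: the relaxed LP is re-solved after each constraint is added, until no inequality remains violated.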

Relevance: 30.00%

Abstract:

Precipitation radar (PR) on board the TRMM satellite was a milestone in large-scale rainfall observation capability. Stemming from TRMM, the new Global Precipitation Measurement (GPM) mission is intended to overcome some TRMM shortcomings, such as the high level of the PR MDZ. However, for major problems like the PR horizontal resolution, significant improvements are not foreseeable. This paper investigates the impact of the TRMM PR resolution on the structure of tropical rainfall. The issue is approached by both gradient analysis and texture verification. Results indicate that the impact may be significant, affecting important applications such as numerical weather prediction (NWP). © 2005 IEEE.
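The resolution impact discussed above can be illustrated with a toy sketch (hypothetical numbers, not TRMM data): block-averaging a fine-resolution rain field, as a crude stand-in for the coarse PR footprint, smooths gradients and lowers the observed peak rate.

```python
def block_average(field, k):
    """Average non-overlapping k x k blocks of a square 2D list."""
    n = len(field)
    coarse = []
    for i in range(0, n, k):
        row = []
        for j in range(0, n, k):
            block = [field[a][b] for a in range(i, i + k) for b in range(j, j + k)]
            row.append(sum(block) / len(block))
        coarse.append(row)
    return coarse

# Hypothetical 4x4 rain field (mm/h) with one intense convective cell.
fine = [
    [0, 0, 0, 0],
    [0, 40, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
coarse = block_average(fine, 2)
print(max(max(r) for r in fine))    # 40
print(max(max(r) for r in coarse))  # 10.0 -- the peak is smeared out
```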

Relevance: 30.00%

Abstract:

Aim: To report a possible case of fluoxetine-induced tremor treated as Parkinson's disease in an elderly female patient who was noncompliant with her pharmacotherapy, had uncontrolled hypertension and was using fluoxetine to treat depression. Presentation of Case: The patient complained of sleepiness in the morning, agitation, anxiety, insomnia and mental confusion. Her greatest concern was bilateral hand tremors which, in her view, became worse after biperiden was prescribed; she therefore stopped taking it. The initial medication was: omeprazole, losartan, biperiden, fluoxetine, atenolol + chlorthalidone, acetylsalicylic acid, atorvastatin and diazepam. Pharmacotherapeutic follow-up was performed in order to check the necessity, safety and effectiveness of the treatment. Discussion: During the analysis of the pharmacotherapy, the patient showed uncontrolled blood pressure and had difficulty complying with the treatment. Thus, in view of the complaints expressed by the patient, our first hypothesis was a possible serotonin syndrome related to fluoxetine use. We proposed a change in the fluoxetine regimen and discontinuation of biperiden. As the tremors persisted, we suggested replacing fluoxetine with sertraline, since possible fluoxetine-induced tremor could explain the complaint. This approach solved the drug-related problem identified. Conclusion: The tremors reported by the patient were identified as an iatrogenic event related to fluoxetine, which was resolved by management of the serotonin-reuptake inhibitor therapy.

Relevance: 30.00%

Abstract:

It is known that massive black holes have a profound effect on the evolution of galaxies, and possibly on their formation, by regulating the amount of gas available for star formation. However, how black holes and galaxies communicate is still an open problem, depending on how much of the released energy interacts with the circumnuclear matter. In recent years, most studies of feedback have focused primarily on AGN jet/cavity systems in the most massive galaxy clusters. This thesis investigates the feedback phenomenon in radio-loud AGNs from a different perspective, studying isolated radio galaxies through high-resolution spectroscopy. In particular, one NLRG and three BLRGs are studied, searching for warm gas, both in emission and in absorption, in the soft X-ray band. I show that the soft spectrum of 3C33 originates from gas photoionized by the central engine. I report the first detection of a warm absorber (WA) in 3C382 and 3C390.3. I show that the observed warm emitters/absorbers are not uniform and are probably located in the NLR. The detected WAs are slow, implying a mass outflow rate and kinetic luminosity always well below 1% of L(acc) as well as of P(jet). Finally, the radio-loud properties are compared with those of type 1 radio-quiet AGNs. A positive correlation is found between the mass outflow rate/kinetic luminosity and the radio loudness. This seems to suggest that the presence of a radio source (the jet?) affects the distribution of the absorbing gas. Alternatively, if the gas distribution is similar in Seyferts and radio galaxies, the M(out) vs rl relation could simply indicate a larger ejection of matter in the form of a wind in powerful radio-loud AGNs.

Relevance: 30.00%

Abstract:

In this work we study a polyenergetic and multimaterial model for breast image reconstruction in Digital Tomosynthesis, taking into consideration the variety of materials forming the object and the polyenergetic nature of the X-ray beam. The modelling of the problem leads to a high-dimensional nonlinear least-squares problem that, being an ill-posed inverse problem, needs some form of regularization. We test two main classes of methods: the Levenberg-Marquardt method (together with the Conjugate Gradient method for the computation of the descent direction) and two limited-memory BFGS-like methods (L-BFGS). We perform experiments for different values of the regularization parameter (constant or varying at each iteration), tolerances and stopping conditions. Finally, we analyse the performance of the several methods by comparing relative errors, numbers of iterations, computation times and the quality of the reconstructed images.
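As a minimal sketch of the Levenberg-Marquardt idea used above (a toy one-parameter problem with synthetic data, not the paper's reconstruction code), the fragment below takes damped Gauss-Newton steps with adaptive damping to fit an exponential model.

```python
import math

def levenberg_marquardt_1d(residual, jacobian, a0, lam=1e-3, iters=50):
    """Scalar Levenberg-Marquardt: damped Gauss-Newton steps on a
    single-parameter least-squares problem."""
    a = a0
    for _ in range(iters):
        r = residual(a)
        J = jacobian(a)
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        step = -Jtr / (JtJ + lam)
        cost = sum(ri ** 2 for ri in r)
        new_cost = sum(ri ** 2 for ri in residual(a + step))
        if new_cost < cost:        # accept the step, relax the damping
            a, lam = a + step, lam / 2
        else:                      # reject the step, increase the damping
            lam *= 10
    return a

# Hypothetical data generated from y = exp(0.5 * x).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.exp(0.5 * x) for x in xs]
res = lambda a: [math.exp(a * x) - y for x, y in zip(xs, ys)]
jac = lambda a: [x * math.exp(a * x) for x in xs]
a_hat = levenberg_marquardt_1d(res, jac, a0=0.0)
print(round(a_hat, 4))
```

For this noiseless toy problem the estimate converges to the true exponent 0.5; in the paper the same mechanism operates on a much larger, regularized residual.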

Relevance: 30.00%

Abstract:

Feedback from the most massive components of a young stellar cluster deeply affects the surrounding ISM, driving an expanding, over-pressured cavity of hot gas into it. In spiral galaxies these structures may have sufficient energy to break out of the disk and eject large amounts of material into the halo. The cycling of this gas, which eventually falls back onto the disk, is known as a galactic fountain. We aim to better understand the dynamics of such fountain flows in a Galactic context, to frame the problem in a more dynamic environment, possibly learning about the fountain's connection to and regulation by the local driving mechanism, and to understand its role as a channel for metal diffusion. The interaction of the fountain with a hot corona is analyzed here, in an attempt to understand the properties and evolution of the extraplanar material. We perform high-resolution hydrodynamical simulations with the moving-mesh code AREPO to model the multi-phase ISM of a Milky Way-type galaxy. A non-equilibrium chemical network is included to self-consistently follow the evolution of the main coolants of the ISM. Spiral-arm perturbations in the potential are considered, so that large molecular gas structures are able to form dynamically, self-shielded from the interstellar radiation field. We model the effect of SN feedback from a new-born stellar cluster inside such a giant molecular cloud as the driving force of the fountain. Passive Lagrangian tracer particles are used in conjunction with the SN energy deposition to model and study the diffusion of freshly synthesized metals. We find that interactions with the hot coronal gas and local ISM properties and motions are equally important in shaping the fountain. We notice a bimodal morphology in which most of the ejected gas is in a cold, clumpy state at around 10^4 K, while the majority of the affected volume is occupied by a hot diffuse medium. Only about 20% of the produced metals stay local; most of them quickly diffuse through this hot regime to large scales.

Relevance: 30.00%

Abstract:

This paper describes informatics for cross-sample analysis with comprehensive two-dimensional gas chromatography (GCxGC) and high-resolution mass spectrometry (HRMS). GCxGC-HRMS analysis produces large data sets that are rich with information but highly complex. The size of the data and the volume of information require automated processing for comprehensive cross-sample analysis, but the complexity poses a challenge for developing robust methods. The approach developed here analyzes GCxGC-HRMS data from multiple samples to extract a feature template that comprehensively captures the pattern of peaks detected in the retention-time plane. Then, for each sample chromatogram, the template is geometrically transformed to align with the detected peak pattern and generate a set of feature measurements for cross-sample analyses such as sample classification and biomarker discovery. The approach avoids the intractable problem of comprehensive peak matching by using a few reliable peaks for alignment and peak-based retention-plane windows to define comprehensive features that can be reliably matched across samples. The informatics are demonstrated with a set of 18 samples from breast-cancer tumors, each from a different individual, six for each of Grades 1-3. The features allow classification that matches grading by a cancer pathologist with 78% success in leave-one-out cross-validation experiments. The HRMS signatures of the features of interest can then be examined to determine elemental compositions and identify compounds.
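The leave-one-out cross-validation protocol mentioned above can be sketched as follows (toy one-dimensional features and a nearest-neighbour rule stand in for the paper's actual feature measurements and classifier):

```python
def loo_accuracy(features, labels):
    """Leave-one-out cross-validation: hold out each sample once and
    predict it from the remaining ones with a 1-nearest-neighbour rule."""
    correct = 0
    for i in range(len(features)):
        # Train on everything except sample i.
        rest = [(f, l) for j, (f, l) in enumerate(zip(features, labels)) if j != i]
        # Predict the held-out sample from its nearest neighbour.
        pred = min(rest, key=lambda fl: abs(fl[0] - features[i]))[1]
        correct += pred == labels[i]
    return correct / len(features)

features = [0.1, 0.2, 0.3, 1.1, 1.2, 1.3]  # hypothetical feature values
labels = [1, 1, 1, 2, 2, 2]                # hypothetical tumor grades
print(loo_accuracy(features, labels))  # 1.0
```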

Relevance: 30.00%

Abstract:

Point Distribution Models (PDMs) are among the most popular shape description techniques and their usefulness has been demonstrated in a wide variety of medical imaging applications. However, to adequately characterize the underlying modeled population it is essential to have a representative number of training samples, which is not always possible. This problem is especially relevant as the complexity of the modeled structure increases, with the modeling of ensembles of multiple 3D organs being one of the most challenging cases. In this paper, we introduce a new GEneralized Multi-resolution PDM (GEM-PDM) in the context of multi-organ analysis, able to efficiently characterize the different inter-object relations as well as the particular locality of each object separately. Importantly, unlike previous approaches, the configuration of the algorithm is automated thanks to a new agglomerative landmark clustering method proposed here, which also allows us to identify smaller anatomically significant regions within organs. The significant advantage of the GEM-PDM method over two previous approaches (PDM and hierarchical PDM), in terms of shape modeling accuracy and robustness to noise, has been successfully verified on two different databases of sets of multiple organs: six subcortical brain structures, and seven abdominal organs. Finally, we propose the integration of the new shape modeling framework into an active-shape-model-based segmentation algorithm. The resulting algorithm, named GEMA, provides better overall performance than the two classical approaches tested, ASM and hierarchical ASM, when applied to the segmentation of 3D brain MRI.
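A minimal sketch of the PDM idea underlying the paper (toy landmark data, pure Python, not the GEM-PDM algorithm itself): shapes are concatenated landmark coordinates, and the model is the mean shape plus principal modes of variation, here with the leading mode found by power iteration.

```python
def mean_vector(shapes):
    n = len(shapes)
    return [sum(s[k] for s in shapes) / n for k in range(len(shapes[0]))]

def covariance(shapes, mu):
    """Sample covariance of the landmark vectors."""
    d, n = len(mu), len(shapes)
    cov = [[0.0] * d for _ in range(d)]
    for s in shapes:
        dev = [s[k] - mu[k] for k in range(d)]
        for i in range(d):
            for j in range(d):
                cov[i][j] += dev[i] * dev[j] / (n - 1)
    return cov

def leading_mode(cov, iters=100):
    """Power iteration for the dominant eigenvector (first shape mode)."""
    v = [1.0] * len(cov)
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy training set: 2 landmarks (x1, y1, x2, y2) that mostly stretch in x2.
shapes = [
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 1.2, 0.0],
    [0.0, 0.0, 0.8, 0.0],
    [0.0, 0.0, 1.1, 0.1],
]
mu = mean_vector(shapes)
mode = leading_mode(covariance(shapes, mu))
# The dominant mode should load mainly on the x-coordinate of landmark 2.
print(max(range(4), key=lambda k: abs(mode[k])))  # 2
```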

Relevance: 30.00%

Abstract:

To explore the causes and consequences of past climate change, very accurate age models such as those provided by the astronomical timescale (ATS) are needed. Beyond 40 million years ago, the accuracy of the ATS critically depends on the correctness of orbital models and radioisotopic dating techniques. Discrepancies in the age dating of sedimentary successions and the lack of suitable records spanning the middle Eocene have prevented the development of a continuous astronomically calibrated geological timescale for the entire Cenozoic Era. We now solve this problem by constructing an independent astrochronological stratigraphy based on Earth's stable 405 kyr eccentricity cycle between 41 and 48 million years ago (Ma), with new data from deep-sea sedimentary sequences in the South Atlantic Ocean. This new link completes the Paleogene astronomical timescale and confirms the intercalibration of radioisotopic and astronomical dating methods back through the Paleocene-Eocene Thermal Maximum (PETM, 55.930 Ma) and the Cretaceous-Paleogene boundary (66.022 Ma). Coupling of the Paleogene 405 kyr cyclostratigraphic frameworks across the middle Eocene further paves the way for extending the ATS into the Mesozoic.
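The role of the stable 405 kyr eccentricity cycle as a metronome can be illustrated with a toy periodogram (a synthetic sine record, not the actual deep-sea data): a direct DFT recovers the known period from an evenly sampled series.

```python
import math

def dominant_period(series, dt):
    """Return the period (same units as dt) with the largest DFT power,
    ignoring the zero-frequency term."""
    n = len(series)
    best_k, best_power = 1, 0.0
    for k in range(1, n // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(series))
        im = sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(series))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return n * dt / best_k

# Synthetic 'eccentricity' record: 3240 kyr sampled every 13.5 kyr,
# containing exactly eight 405 kyr cycles.
dt = 13.5
series = [math.sin(2 * math.pi * (i * dt) / 405.0) for i in range(240)]
print(dominant_period(series, dt))  # 405.0
```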

Relevance: 30.00%

Abstract:

A fully 3D iterative image reconstruction algorithm has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors, based on the ordered subsets approach. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range effects and interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling of system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail. The rest of the system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse matrix techniques for the non-zero system matrix elements are employed, allowing for fast execution of the image reconstruction process. This 3D image reconstruction scheme has been compared in terms of image quality to a fast 2D implementation of the OSEM algorithm combined with Fourier rebinning approaches. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction as compared to conventional 2D approaches based on rebinning schemes. At the same time, it demonstrates that fully 3D methodologies can be efficiently applied to the image reconstruction problem for high-resolution rotational PET cameras by applying accurate pre-calculated system models and taking advantage of the system's symmetries.
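The ordered-subsets update at the heart of such a reconstruction can be sketched in a few lines (a hypothetical 3x2 system matrix and noiseless data; with a single subset containing all rows, as here, OSEM reduces to the classic MLEM update):

```python
def osem_update(x, A, y):
    """One OSEM subiteration for subset system matrix A (rows = detector
    bins): rescale the image by the backprojected ratio of measured to
    modelled data, normalised by the sensitivity A^T 1."""
    # Forward-project the current image: y_model = A @ x.
    y_model = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    ratio = [yi / ym for yi, ym in zip(y, y_model)]
    new_x = []
    for j in range(len(x)):
        backproj = sum(A[i][j] * ratio[i] for i in range(len(A)))
        sens = sum(A[i][j] for i in range(len(A)))
        new_x.append(x[j] * backproj / sens)
    return new_x

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # hypothetical system matrix
true_x = [2.0, 3.0]
y = [2.0, 3.0, 5.0]                        # noiseless measurements A @ true_x
x = [1.0, 1.0]
for _ in range(50):
    x = osem_update(x, A, y)
print([round(v, 3) for v in x])  # [2.0, 3.0]
```

In a real scanner the rows of A are split into several subsets cycled within each iteration, which is what gives OSEM its acceleration over MLEM.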

Relevance: 30.00%

Abstract:

There is general agreement within the scientific community that Biology is the science with the greatest potential for development in the twenty-first century. This is due to several reasons, but probably the most important one is the state of development of the other experimental and technological sciences. In this context, there is a very rich variety of mathematical tools, physical techniques and computer resources that permit biological experiments that were unthinkable only a few years ago. Biology is nowadays taking advantage of all these newly developed technologies, which are being applied to the life sciences, opening new research fields and helping to give new insights into many biological problems. Consequently, biologists have greatly improved their knowledge in many key areas, such as human function and human disease. However, there is one human organ that is still barely understood compared with the rest: the human brain. Understanding the human brain is one of the main challenges of the twenty-first century, and it is considered a strategic research field by both the European Union and the USA. Thus, there is great interest in applying new experimental techniques to the study of brain function. Magnetoencephalography (MEG) is one of these novel techniques currently applied for mapping brain activity. It has important advantages compared with metabolism-based brain imaging techniques such as functional Magnetic Resonance Imaging (fMRI). The main advantage is that MEG has a higher temporal resolution than fMRI. Another benefit of MEG is that it is a patient-friendly clinical technique: the measurement is performed with a wireless setup and the patient is not exposed to any radiation. Although MEG is widely applied in clinical studies, there are still open issues regarding data analysis. The present work deals with the solution of the inverse problem in MEG, which is the most controversial and uncertain part of the analysis process.
This question is addressed using several variations of a new solving algorithm based on a heuristic method. The performance of these methods is analyzed by applying them to several test cases with known solutions and comparing those known solutions with the ones provided by our methods.
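As a hedged, toy illustration of attacking such an inverse problem with a heuristic method (the thesis' actual algorithm is not reproduced here; the lead-field matrix and measurements below are hypothetical), a greedy random search keeps only perturbations that reduce the data misfit.

```python
import random

def residual(A, x, y):
    """Squared misfit ||A x - y||^2 between modelled and measured fields."""
    return sum((sum(a * xi for a, xi in zip(row, x)) - yi) ** 2
               for row, yi in zip(A, y))

def heuristic_solve(A, y, dim, steps=3000, seed=0):
    """Greedy random search: keep a perturbation only if it lowers the
    residual; the perturbation scale is annealed as the search proceeds."""
    rng = random.Random(seed)
    x = [0.0] * dim
    best = residual(A, x, y)
    for i in range(steps):
        sigma = max(0.01, 0.3 * 0.997 ** i)
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        r = residual(A, cand, y)
        if r < best:
            x, best = cand, r
    return x, best

# Hypothetical 2-sensor, 3-source lead-field matrix and measurements:
# underdetermined, as the MEG inverse problem is.
A = [[1.0, 0.5, 0.2],
     [0.3, 1.0, 0.4]]
y = [1.0, 1.2]
x, best = heuristic_solve(A, y, dim=3)
print(best < 0.01)  # True
```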

Relevance: 30.00%

Abstract:

In this paper we propose an innovative approach to the problem of traffic sign detection using a computer vision algorithm, taking into account real-time operation constraints and establishing strategies to simplify the algorithm's complexity as much as possible and to speed up the process. First, a set of candidates is generated by a color segmentation stage, followed by a region analysis strategy in which the spatial characteristics of previously detected objects are taken into account. Finally, temporal coherence is introduced by means of a tracking scheme, performed using a Kalman filter for each potential candidate. Given the time constraints, efficiency is achieved in two ways. On the one hand, a multi-resolution strategy is adopted for segmentation, where global operations are applied only to low-resolution images, increasing the resolution to the maximum only when a potential road sign is being tracked. On the other hand, we take advantage of the expected spacing between traffic signs: tracking objects of interest allows us to generate inhibition areas, i.e. areas where no new traffic signs are expected to appear because a sign already exists in the neighborhood. The proposed solution has been tested on real sequences in both urban areas and highways, and proved to achieve higher computational efficiency, especially as a result of the multi-resolution approach.
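The per-candidate Kalman tracking can be sketched in one dimension (toy pixel measurements; a real tracker would use at least a 2D constant-velocity state, and the noise values below are hypothetical):

```python
class ScalarKalman:
    """Scalar Kalman filter with a random-walk motion model, smoothing the
    measured horizontal position of one tracked candidate across frames."""

    def __init__(self, x0, p0=1.0, q=0.01, r=0.25):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def step(self, z):
        # Predict: the motion model inflates the state uncertainty.
        self.p += self.q
        # Update: blend prediction and measurement by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k
        return self.x

kf = ScalarKalman(x0=100.0)
# Noisy pixel positions of one candidate over successive frames.
measurements = [101.0, 99.0, 102.0, 98.0, 100.5]
estimates = [kf.step(z) for z in measurements]
print(round(estimates[-1], 1))
```

The filtered estimate stays close to the underlying position (about 100 px here) while individual measurements jitter around it; the predicted position is also what defines the inhibition area for each tracked sign.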

Relevance: 30.00%

Abstract:

Complete resolution of the amide resonances in a three-dimensional solid-state NMR correlation spectrum of a uniformly 15N-labeled membrane protein in oriented phospholipid bilayers is demonstrated. The three orientationally dependent frequencies, 1H chemical shift, 1H–15N dipolar coupling, and 15N chemical shift, associated with each amide resonance are responsible for resolution among resonances and provide sufficient angular restrictions for protein structure determination. Because the protein is completely immobilized by the phospholipids on the relevant NMR time scales (10 kHz), the linewidths will not degrade in the spectra of larger proteins. Therefore, these results demonstrate that solid-state NMR experiments can overcome the correlation time problem and extend the range of proteins that can have their structures determined by NMR spectroscopy to include uniformly 15N-labeled membrane proteins in phospholipid bilayers.

Relevance: 30.00%

Abstract:

A high-resolution, second-order central difference method for incompressible flows is presented. The method is based on a recent second-order extension of the classic Lax–Friedrichs scheme introduced for hyperbolic conservation laws (Nessyahu, H. & Tadmor, E. (1990) J. Comput. Phys. 87, 408–463; Jiang, G.-S. & Tadmor, E. (1996) UCLA CAM Report 96-36, SIAM J. Sci. Comput., in press) and augmented by a new discrete Hodge projection. The projection is exact, yet the discrete Laplacian operator retains a compact stencil. The scheme is fast, easy to implement, and readily generalizable. Its performance was tested on the standard periodic double shear-layer problem; no spurious vorticity patterns appear when the flow is under-resolved. A short discussion of numerical boundary conditions is also given, along with a numerical example.
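For reference, the classic first-order Lax–Friedrichs scheme that the method extends can be sketched for linear advection u_t + a u_x = 0 on a periodic grid (a toy setting, not the paper's incompressible solver):

```python
def lax_friedrichs_step(u, a, dt, dx):
    """One Lax-Friedrichs step on a periodic grid: each new value is a
    convex combination of the two neighbours for CFL number a*dt/dx <= 1."""
    n = len(u)
    return [0.5 * (u[(i + 1) % n] + u[(i - 1) % n])
            - a * dt / (2 * dx) * (u[(i + 1) % n] - u[(i - 1) % n])
            for i in range(n)]

n, a = 100, 1.0
dx = 1.0 / n
dt = 0.5 * dx / a            # CFL number 0.5
# Square-wave initial data on [0, 1).
u = [1.0 if 0.25 <= i * dx < 0.5 else 0.0 for i in range(n)]
mass0 = sum(u) * dx
for _ in range(200):
    u = lax_friedrichs_step(u, a, dt, dx)
print(abs(sum(u) * dx - mass0) < 1e-10)  # True: the scheme is conservative
```

The first-order scheme is conservative and monotone but very diffusive; the Nessyahu–Tadmor extension cited above recovers second-order accuracy while keeping the same staggered central structure.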

Relevance: 30.00%

Abstract:

The issue: The European Union's pre-crisis growth performance was disappointing enough, but performance has been even more dismal since the onset of the crisis. Weak growth is undermining private and public deleveraging, and is fuelling continued banking fragility. Persistently high unemployment is eroding skills, discouraging labour market participation and undermining the EU's long-term growth potential. Low overall growth is making it much tougher for the hard-hit economies of southern Europe to recover competitiveness and regain control of their public finances. Stagnation would reduce the attractiveness of Europe for investment. Under these conditions, Europe's social models are bound to prove unsustainable. Policy challenge: The European Union's weak long-term growth potential and unsatisfactory recovery from the crisis represent a major policy challenge. Over and above the structural reform agenda, which is vitally important, bold policy action is needed. The priority is to get bank credit going: banking problems need to be assessed properly, and bank resolution and recapitalisation should be pursued. Second, fostering the reallocation of factors to the most productive firms and to the sectors that contribute to aggregate rebalancing is vital; addressing intra-euro-area competitiveness divergence is essential to support growth in southern Europe. Third, the speed of fiscal adjustment needs to be appropriate, and EU funds should be front-loaded to countries in deep recession, while the European Investment Bank should increase investment.