995 results for Continuous Optimization


Relevance:

30.00%

Publisher:

Abstract:

The optimization of the pilot overhead in wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion shows that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depends on the square root of the normalized Doppler frequency. It is also shown that the widely used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
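The two scaling results in this abstract can be illustrated numerically. A minimal sketch, assuming a hypothetical proportionality constant `c` standing in for the system-dependent factor, and modeling the overhead as c·√f_D where f_D is the normalized Doppler frequency:

```python
import math

def pilot_overhead(doppler_norm, c=1.0):
    """Pilot overhead modeled as c * sqrt(normalized Doppler frequency).
    c is a hypothetical constant, not a value from the paper."""
    return c * math.sqrt(doppler_norm)

def pilot_overhead_mimo(doppler_norm, n_tx, c=1.0):
    """Multiantenna equivalence stated in the abstract: Nt transmit
    antennas behave like a single-antenna system whose normalized
    Doppler is multiplied by Nt."""
    return pilot_overhead(doppler_norm * n_tx, c)

# Square-root law: quadrupling the Doppler only doubles the overhead.
```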


PURPOSE: Obstructive sleep apnea syndrome (OSA) increases the risk of cardiovascular disease. We aimed at evaluating the effect of continuous positive airway pressure (CPAP) treatment on coronary endothelium-dependent vasoreactivity in OSA patients by quantifying the myocardial blood flow (MBF) response to cold pressor testing (CPT). METHODS: In the morning after polysomnography (PSG), all participants underwent a dynamic (82)Rb cardiac positron emission tomography/computed tomography (PET/CT) scan at rest, during CPT, and during adenosine stress. PSG and PET/CT were repeated at least 6 weeks after initiating CPAP treatment. OSA patients were compared to controls and according to response to CPAP. Patients' characteristics and PSG parameters were used to determine predictors of CPT-MBF. RESULTS: Thirty-two untreated OSA patients (age 58 ± 13 years, 27 men) and 9 controls (age 62 ± 5 years, 4 men) were enrolled. At baseline, compared to controls (apnea-hypopnea index (AHI) = 5.3 ± 2.6/h), untreated OSA patients (AHI = 48.6 ± 19.7/h) tended to have a lower CPT-MBF (1.1 ± 0.2 mL/min/g vs. 1.3 ± 0.4 mL/min/g, p = 0.09). After initiating CPAP, CPT-MBF was not different between well-treated patients (AHI <10/h) and controls (1.3 ± 0.3 mL/min/g vs. 1.3 ± 0.4 mL/min/g, p = 0.83), but it was lower for insufficiently treated patients (AHI ≥10/h) (0.9 ± 0.2 mL/min/g vs. 1.3 ± 0.4 mL/min/g, p = 0.0045). CPT-MBF was also higher in well-treated than in insufficiently treated patients (1.3 ± 0.3 mL/min/g vs. 0.9 ± 0.2 mL/min/g, p = 0.001). Mean nocturnal oxygen saturation (β = -0.55, p = 0.02) and BMI (β = -0.58, p = 0.02) were independent predictors of CPT-MBF in OSA patients. CONCLUSIONS: Coronary endothelial vasoreactivity is impaired in insufficiently treated OSA patients compared to well-treated patients and controls, confirming the need for CPAP optimization.


The performance of a continuous-flow hydride generator-nebulizer for flame atomic absorption spectrometry was evaluated. The nebulizer gas flow rate, sample acid concentration, sample and tetrahydroborate uptake rates, and reductant concentration were optimized with respect to the As and Se absorbance signals. A hydrogen-argon flame was used. An improvement in analytical sensitivity relative to the conventional bead nebulizer used in flame AAS was obtained (2 (As) and 4.8 (Se) µg L-1). Detection limits (3σb) of 1 (As) and 1.3 (Se) µg L-1 were obtained. The accuracy of the method was checked by analyzing an oyster tissue reference material.


A company's competence to manage its product portfolio complexity is becoming critically important in a rapidly changing business environment. The continuous evolution of customer needs, the competitive market environment, and internal product development lead to increasing complexity in product portfolios. Companies that manage this complexity in product development are more profitable in the long run. The complexity derives from product development and management processes in which the development of new product variants is not managed efficiently. Complexity is managed through modularization, a method that divides the product structure into modules. In modularization, it is essential to take into account the trade-off between perceived customer value and module or component commonality across products. Another goal is to make product configuration more flexible. The benefits are achieved by optimizing complexity in the module offering and deriving new product variants more flexibly and accurately. The developed modularization process includes steps for preparation, mapping the current situation, creating a modular strategy, and implementing that strategy. The organization and support systems must also be adapted to follow up on targets and to execute modularization in practice.


The objective of this work was to optimize the parameter setup for GTAW of aluminum using an AC rectangular wave output and continuous feeding. A series of welds was carried out on an industrial joint, with variation of the negative and positive current amplitudes, the negative and positive duration times, the travel speed, and the feeding speed. Another series was carried out to investigate the isolated effects of the negative duration time and travel speed. Bead geometry aspects were assessed, such as reinforcement, penetration, incomplete fusion, and joint wall bridging. The results showed that the currents at both polarities are remarkably more significant than the respective duration times. It was also shown that there is a direct relationship between welding speed and feeding speed, and this relationship must be followed to obtain sound beads. A very short positive duration time is enough to achieve arc stability, and when the negative duration time is longer than 5 ms its effect on geometry appears. The possibility of optimizing the parameter selection, despite the high inter-correlation among the parameters, was demonstrated through a computer program. An approach to reduce the number of variables in this process is also presented.


The Two-Connected Network with Bounded Ring (2CNBR) problem is a network design problem addressing the connection of servers to create a survivable network with limited redirections in the event of failures. Particle Swarm Optimization (PSO) is a stochastic population-based optimization technique modeled on the social behaviour of flocking birds or schooling fish. This thesis applies PSO to the 2CNBR problem. As PSO is originally designed to handle a continuous solution space, modification of the algorithm was necessary in order to adapt it for such a highly constrained discrete combinatorial optimization problem. Presented are an indirect transcription scheme for applying PSO to such discrete optimization problems and an oscillating mechanism for averting stagnation.


This thesis describes the development of a visible-light-catalyzed method for the synthesis of helicenes. The conditions for the formation of [5]helicene were established through optimization of the photocatalyst, the solvent, the oxidation system, and the reaction time. Following preliminary mechanistic studies, an oxidative mechanism is proposed. The optimized conditions were applied to the synthesis of [6]helicenes, for which the regioselectivity was improved by adding substituents on the helical backbone. The synthesis of thiohelicenes was also tested using the same conditions under visible-light irradiation. The method was ineffective for the formation of benzodithiophenes and naphthothiophenes, but it does allow the formation of phenanthro[3,4-b]thiophene in acceptable yield. Extending the π-surface of the helical backbone, pyrene was fused to [4]- and [5]helicene motifs. Three pyrene-helicene derivatives were synthesized using the optimized photocyclization conditions, and their physical properties were studied. The visible-light cyclization method was also studied in continuous flow. The experimental setup and the light source were optimized, and the best conditions were applied to the formation of [5]helicene and the three pyrene-helicene derivatives. Improved or comparable yields were observed for most products formed in continuous flow relative to batch synthesis. The reaction concentration was maintained, and the reaction time was reduced by a factor of ten, again in comparison with batch synthesis.


When triangulating a belief network we aim to obtain a junction tree of minimum state space. Searching for the optimal triangulation can be cast as a search over all permutations of the network's variables. Our approach is to embed the discrete set of permutations in a convex continuous domain D. By suitably extending the cost function over D and solving the continuous nonlinear optimization task we hope to obtain a good triangulation with respect to the aforementioned cost. In this paper we introduce an upper bound on the total junction tree weight as the cost function. The appropriateness of this choice is discussed and explored by simulations. Then we present two ways of embedding the new objective function into continuous domains and show that they perform well compared to the best known heuristic.
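The baseline heuristic such comparisons are typically made against is greedy min-fill elimination ordering: repeatedly eliminate the vertex that adds the fewest fill-in edges. A minimal sketch of that baseline (the example graph and the alphabetical tie-breaking rule are illustrative assumptions, not details from the paper):

```python
def min_fill_order(adj):
    """Greedy min-fill elimination ordering on an undirected graph
    given as {vertex: set-of-neighbours}. Returns the ordering and
    the total number of fill-in edges added."""
    adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
    order, total_fill = [], 0
    while adj:
        def fill_count(v):
            nbrs = list(adj[v])
            return sum(1 for i in range(len(nbrs))
                       for j in range(i + 1, len(nbrs))
                       if nbrs[j] not in adj[nbrs[i]])
        v = min(adj, key=lambda u: (fill_count(u), u))  # deterministic tie-break
        nbrs = list(adj[v])
        for i in range(len(nbrs)):          # connect v's neighbours pairwise
            for j in range(i + 1, len(nbrs)):
                a, b = nbrs[i], nbrs[j]
                if b not in adj[a]:
                    adj[a].add(b)
                    adj[b].add(a)
                    total_fill += 1
        for u in nbrs:                      # remove v from the graph
            adj[u].discard(v)
        del adj[v]
        order.append(v)
    return order, total_fill
```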


We present the results of GaInNAs/GaAs quantum dot structures with GaAsN barrier layers grown by solid source molecular beam epitaxy. Extension of the emission wavelength of GaInNAs quantum dots by ~170 nm was observed in samples with GaAsN barriers in place of GaAs. However, optimization of the GaAsN barrier layer thickness is necessary to avoid degradation in the luminescence intensity and structural properties of the GaInNAs dots. Lasers with GaInNAs quantum dots as the active layer were fabricated, and room-temperature continuous-wave lasing was observed for the first time. Lasing occurs via the ground state at ~1.2 μm, with a threshold current density of 2.1 kA/cm² and a maximum output power of 16 mW. These results are significantly better than previously reported values for this quantum-dot system.


Deep Brain Stimulation (DBS) has been successfully used throughout the world for the treatment of Parkinson's disease symptoms. To control abnormal spontaneous electrical activity in target brain areas, DBS utilizes a continuous stimulation signal. This continuous power draw means that the implanted battery power source needs to be replaced every 18–24 months. To prolong the life span of the battery, a technique to accurately recognize and predict the onset of Parkinson's disease tremors in human subjects, and thus implement an on-demand stimulator, is discussed here. The approach is to use a radial basis function neural network (RBFNN) based on particle swarm optimization (PSO) and principal component analysis (PCA) with Local Field Potential (LFP) data recorded via the stimulation electrodes to predict activity related to tremor onset. To test this approach, LFPs from the subthalamic nucleus (STN), obtained through deep brain electrodes implanted in a Parkinson patient, are used to train the network. To validate the network's performance, electromyographic (EMG) signals from the patient's forearm are recorded in parallel with the LFPs to accurately determine occurrences of tremor, and these are compared to the performance of the network. It has been found that detection accuracies of up to 89% are possible. Performance comparisons have also been made between a conventional RBFNN and an RBFNN based on PSO, which show a marginal decrease in performance but a notable reduction in computational overhead.
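The RBFNN at the core of this approach computes a weighted sum of Gaussian basis functions centred in the input space. A minimal sketch of the forward pass (the centres, widths, and weights below are illustrative placeholders, not the trained network from the study):

```python
import math

def rbf_network(x, centers, widths, weights, bias=0.0):
    """Forward pass of a Gaussian radial basis function network:
    y = bias + sum_i w_i * exp(-||x - c_i||^2 / (2 * s_i^2))."""
    y = bias
    for c, s, w in zip(centers, widths, weights):
        dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        y += w * math.exp(-dist2 / (2.0 * s * s))
    return y
```

In the paper's setting, PSO would be used to tune the centres and widths, and PCA to reduce the dimensionality of the LFP features fed into `x`.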


The effects of several fat replacement levels (0%, 35%, 50%, 70%, and 100%) by inulin on sponge cake microstructure and physicochemical properties were studied. Oil substitution by inulin significantly decreased (P < 0.05) batter viscosity, giving heterogeneous bubble size distributions, as observed by light microscopy. Using confocal laser scanning microscopy, the fat was observed to be located at the bubbles' interface, enabling an optimum crumb structure to develop during baking. Cryo-SEM micrographs of cake crumbs showed a continuous matrix with embedded starch granules coated with oil; as fat replacement levels increased, starch granules appeared as detached structures. Cakes with fat replacement up to 70% had high crumb air-cell values; they were softer and rated as acceptable by an untrained sensory panel (n = 51). Thus, the reformulation of a standard sponge cake recipe to obtain a new product with additional health benefits that is accepted by consumers was achieved.


Model trees are a particular case of decision trees employed to solve regression problems. They have the advantage of presenting an interpretable output, helping the end user gain confidence in the prediction and providing a basis for new insight about the data, confirming or rejecting previously formed hypotheses. Moreover, model trees present an acceptable level of predictive performance in comparison to most techniques used for solving regression problems. Since generating the optimal model tree is an NP-complete problem, traditional model tree induction algorithms use a greedy top-down divide-and-conquer strategy, which may not converge to the globally optimal solution. In this paper, we propose a novel algorithm based on the evolutionary algorithms paradigm as an alternative heuristic for generating model trees, in order to improve convergence to globally near-optimal solutions. We call our new approach evolutionary model tree induction (E-Motion). We test its predictive performance using public UCI data sets, and we compare the results to traditional greedy regression/model tree induction algorithms, as well as to other evolutionary approaches. The results show that our method presents a good trade-off between predictive performance and model comprehensibility, which may be crucial in many machine learning applications.
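What distinguishes a model tree from an ordinary regression tree is that each leaf holds a linear model rather than a constant. A minimal hand-built illustration (the split point and coefficients are invented for the example, not produced by E-Motion or any induction algorithm):

```python
def model_tree_predict(x):
    """Toy model tree with one split: each leaf contains a linear
    model, so the prediction is piecewise linear rather than
    piecewise constant as in an ordinary regression tree."""
    if x < 5.0:
        return 1.0 + 2.0 * x      # left leaf:  y = 1 + 2x
    return 20.0 - 0.5 * x         # right leaf: y = 20 - 0.5x
```

The interpretability the abstract highlights comes from being able to read off both the split conditions and the leaf equations directly.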


A Nonlinear Programming algorithm that converges to second-order stationary points is introduced in this paper. The main tool is a second-order negative-curvature method for box-constrained minimization of a certain class of functions that do not possess continuous second derivatives. This method is used to define an Augmented Lagrangian algorithm of PHR (Powell-Hestenes-Rockafellar) type. Convergence proofs under weak constraint qualifications are given. Numerical examples showing that the new method converges to second-order stationary points in situations in which first-order methods fail are exhibited.
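The outer loop of a PHR-type augmented Lagrangian can be illustrated on a smooth toy problem where the inner minimization has a closed form. This is only a first-order, equality-constrained sketch, not the paper's second-order negative-curvature method:

```python
def augmented_lagrangian_demo(rho=10.0, iters=10):
    """PHR-style augmented Lagrangian on the toy problem
        min x^2   s.t.   x - 1 = 0.
    The inner subproblem min_x  x^2 + lam*(x-1) + (rho/2)*(x-1)^2
    is solved in closed form from dL/dx = 2x + lam + rho*(x-1) = 0,
    giving x = (rho - lam) / (2 + rho)."""
    lam = 0.0
    for _ in range(iters):
        x = (rho - lam) / (2.0 + rho)   # exact inner minimization
        lam += rho * (x - 1.0)          # multiplier update
    return x, lam
```

The iterates converge to the solution x = 1 with multiplier lam = -2; in the paper's setting the inner minimization is instead performed by the box-constrained negative-curvature method.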


In this work, we introduce a necessary sequential Approximate Karush-Kuhn-Tucker (AKKT) condition for a point to be a solution of a continuous variational inequality, and we prove its relation with the Approximate Gradient Projection (AGP) condition of Garciga-Otero and Svaiter. We also prove that a slight variation of the AKKT condition is sufficient for a convex problem, either for variational inequalities or for optimization. Sequential necessary conditions are more suitable to iterative methods than the usual pointwise conditions relying on constraint qualifications. The AKKT property holds at a solution independently of the fulfillment of a constraint qualification, but when a weak one holds, we can guarantee the validity of the KKT conditions.
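For the optimization case min f(x) subject to g(x) ≤ 0, the AKKT condition can be written out as follows (this is the standard formulation from the sequential-optimality-conditions literature; for the variational-inequality case the gradient ∇f is replaced by the operator defining the inequality):

```latex
% AKKT at x^*: there exist sequences x^k \to x^* and \lambda^k \ge 0 with
\lim_{k \to \infty}
  \left\| \nabla f(x^k) + \sum_{i=1}^{m} \lambda_i^k \,\nabla g_i(x^k) \right\| = 0,
\qquad
\lim_{k \to \infty} \min\{-g_i(x^k),\; \lambda_i^k\} = 0
\quad (i = 1, \dots, m).
```

Unlike the pointwise KKT conditions, these limits can hold at a solution even when no constraint qualification is satisfied, which is what makes the condition suitable for analyzing iterative methods.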


The main idea of this research is to solve the problem of inventory management for the paper industry SPM PVT Limited. The aim was to find a methodology by which the inventory of raw material could be kept at a minimum level by means of a buffer stock. The main objective then lies in finding the minimum buffer stock level according to the daily consumption of raw material, finding the Economic Order Quantity (EOQ) and reorder point, and determining how many orders should be placed in a year to control shortages of raw material. In this project, we discuss a continuous review model (deterministic EOQ model) that includes the probabilistic demand directly in the formulation. From the formula, we obtain the reorder point and the order-up-to level. The problem was tackled mathematically, and simulation modeling was used where a mathematically tractable solution was not possible. The simulation modeling was done with the Awesim software for developing the simulation network. This network can predict the buffer stock level based on variable raw-material consumption and lead time. The data for the simulation network were taken from industrial engineering personnel and departmental studies of the factory concerned. In the end, we find the optimum order quantity, reorder point, and order days.
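The deterministic quantities the study works with follow from the classic EOQ formulas. A minimal sketch (the demand and cost figures below are illustrative, not data from SPM PVT Limited):

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic Order Quantity: Q* = sqrt(2 * D * S / H), where D is
    annual demand, S the fixed cost per order, and H the holding
    cost per unit per year."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, buffer_stock=0.0):
    """Reorder when on-hand stock falls to the expected demand over
    the lead time plus the buffer (safety) stock."""
    return daily_demand * lead_time_days + buffer_stock

# Illustrative numbers: D = 10000 units/yr, S = 50 per order,
# H = 4 per unit-year  =>  Q* = 500, i.e. D / Q* = 20 orders per year.
```

The probabilistic side of the study, variable consumption and lead time, is what the Awesim simulation network handles on top of these closed-form quantities.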