773 results for minimization


Relevance:

10.00%

Publisher:

Abstract:

We propose a compressive sensing algorithm that exploits geometric properties of images to recover high-quality images from few measurements. The image reconstruction iterates the following two steps: 1) estimation of the normal vectors of the image level curves, and 2) reconstruction of an image fitting the normal vectors, the compressed sensing measurements, and the sparsity constraint. The proposed technique extends naturally to nonlocal operators and graphs, exploiting the repetitive nature of textured images to recover fine detail structures. In both cases, the problem reduces to a series of convex minimization problems that can be efficiently solved with a combination of variable splitting and augmented Lagrangian methods, leading to fast and easy-to-code algorithms. Extensive experiments show a clear improvement over related state-of-the-art algorithms in the quality of the reconstructed images and in the robustness of the proposed method to noise, different kinds of images, and reduced measurements.
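The variable-splitting/augmented-Lagrangian machinery mentioned above can be sketched in one dimension. This is not the paper's algorithm (the level-curve normal fitting is omitted, and the signal, measurement count, and parameters are invented): it is a minimal ADMM loop for total-variation-regularized compressed sensing recovery, min_x ||Dx||_1 + (mu/2)||Ax - b||^2, with D the forward-difference operator.

```python
import numpy as np

def admm_tv_1d(A, b, mu=10.0, rho=1.0, iters=200):
    """Minimal ADMM (variable splitting + augmented Lagrangian) sketch for
    min_x ||Dx||_1 + (mu/2)||Ax - b||^2, D = 1-D forward differences.
    Illustrative only; parameters are hand-picked, not from the paper."""
    n = A.shape[1]
    D = (np.eye(n, k=1) - np.eye(n))[:-1]          # forward-difference operator
    x = np.zeros(n); z = np.zeros(n - 1); u = np.zeros(n - 1)
    M = mu * A.T @ A + rho * D.T @ D               # fixed normal-equations matrix
    for _ in range(iters):
        # x-update: quadratic subproblem
        x = np.linalg.solve(M, mu * A.T @ b + rho * D.T @ (z - u))
        # z-update: soft-thresholding (the l1 proximal step)
        v = D @ x + u
        z = np.sign(v) * np.maximum(np.abs(v) - 1.0 / rho, 0.0)
        # dual update
        u += D @ x - z
    return x

rng = np.random.default_rng(0)
x_true = np.repeat([0.0, 1.0, -0.5], 20)           # piecewise-constant toy signal
A = rng.standard_normal((30, x_true.size))         # 30 random measurements of 60 samples
b = A @ x_true
x_rec = admm_tv_1d(A, b)
print(np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))
```

Each iteration only solves a small linear system and applies a closed-form shrinkage, which is what makes this family of algorithms fast and easy to code.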

A parametric procedure for the blind inversion of nonlinear channels is proposed, based on a recent method of blind source separation in nonlinear mixtures. Experiments show that the proposed algorithms perform efficiently, even in the presence of hard distortion. The method, based on the minimization of the output mutual information, requires knowledge of the log-derivative of the input distribution (the so-called score function). Each algorithm consists of three adaptive blocks: one devoted to adaptive estimation of the score function, and two others estimating the inverses of the linear and nonlinear parts of the channel, (quasi-)optimally adapted using the estimated score functions. This paper is mainly concerned with the nonlinear part, for which we propose two parametric models, the first based on a polynomial model and the second on a neural network, whereas [14, 15] proposed non-parametric approaches.
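The score function psi(x) = d/dx log p(x) that the estimation block must track can be illustrated with a simple non-parametric kernel sketch (the paper's own estimator blocks are adaptive, and the models it proposes are parametric; the bandwidth and data here are invented):

```python
import numpy as np

def kde_score(samples, x, h=0.3):
    """Kernel estimate of the score function psi(x) = d/dx log p(x),
    computed as p'(x)/p(x) from a Gaussian kernel density estimate.
    Toy stand-in for an adaptive score-estimation block; h is hand-picked."""
    d = (x[:, None] - samples[None, :]) / h
    k = np.exp(-0.5 * d**2)          # kernel values (normalization cancels in p'/p)
    p = k.sum(axis=1)                # unnormalized density estimate
    dp = (-d / h * k).sum(axis=1)    # its derivative in x
    return dp / p

rng = np.random.default_rng(1)
s = rng.standard_normal(5000)        # N(0,1) samples: true score is -x
x = np.linspace(-2, 2, 9)
psi = kde_score(s, x)
print(psi)                           # kernel smoothing gives roughly -x/(1 + h^2)
```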

Although sources in general nonlinear mixtures are not separable using only statistical independence, a special and realistic case of nonlinear mixtures, the post-nonlinear (PNL) mixture, is separable by choosing a suitable separating system. A natural approach is then based on estimating the separating system parameters by minimizing an independence criterion, like the estimated source mutual information. This class of methods requires higher (than 2) order statistics and cannot separate Gaussian sources. However, the use of (weak) priors, like source temporal correlation or nonstationarity, leads to other source separation algorithms, which are able to separate Gaussian sources, and a few of which can even work with second-order statistics. Recently, modeling time-correlated sources by Markov models, we proposed very efficient algorithms based on minimization of the conditional mutual information. Currently, using the prior of temporally correlated sources, we investigate the feasibility of inverting PNL mixtures with non-bijective nonlinearities, like quadratic functions. In this paper, we review the main ICA and BSS results for nonlinear mixtures, present PNL models and algorithms, and finish with advanced results using temporally correlated sources.

We study the spin dynamics of quasi-one-dimensional F=1 condensates, both at zero and at finite temperature, for arbitrary initial spin configurations. The rich dynamical evolution exhibited by these nonlinear systems is explained by surprisingly simple principles: minimization of energy at zero temperature and maximization of entropy at high temperature. Our analytical results for the homogeneous case are corroborated by numerical simulations of confined condensates for a wide variety of initial conditions. These predictions compare qualitatively well with recent experimental observations and can therefore serve as guidance for ongoing experiments.

The protein shells, or capsids, of nearly all spherelike viruses adopt icosahedral symmetry. In the present Letter, we propose a statistical thermodynamic model for viral self-assembly. We find that icosahedral symmetry is not expected for viral capsids constructed from structurally identical protein subunits and that this symmetry requires (at least) two internal switching configurations of the protein. Our results indicate that icosahedral symmetry is not a generic consequence of free energy minimization but requires optimization of internal structural parameters of the capsid proteins.

Markets, in the real world, are not efficient zero-sum games in which the hypotheses of the CAPM are fulfilled. It follows that the market portfolio is not located on Markowitz's efficient frontier, and passive investments (and indexing) are not optimal but biased. In this paper, we define and analyze the biases suffered by passive investors: the sample, construction, efficiency and active biases, and tracking error are presented. We propose Minimum Risk Indices (MRI) as an alternative that deals with market index biases and provides investors with portfolios closer to the efficient frontier, that is, more optimal investment possibilities. MRI (using a parametric Value-at-Risk minimization approach) are calculated for three stock markets, achieving interesting results. Our indices are less risky and more profitable than current market indices in the Argentinean and Spanish markets, thereby challenging the Efficient Market Hypothesis. Two innovations must be outlined: an error dimension has been included in the backtesting, and Sharpe's ratio has been used to select the "best" MRI.
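The parametric Value-at-Risk idea can be sketched numerically. Under a normality assumption (which is what makes the VaR "parametric"), the fully invested portfolio minimizing the risk term is the minimum-variance portfolio w proportional to Sigma^-1 1. The returns, quantile, and three-asset universe below are invented; the paper's MRI construction and backtesting are far more involved.

```python
import numpy as np

# Toy parametric (Gaussian) VaR-minimizing index weighting. All numbers invented.
rng = np.random.default_rng(2)
R = rng.multivariate_normal(
    mean=[0.0005, 0.0003, 0.0004],
    cov=[[4e-4, 1e-4, 5e-5],
         [1e-4, 9e-4, 2e-4],
         [5e-5, 2e-4, 2.5e-4]],
    size=1000)                                   # simulated daily returns, 3 assets

mu, Sigma = R.mean(axis=0), np.cov(R, rowvar=False)
ones = np.ones(3)

# With a pure risk objective, the Gaussian-VaR minimizer over fully invested
# portfolios is the minimum-variance portfolio: w proportional to Sigma^-1 1.
w = np.linalg.solve(Sigma, ones)
w /= w.sum()

z = 1.645                                        # 95% one-sided normal quantile
var95 = -(w @ mu) + z * np.sqrt(w @ Sigma @ w)   # parametric 95% VaR
ew = ones / 3                                    # equal-weight benchmark
var95_ew = -(ew @ mu) + z * np.sqrt(ew @ Sigma @ ew)
print(w, var95, var95_ew)
```

In this toy setting the minimum-variance weighting attains a lower parametric VaR than the equal-weight benchmark, which is the sense in which a minimum-risk index can dominate a conventional one.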

Magnetization versus temperature in the temperature interval 2-200 K was measured for amorphous alloys of three different compositions: Fe81.5B14.5Si4, Fe40Ni38Mo4B18, and Co70Fe5Ni2Mo3B5Si15. The measurements were performed by means of a SQUID (superconducting quantum interference device) magnetometer. The aim was to extract information about the different mechanisms contributing to thermal demagnetization. A powerful data analysis technique based on successive minimization procedures has demonstrated that Stoner excitations of the strong ferromagnetic type play a significant role in the Fe-Ni alloy studied. The Fe-rich and Co-rich alloys do not show a measurable contribution from single-particle excitations.

Purpose/Objective(s): Letrozole radiosensitizes breast cancer cells in vitro. In clinical settings, no data exist for the combination of letrozole and radiotherapy. We assessed concurrent and sequential radiotherapy and letrozole in the adjuvant setting. Materials/Methods: The present study is registered with ClinicalTrials.gov, number NCT00208273. This Phase 2 randomized trial was undertaken in two centers in France and one in Switzerland between January 12, 2005, and February 21, 2007. One hundred fifty postmenopausal women with early-stage breast cancer were randomly assigned after conserving surgery to either concurrent radiotherapy and letrozole (n = 75) or sequential radiotherapy and letrozole (n = 75). Randomization was open label with a minimization technique, stratified by investigational center, chemotherapy (yes vs. no), radiation boost (yes vs. no), and value of radiation-induced lymphocyte apoptosis (RILA; ≤16% vs. >16%). The whole breast was irradiated to a total dose of 50 Gy in 25 fractions over 5 weeks. In the case of supraclavicular and internal mammary node irradiation, the dose was 44-50 Gy. Letrozole was administered orally once daily at a dose of 2.5 mg for 5 years (beginning 3 weeks before radiotherapy in the concomitant group, and 3 weeks after radiotherapy in the sequential group). The primary endpoint was the occurrence of acute (during and within 6 weeks of radiotherapy) and late (within 2 years) radiation-induced Grade 2 or worse toxic effects of the skin and lung (functional pulmonary test and lung CT scan). Analyses were by intention-to-treat.
The long-term follow-up after 2 years was performed only in Montpellier (n = 121) and evaluated skin toxicity (clinical examination every 6 months), lung fibrosis (one CT scan yearly), and cosmetic outcome. Results: All patients were analyzed apart from 1 in the concurrent group who withdrew consent before any treatment. Within the first 2 years (n = 149), no lung toxicity was identified by CT scan and no modification from baseline was noted by the lung diffusion capacity test. Two patients in each group had Grade 2 or worse late effects (both radiation-induced subcutaneous fibrosis [RISF]). After 2 years (n = 121), and with a median follow-up of 50 months (38-62), 2 patients (1 in each arm) presented a Grade 3 RISF. No lung toxicity was identified by CT scan. Cosmetic results (photographs) and quality of life were good to excellent. All patients who had Grade 3 subcutaneous fibrosis had an RILA value of 16% or less, irrespective of the sequence with letrozole. Conclusions: With long-term follow-up, letrozole can be safely delivered shortly after surgery and concomitantly with radiotherapy.
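The "minimization technique" used for randomization above refers to covariate-adaptive allocation in the style of Pocock and Simon: each new patient is assigned to whichever arm minimizes the total imbalance across their stratification-factor levels. The sketch below is a generic toy version (the trial's exact procedure, weights, and tie-breaking are not specified in the abstract; arm names and factors here are illustrative):

```python
import random

def minimization_assign(patient, allocations, arms=("concurrent", "sequential")):
    """Toy Pocock–Simon minimization: pick the arm that minimizes total
    imbalance over the patient's stratification-factor levels; break ties
    at random. Illustrative only, not the trial's exact algorithm."""
    def imbalance(arm):
        total = 0
        for factor, level in patient.items():
            counts = {a: sum(1 for p, a2 in allocations
                             if a2 == a and p[factor] == level)
                      for a in arms}
            counts[arm] += 1                      # hypothetical assignment
            total += max(counts.values()) - min(counts.values())
        return total
    scores = {a: imbalance(a) for a in arms}
    best = min(scores.values())
    return random.choice([a for a in arms if scores[a] == best])

random.seed(0)
allocations = []
for p in [{"center": "A", "chemo": "yes"}, {"center": "A", "chemo": "no"},
          {"center": "B", "chemo": "yes"}, {"center": "A", "chemo": "yes"}]:
    arm = minimization_assign(p, allocations)
    allocations.append((p, arm))
print([a for _, a in allocations])
```

Unlike pure stratified block randomization, minimization keeps all factor margins balanced simultaneously even with many strata and few patients per stratum.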

This paper presents a Bayesian approach to the design of transmit prefiltering matrices in closed-loop schemes that is robust to channel estimation errors. The algorithms are derived for a multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) system. Two different optimization criteria are analyzed: the minimization of the mean square error and the minimization of the bit error rate. In both cases, the transmitter design is based on the singular value decomposition (SVD) of the conditional mean of the channel response, given the channel estimate. The performance of the proposed algorithms is analyzed, and their relationship with existing algorithms is indicated. As with other previously proposed solutions, the minimum bit error rate algorithm converges to the open-loop transmission scheme for very poor channel state information (CSI) estimates.
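The role of the SVD in such transmitter designs can be sketched for a single narrowband subcarrier: precoding with V and receive-combining with U^H of H = U S V^H turns the MIMO channel into parallel scalar subchannels. The paper applies the SVD to the conditional mean of the channel given an imperfect estimate; the sketch below uses the true channel for clarity, with an invented 4x4 Rayleigh channel.

```python
import numpy as np

# SVD precoding over one MIMO-OFDM subcarrier (toy, perfect-CSI version).
rng = np.random.default_rng(3)
nt, nr = 4, 4
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
U, s, Vh = np.linalg.svd(H)          # H = U @ diag(s) @ Vh

x = np.array([1, -1, 1, 1], dtype=complex)   # symbols on the eigenmodes
tx = Vh.conj().T @ x                 # precode with V
rx = U.conj().T @ (H @ tx)           # combine with U^H: rx = s * x elementwise
print(np.round(rx / s, 6))           # recovers the symbols on each eigenmode
```

With only an estimated channel, the same structure applies to the SVD of the channel's conditional mean, which is what makes the design robust to estimation errors.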

The suitable timing of capacity investments is a notable issue, especially in capital-intensive industries. Despite its importance, fairly few studies have been published on the topic. In the present study, models for the timing of capacity change in capital-intensive industry are developed. The study considers mainly the optimal timing of single capacity changes. The review of earlier research describes connections between the cost, capacity and timing literature, and empirical examples are used to describe the starting point of the study and to test the developed models. The study includes four models, which describe the timing question from different perspectives. The first model, which minimizes unit costs, has been built for capacity expansion and replacement situations. It is shown that the optimal timing of an investment can be expressed with the capacity and cost advantage ratios. After the unit cost minimization model, the view is extended in the direction of profit maximization. The second model states that early investments are preferable if the change in fixed costs is small compared to the change in the contribution margin. The third model is a numerical discounted cash flow model, which emphasizes the roles of start-up time, capacity utilization rate and the value of waiting as drivers of the profitable timing of a project. The last model expands the view from the project level to the company level and connects the flexibility of assets and cost structures to the timing problem. The main results of the research are the solutions of the models and the analyses and simulations performed with them. The relevance and applicability of the results are verified by evaluating the logic of the models and by numerical cases.
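The core trade-off behind the numerical discounted-cash-flow model (the third model above) can be shown in a few lines: investing later defers the capital outlay but forgoes contribution margin in the meantime. All figures below are invented for illustration, not taken from the thesis.

```python
# Toy DCF comparison of two investment timings: invest now vs. wait one year.
# Invented numbers; illustrates the trade-off, not the thesis's actual model.
def npv(cashflows, rate):
    """Net present value of a cash-flow list, one entry per year from t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

rate = 0.10
capex, margin = 1000.0, 180.0        # investment outlay and yearly margin
horizon = 10                         # years of operation if built at t=0

invest_now = npv([-capex] + [margin] * horizon, rate)
# Waiting one year discounts the outlay but loses one year of margin.
invest_later = npv([0.0, -capex] + [margin] * (horizon - 1), rate)
print(invest_now, invest_later)
```

In this toy case the lost margin outweighs the deferred outlay, so early investment wins; with a smaller margin relative to capex, or with uncertainty that waiting resolves, the comparison can reverse — which is exactly the "value of waiting" the model emphasizes.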

This correspondence studies the formulation of members of the Cohen-Posch class of positive time-frequency energy distributions. Minimization of cross-entropy measures with respect to different priors, and the case of no prior or maximum entropy, were considered. It is concluded that, in general, the information provided by the classical marginal constraints is very limited, and thus the final distribution heavily depends on the prior distribution. To overcome this limitation, joint time and frequency marginals are derived based on a "direction invariance" criterion on the time-frequency plane that are directly related to the fractional Fourier transform.
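The point about the marginals being weakly informative can be made concrete: with only the classical time and frequency marginal constraints and no prior, the maximum-entropy positive distribution is simply the outer product of the marginals, carrying no joint time-frequency structure at all. A minimal numerical check, with invented toy marginals:

```python
import numpy as np

# Max-entropy positive TFD under the classical marginal constraints alone:
# the outer product of the marginals. Toy values, for illustration only.
t_marg = np.array([0.1, 0.4, 0.3, 0.2])   # instantaneous power |s(t)|^2
f_marg = np.array([0.25, 0.5, 0.25])      # spectral density |S(f)|^2

P = np.outer(t_marg, f_marg)              # maximum-entropy joint distribution
print(P.sum(axis=1))                      # recovers the time marginal
print(P.sum(axis=0))                      # recovers the frequency marginal
```

Any correlation between time and frequency must therefore come from the prior or from additional constraints, which motivates the fractional-domain marginals derived in the correspondence.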

The aim of the thesis was to show what effects the removal of employers' indirect labour costs would have on employment. The empirical part of the thesis was carried out as a survey study, implemented as a postal questionnaire sent in December 2004 to 810 companies in the Etelä-Karjala region. 141 companies responded, giving a response rate of 17%. The industry distribution of the responding companies matched that of the whole Etelä-Karjala region, where the largest sector was service businesses. The thesis differs from models previously proposed for improving employment. The first difference is the complete removal of indirect labour costs from companies, rather than a reduction of only a few percentage points. The second difference is that the removal is targeted only at companies whose employment would grow significantly. The purpose of these choices is to maximize the employment effects and to minimize the resulting costs, or to have them fully offset. The stated goals can be achieved by targeting the removal of indirect costs only at companies employing fewer than 5 people. For these companies, the number of potential new employees rises by 36 percent on average. According to the calculations, this means that the reduction in social insurance contributions caused by removing the indirect costs is offset by the income taxes paid by existing and new employees and by reduced unemployment benefits. The growth in new jobs implies a fall of more than 4 percentage points in the unemployment rate of Etelä-Karjala.

A study of the business opportunities of Finnish companies in the reduction of carbon dioxide emissions in Northwest Russia.

The changing business environment demands that chemical industrial processes be designed so that they can meet multi-objective requirements and enhance innovative design activities. The requirements and key issues for conceptual process synthesis have changed and are no longer those of conventional process design; there is an increased emphasis on innovative research to develop new concepts, novel techniques and processes. A central issue, how to enhance the creativity of the design process, requires further research into methodologies. The thesis presents a conflict-based methodology for conceptual process synthesis. The motivation of the work is to support decision-making in design and synthesis and to enhance the creativity of design activities. It deals with the multi-objective requirements and combinatorially complex nature of process synthesis. The work is carried out based on a new concept and design paradigm adapted from the Theory of Inventive Problem Solving (TRIZ) methodology. TRIZ is claimed to be a 'systematic creativity' framework thanks to its knowledge-based and evolutionarily directed nature. The conflict concept, when applied to process synthesis, throws new light on design problems and activities. The conflict model is proposed as a way of describing design problems and handling design information. The design tasks are represented as groups of conflicts, and a conflict table is built as the design tool. The general design paradigm is formulated to handle conflicts in both the early and detailed design stages. The methodology developed reflects the conflict nature of process design and synthesis. The method is implemented and verified through case studies of distillation system design, reactor/separator network design and waste minimization.
Handling the various levels of conflicts evolves possible design alternatives in a systematic procedure, which consists of establishing an efficient and compact solution space for the detailed design stage. The approach also provides the information needed to bridge the gap between the application of qualitative knowledge in the early stage and quantitative techniques in the detailed design stage. Enhancement of creativity is realized through the better understanding of design problems gained from the conflict concept and through the improvement in engineering design practice brought by the systematic nature of the approach.

Superheater corrosion causes vast annual losses for power companies. With a reliable corrosion prediction method, plants can be designed accordingly, and knowledge of fuel selection and determination of process conditions may be utilized to minimize superheater corrosion. Growing interest in using recycled fuels creates additional demands for the prediction of corrosion potential. Models depending on corrosion theories will fail if the relations between the inputs and the output are poorly known. A prediction model based on fuzzy logic and an artificial neural network is able to improve its performance as the amount of data increases. The corrosion rate of a superheater material can most reliably be detected with a test done in a test combustor or in a commercial boiler. The steel samples can be located in a special, temperature-controlled probe and exposed to the corrosive environment for a desired time. These tests give information about the average corrosion potential in that environment. Samples may also be cut from superheaters during shutdowns. The analysis of samples taken from probes or superheaters after exposure to a corrosive environment is a demanding task: if the corrosive contaminants can be reliably analyzed, the corrosion chemistry can be determined, and an estimate of the material lifetime can be given. In cases where the reason for corrosion is not clear, determining the corrosion chemistry and estimating the lifetime is more demanding. In order to provide a laboratory tool for the analysis and prediction, a new approach was chosen. During this study, the following tools were generated:
· A model for the prediction of superheater fireside corrosion, based on fuzzy logic and an artificial neural network, built upon a corrosion database developed from fuel and bed material analyses and measured corrosion data. The developed model predicts superheater corrosion with high accuracy at the early stages of a project.
· An adaptive corrosion analysis tool based on image analysis, constructed as an expert system. This system utilizes implementation of user-defined algorithms, which allows the development of an artificially intelligent system for the task. According to the results of the analyses, several new rules were developed for the determination of the degree and type of corrosion.
By combining these two tools, a user-friendly expert system for the prediction and analysis of superheater fireside corrosion was developed. This tool may also be used for the minimization of corrosion risks in the design of fluidized bed boilers.
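The data-driven prediction idea — learning the input–output relation from a corrosion database instead of a corrosion theory — can be illustrated with a tiny neural-network regression. Everything below is invented (features, target relation, network size); the actual model also uses fuzzy preprocessing and a real measured-corrosion database.

```python
import numpy as np

# Toy stand-in for the neural-network corrosion predictor: a small
# random-feature network (fixed random tanh hidden layer, least-squares
# output weights) fitted to an invented relation between fuel chlorine
# content, scaled material temperature, and corrosion rate.
rng = np.random.default_rng(4)
X = rng.uniform(0, 1, (200, 2))                  # [Cl fraction, scaled T], invented
y = 0.5 * X[:, 0] + 0.8 * X[:, 0] * X[:, 1]      # invented corrosion-rate relation

W1 = rng.standard_normal((2, 16))                # fixed random hidden weights
b1 = rng.standard_normal(16)
H = np.tanh(X @ W1 + b1)                         # hidden-layer features
F = np.column_stack([H, np.ones(len(X))])        # add an output bias column
w2, *_ = np.linalg.lstsq(F, y, rcond=None)       # least-squares output weights

pred = F @ w2
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(rmse)                                      # training fit on the toy data
```

As the abstract notes, the strength of such a model is that it keeps improving as more (fuel, bed material, measured corrosion) data accumulate, without requiring the corrosion chemistry to be fully understood.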