922 results for Wiener criterion test, criterion heat, calore, Laplace


Relevance: 40.00%

Abstract:

The new EPR reactor concept is designed to cope with accidents in which the reactor core melts and the melt penetrates the pressure vessel. An area inside the containment is designed where the melt is passively collected, retained and cooled. In this area a so-called core catcher is built from cast iron elements and flooded with water. The decay heat produced by the corium is transferred to the water, from which it is removed via the containment decay heat removal system. A large part of the heat is removed from the corium into the water above it, but to enhance heat transfer, water-filled cooling channels are also placed beneath the core catcher. To verify the operation of the core catcher, the Volley test facility has been built at Lappeenranta University of Technology. The facility consists of two full-scale cooling channels made of cast iron. In the facility, the decay heat produced by the corium is simulated with electric resistance heaters. This thesis describes how the simulations were performed and compares the calculated values with the measurements. The work focuses on the theory and mechanisms of heat transfer from the core catcher to the cooling channels. Three different correlations for the heat transfer coefficients in the case of pool boiling are presented. These correlations are particularly suitable for cases in which only a few measured parameters are known. The second part of the work is the simulation of the Volley 04 experiments. The simulation approach was first validated by comparing its results with the Volley 04 and 05 experiments that could be run to steady state and in which the coolant behaviour in the cooling channel was also recorded on video. The results of these simulations agree well with the measurements. At higher heating powers, water hammers occurred in the experiments and broke the windows used for video recording; for this reason, in some of the Volley 04 experiments the windows were covered with metal plates. Some experiments had to be interrupted because of large thermal stresses in the facility.
Simulations of such tests are not straightforward to perform. There is no visual observation of the water level, and the steady-state coolant temperatures are not known precisely, although some assumptions can be made on the basis of the Volley 05 experiments performed with the same parameters. The measurements from the Volley 04 and 05 experiments that were recorded on video and could be run to steady state gave temperatures very similar to those of the simulations. Extrapolating the interrupted experiments to steady state did not succeed well: the experiments had to be stopped so far before thermal-hydraulic equilibrium that the steady-state boundary conditions could not be predicted, and in the absence of video recording no additional information on the water level was available. From these results, mainly order-of-magnitude estimates of the temperatures at the measurement points can be given. These temperatures are, however, clearly below the melting temperature of the cast iron used in the core catcher. Based on the simulations it can therefore be stated that the cooling channel structures will not melt, provided there is even a small coolant flow in them and no more than a few adjacent channels are completely dry.
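The abstract above mentions three pool-boiling heat-transfer correlations without naming them. As a purely illustrative sketch, here is one well-known correlation of this kind, the Mostinski correlation (an assumption on my part, not necessarily one of the three used in the thesis), in the form with critical pressure in kPa and heat flux in W/m²:

```python
def mostinski_htc(q, p, p_crit):
    """Pool-boiling heat transfer coefficient [W/(m^2 K)] from the Mostinski
    correlation.  q: heat flux [W/m^2], p: system pressure [kPa],
    p_crit: critical pressure of the fluid [kPa]."""
    pr = p / p_crit                                        # reduced pressure
    f_p = 1.8 * pr**0.17 + 4.0 * pr**1.2 + 10.0 * pr**10   # pressure factor
    return 0.00417 * p_crit**0.69 * q**0.7 * f_p

# Water boiling at atmospheric pressure under a heat flux of 100 kW/m^2
h = mostinski_htc(q=1.0e5, p=101.325, p_crit=22064.0)
```

The appeal of correlations of this form, as the abstract notes, is that they need only a few parameters: here just the heat flux, the pressure, and one fluid property.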

Relevance: 40.00%

Abstract:

In this study, a model for the unsteady dynamic behaviour of a once-through counter-flow boiler that uses an organic working fluid is presented. The boiler is a compact waste-heat boiler without a furnace, and it has a preheater, a vaporiser and a superheater. The relative lengths of the boiler parts vary with the operating conditions, since they are all parts of a single tube. The present research is part of a study on the unsteady dynamics of an organic Rankine cycle power plant and will become part of a dynamic process model. The boiler model is presented using a selected example case that uses toluene as the process fluid and flue gas from natural gas combustion as the heat source. The dynamic behaviour of the boiler means transition from the steady initial state towards another steady state that corresponds to the changed process conditions. The chosen solution method was to find, using the finite difference method, the process-fluid pressure at which the mass of the process fluid in the boiler equals the mass calculated from the mass flows into and out of the boiler during a time step. A special method for fast calculation of the thermal properties was used, because most of the calculation time is spent on calculating the fluid properties. The boiler was divided into elements, and the values of the thermodynamic properties and mass flows were calculated in the nodes that connect the elements. Dynamic behaviour was limited to the process fluid and tube wall, and the heat source was regarded as steady. The elements that connect the preheater to the vaporiser and the vaporiser to the superheater were treated in a special way that allows a flexible change from one part to the other. The model consists of the calculation of the steady-state initial distribution of the variables in the nodes, and the calculation of these nodal values in a dynamic state.
The initial state of the boiler was obtained from a steady process model that is not part of the boiler model. The known boundary values that may vary during the dynamic calculation were the inlet temperatures and mass flow rates of both the heat source and the process fluid. A brief examination of oscillation around a steady state, the so-called Ledinegg instability, was carried out. This examination showed that the pressure drop in the boiler is a third-degree polynomial of the mass flow rate, and the stability criterion is a second-degree polynomial of the enthalpy change in the preheater. The numerical examination showed that oscillations did not exist in the example case. The dynamic boiler model was analysed for linear and step changes of the entering fluid temperatures and flow rates. The difficulty in verifying the correctness of the results was that there was no possibility to compare them with measurements, so the only option was to determine whether the obtained results were intuitively reasonable and changed logically when the boundary conditions were changed. Numerical stability was checked in a test run with no change in the input values; the differences compared with the initial values were so small that the effects of numerical oscillations were negligible. The heat-source-side tests showed that the model gives results that are logical in the directions of the changes, and the order of magnitude of the timescale of the changes is also as expected. The tests on the process fluid side showed that the model gives reasonable results both for temperature changes that cause small alterations in the process state and for mass flow rate changes causing very great alterations. The test runs showed that the dynamic model has no problems in calculating cases in which the temperature of the entering heat source suddenly drops below that of the tube wall or the process fluid.
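The solution idea described above can be sketched minimally: a finite-difference mass balance gives the fluid mass held in the boiler after a time step, and a root search recovers the pressure at which the stored mass equals that value. The equation of state and all numbers below are invented for illustration (the real model uses the fast property routines mentioned in the abstract):

```python
def density(p):
    """Toy equation of state: density rising with pressure (illustrative
    stand-in for the real thermal-property routines). p in kPa."""
    return 700.0 + 0.05 * p  # kg/m^3

def boiler_mass(p, volume=0.2):
    """Mass of process fluid held in the boiler at pressure p (toy model)."""
    return density(p) * volume

def solve_pressure(target_mass, p_lo=100.0, p_hi=5000.0, tol=1e-9):
    """Bisection for the pressure at which the stored mass equals the mass
    implied by the inlet/outlet flow balance over the time step."""
    f = lambda p: boiler_mass(p) - target_mass
    assert f(p_lo) < 0.0 < f(p_hi)
    while p_hi - p_lo > tol:
        mid = 0.5 * (p_lo + p_hi)
        if f(mid) < 0.0:
            p_lo = mid
        else:
            p_hi = mid
    return 0.5 * (p_lo + p_hi)

# One explicit time step of the mass balance, then recover the pressure
m_old, m_in, m_out, dt = 150.0, 2.0, 1.6, 1.0   # kg, kg/s, kg/s, s
m_new = m_old + (m_in - m_out) * dt             # finite-difference mass balance
p_new = solve_pressure(m_new)
```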

Relevance: 40.00%

Abstract:

This thesis examines the application of data envelopment analysis as an equity portfolio selection criterion in the Finnish stock market during the period 2001-2011. A sample of publicly traded firms in the Helsinki Stock Exchange is examined; the sample covers the majority of the publicly traded firms in the exchange. Data envelopment analysis is used to determine the efficiency of firms using a set of input and output financial parameters. The set of financial parameters consists of asset utilization, liquidity, capital structure, growth, valuation and profitability measures. The firms are divided into artificial industry categories because of the industry-specific nature of the input and output parameters. Comparable portfolios are formed inside each industry category according to the efficiency scores given by the DEA, and the performance of the portfolios is evaluated with several measures. The empirical evidence of this thesis suggests that, with certain limitations, data envelopment analysis can successfully be used as a portfolio selection criterion in the Finnish stock market when the portfolios are rebalanced annually according to the efficiency scores. However, when the portfolios were rebalanced every two or three years, the results were mixed and inconclusive.
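As an illustration of the efficiency scoring described above, here is a minimal sketch of the classical input-oriented CCR model in multiplier form, solved as a linear programme. The abstract does not specify which DEA formulation the thesis uses, and the firm data below are invented (two inputs, one output per firm):

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of unit k (multiplier form).
    X: (n, m) input matrix, Y: (n, s) output matrix; returns a score in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (s of them), then input weights v (m)
    c = np.concatenate([-Y[k], np.zeros(m)])           # maximise u . y_k
    A_ub = np.hstack([Y, -X])                          # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]])[None]   # normalisation v . x_k = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Toy data: 4 firms, 2 inputs (e.g. assets, costs), 1 output (e.g. revenue)
X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0], [5.0, 5.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
scores = [dea_efficiency(X, Y, k) for k in range(4)]
```

A portfolio rule of the kind the thesis tests would then rank firms within each industry category by `scores` and rebalance annually.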

Relevance: 40.00%

Abstract:

Although alcohol problems and alcohol consumption are related, consumption does not fully account for differences in vulnerability to alcohol problems; other factors must therefore account for these differences. Based on previous research, it was hypothesized that risky drinking behaviours, illicit and prescription drug use, affect and sex differences would account for differences in vulnerability to alcohol problems while statistically controlling for overall alcohol consumption. Four models were developed to test the predictive ability of these factors: three tested the predictor sets separately and a fourth tested them in a combined model. In addition, two distinct criterion variables were regressed on the predictors: one was a measure of the frequency with which participants experienced negative consequences that they attributed to their drinking, and the other was a measure of the extent to which participants perceived themselves to be problem drinkers. Each of the models was tested on four samples from different populations: first-year university students, university students in their graduating year, a clinical sample of people in treatment for addiction, and a community sample of young adults randomly selected from the general population. Overall, support was found for each of the models and each of the predictors in accounting for differences in vulnerability to alcohol problems. In particular, the frequency with which people become intoxicated, the frequency of illicit drug use and high levels of negative affect were strong and consistent predictors of vulnerability to alcohol problems across samples and criterion variables. With the exception of the clinical sample, the combined models predicted vulnerability to negative consequences better than vulnerability to problem drinker status. Among the clinical and community samples, the combined model predicted problem drinker status better than in the student samples.
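The analysis pattern described above (predicting an alcohol-problems criterion while statistically controlling for overall consumption) can be sketched as a hierarchical regression: fit consumption alone, then add the vulnerability predictors and look at the gain in variance explained. All variable names and effect sizes below are invented, synthetic stand-ins for the study's measures:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the study's variables (illustrative only)
consumption = rng.normal(size=n)            # overall alcohol consumption
intox_freq  = rng.normal(size=n)            # frequency of intoxication
neg_affect  = rng.normal(size=n)            # negative affect
problems = (0.4 * consumption + 0.5 * intox_freq + 0.3 * neg_affect
            + rng.normal(scale=0.8, size=n))  # criterion: alcohol problems

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Step 1: consumption only; step 2: add the vulnerability predictors
r2_base = r_squared(consumption[:, None], problems)
r2_full = r_squared(np.column_stack([consumption, intox_freq, neg_affect]),
                    problems)
delta_r2 = r2_full - r2_base   # variance explained beyond consumption
```

A nonzero `delta_r2` is the synthetic analogue of the study's finding that the predictors account for vulnerability over and above consumption.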

Relevance: 40.00%

Abstract:

Certain symptoms are unmistakable indicators of the very serious problems facing the Church. Although they also exist in various denominations and religions, only those concerning the Catholic Church are examined here. Among the most significant are a sharp decline in participation in religious activities such as Sunday Eucharistic celebrations, especially among the young; an almost catastrophic shortage of ordained priests; and a loss of prestige and influence of the Church's teaching. These symptoms vary in intensity from country to country, but the statistics indicate that they are multiplying. Many of these problems can be attributed to the extreme speed of the changes occurring everywhere and to the Church's apparent inability to adapt, owing in particular to its attachment to neo-scholastic thought and the Tridentine tradition. This absolute fidelity to a four-hundred-year-old tradition prevents it from adjusting to a rapidly and radically changing environment. Appropriate changes are needed practically everywhere in the Church. Yet for these changes to be effective and respectful of the Church's own nature, tradition alone is an insufficient guide. Drawing on the words of the encyclical Ecclesia de Eucharistia, "the decisive moment in which she (the Church) took shape is certainly that of the institution of the Eucharist, in the upper room," this thesis follows as closely as possible the interpretation given to Jesus' words, "this is my body," as they were first spoken. On that evidence, it may be affirmed that the defining characteristics of the Church flowing from these words are agape, unity and service. Such must be the guiding principle of the changes. It is on this basis that the areas where changes are needed, and the aspects concerned, are described.
These changes cover the following areas: liturgy, sacraments, catechesis, mystagogy, theology, structure, governance of the Church and its teachings, and evangelisation. These areas call for serious efforts in preparing the people affected by the changes and in attending to the primary requirement that agape, unity and service be the active and evident principles governing the Church.

Relevance: 40.00%

Abstract:

The no response test is a new scheme in inverse problems for partial differential equations which was recently proposed in [D. R. Luke and R. Potthast, SIAM J. Appl. Math., 63 (2003), pp. 1292–1312] in the framework of inverse acoustic scattering problems. The main idea of the scheme is to construct special probing waves which are small on some test domain. Then the response for these waves is constructed. If the response is small, the unknown object is assumed to be a subset of the test domain. The response is constructed from one, several, or many particular solutions of the problem under consideration. In this paper, we investigate the convergence of the no response test for the reconstruction of information about inclusions D from the Cauchy values of solutions to the Helmholtz equation on an outer surface $\partial\Omega$ with $\overline{D} \subset \Omega$. We show that the one-wave no response test provides a criterion to test the analytic extensibility of a field. In particular, we investigate the construction of approximations for the set of singular points $N(u)$ of the total field u from one given pair of Cauchy data. Thus, the no response test solves a particular version of the classical Cauchy problem. Also, if an infinite number of fields is given, we prove that a multifield version of the no response test reconstructs the unknown inclusion D. This is the first convergence analysis which could be achieved for the no response test.

Relevance: 40.00%

Abstract:

Most factorial experiments in industrial research form one stage in a sequence of experiments, and so considerable prior knowledge is often available from earlier stages. A Bayesian A-optimality criterion is proposed for choosing designs when each stage in experimentation consists of a small number of runs and the objective is to optimise a response. Simple formulae for the weights are developed, some examples of the use of the design criterion are given and general recommendations are made.
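A minimal sketch of a Bayesian A-optimality criterion of the general kind discussed above: minimise the trace of the posterior covariance $(X'X + R)^{-1}$, where the prior precision matrix R encodes the knowledge carried over from earlier stages. The paper's specific weight formulae are not reproduced here; the candidate runs and the prior below are invented:

```python
import numpy as np
from itertools import combinations

def bayes_a_criterion(X, R):
    """Bayesian A-optimality value: trace of the posterior covariance
    (X'X + R)^{-1}, up to the noise variance. Smaller is better."""
    return np.trace(np.linalg.inv(X.T @ X + R))

# Candidate runs: all 2^3 factorial points, with an intercept column
levels = np.array([[1, a, b, c]
                   for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])

# Prior precision: vague on the intercept, informative on the factor effects
R = np.diag([0.01, 1.0, 1.0, 1.0])

# Pick the best 4-run design out of all 4-run subsets of the 8 candidates
best = min(combinations(range(8), 4),
           key=lambda idx: bayes_a_criterion(levels[list(idx)], R))
X_best = levels[list(best)]
best_value = bayes_a_criterion(X_best, R)
```

Note that adding R keeps the criterion finite even for singular candidate designs, which is one practical attraction of the Bayesian form when stages have very few runs.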

Relevance: 40.00%

Abstract:

We describe, and make publicly available, two problem instance generators for a multiobjective version of the well-known quadratic assignment problem (QAP). The generators allow a number of instance parameters to be set, including those controlling epistasis and inter-objective correlations. Based on these generators, several initial test suites are provided and described. For each test instance we measure some global properties and, for the smallest ones, make some initial observations of the Pareto optimal sets/fronts. Our purpose in providing these tools is to facilitate the ongoing study of problem structure in multiobjective (combinatorial) optimization, and its effects on search landscape and algorithm performance.
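A toy sketch in the spirit of the generators described above (not the published generators themselves): build two flow matrices whose entries have a tunable inter-objective correlation, and evaluate the bi-objective cost vector of an assignment. The Gaussian construction of the correlation is my own simplification:

```python
import numpy as np

def make_mqap_instance(n, rho, seed=0):
    """Random bi-objective QAP instance: one distance matrix and two flow
    matrices whose entries are correlated with coefficient roughly rho
    (a simple Gaussian construction, not the published generators)."""
    rng = np.random.default_rng(seed)
    dist = rng.uniform(1.0, 10.0, size=(n, n))
    np.fill_diagonal(dist, 0.0)
    z1 = rng.normal(size=(n, n))
    z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.normal(size=(n, n))
    flow1, flow2 = np.abs(z1) * 10.0, np.abs(z2) * 10.0  # nonnegative flows
    np.fill_diagonal(flow1, 0.0)
    np.fill_diagonal(flow2, 0.0)
    return dist, flow1, flow2

def qap_objectives(perm, dist, flows):
    """Objective vector: sum_{i,j} flow_k[i,j] * dist[perm[i], perm[j]]."""
    d = dist[np.ix_(perm, perm)]
    return [float(np.sum(f * d)) for f in flows]

dist, f1, f2 = make_mqap_instance(n=8, rho=0.9)
costs = qap_objectives(np.arange(8), dist, [f1, f2])
```

High positive `rho` makes the two objectives agree on most assignments, shrinking the Pareto front; negative `rho` is what makes the multiobjective structure interesting.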

Relevance: 40.00%

Abstract:

A hybridised and Knowledge-based Evolutionary Algorithm (KEA) is applied to multi-criterion minimum spanning tree problems. Hybridisation is used across its three phases. In the first phase a deterministic single-objective optimization algorithm finds the extreme points of the Pareto front. In the second phase a K-best approach finds the first neighbours of the extreme points, which serve as an elitist parent population for an evolutionary algorithm in the third phase. A knowledge-based mutation operator is applied in each generation to reproduce individuals that are at least as good as the unique parent. The advantages of KEA over previous algorithms include its speed (making it applicable to large real-world problems), its scalability to more than two criteria, and its ability to find both the supported and unsupported optimal solutions.
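The first, deterministic phase described above can be illustrated with a small sketch: run Kruskal's algorithm once per criterion to obtain the extreme points of the Pareto front of a bi-criterion MST instance. The instance below is invented; the abstract does not say which single-objective algorithm KEA uses:

```python
def kruskal(n, edges, key):
    """Minimum spanning tree by Kruskal's algorithm; `key` selects which
    edge cost to minimise. Edges are (u, v, cost1, cost2) tuples."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    tree = []
    for e in sorted(edges, key=key):
        ru, rv = find(e[0]), find(e[1])
        if ru != rv:
            parent[ru] = rv
            tree.append(e)
    return tree

def cost_vector(tree):
    return (sum(e[2] for e in tree), sum(e[3] for e in tree))

# Bi-criterion instance on 4 nodes: (u, v, cost1, cost2)
edges = [(0, 1, 1, 4), (1, 2, 2, 3), (2, 3, 1, 5),
         (0, 2, 4, 1), (1, 3, 5, 1), (0, 3, 3, 2)]

# Phase-1 extreme points: optimise each criterion separately
extreme1 = cost_vector(kruskal(4, edges, key=lambda e: e[2]))
extreme2 = cost_vector(kruskal(4, edges, key=lambda e: e[3]))
```

These two cost vectors bound the Pareto front; the later evolutionary phases search for the supported and unsupported solutions lying between them.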

Relevance: 40.00%

Abstract:

This paper presents an efficient construction algorithm for obtaining sparse kernel density estimates based on a regression approach that directly optimizes model generalization capability. Computational efficiency of the density construction is ensured using an orthogonal forward regression, and the algorithm incrementally minimizes the leave-one-out test score. A local regularization method is incorporated naturally into the density construction process to further enforce sparsity. An additional advantage of the proposed algorithm is that it is fully automatic: the user is not required to specify any criterion to terminate the density construction procedure. This is in contrast to an existing state-of-the-art kernel density estimation method using the support vector machine (SVM), where the user is required to specify some critical algorithm parameter. Several examples are included to demonstrate the ability of the proposed algorithm to effectively construct a very sparse kernel density estimate with comparable accuracy to that of the full-sample optimized Parzen window density estimate. Our experimental results also demonstrate that the proposed algorithm compares favorably with the SVM method, in terms of both test accuracy and sparsity, for constructing kernel density estimates.
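A much-simplified sketch of the regression idea described above: treat the full Parzen window estimate as the regression target and greedily select a small subset of kernel centres to approximate it. The orthogonal decomposition, leave-one-out score and local regularisation of the actual algorithm are all omitted here; this only shows the density-construction-as-regression framing:

```python
import numpy as np

def gauss(x, c, h):
    """Gaussian kernel of width h centred at c, evaluated at x."""
    return np.exp(-0.5 * ((x - c) / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 100)])
h = 0.3

# Regression target: the full Parzen window estimate at the sample points
K = gauss(data[:, None], data[None, :], h)   # K[i, j] = kernel j at point i
target = K.mean(axis=1)

# Greedy forward selection of kernel centres against the Parzen target
selected = []
for _ in range(6):
    best_j, best_err = None, np.inf
    for j in range(len(data)):
        if j in selected:
            continue
        cols = K[:, selected + [j]]
        w, *_ = np.linalg.lstsq(cols, target, rcond=None)
        err = np.sum((cols @ w - target) ** 2)
        if err < best_err:
            best_j, best_err = j, err
    selected.append(best_j)

# Final nonnegative, normalised weights so the result is a valid density
w, *_ = np.linalg.lstsq(K[:, selected], target, rcond=None)
w = np.clip(w, 0.0, None)
w /= w.sum()
sparse_pdf = lambda x: sum(wi * gauss(x, data[j], h)
                           for wi, j in zip(w, selected))
```

Here 6 kernels stand in for 200, which is the kind of sparsity the paper's examples report, though the paper's selection criterion is the leave-one-out score rather than the raw residual used above.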

Relevance: 40.00%

Abstract:

We present a new subcortical structure shape modeling framework using heat kernel smoothing constructed with the Laplace-Beltrami eigenfunctions. The cotan discretization is used to numerically obtain the eigenfunctions of the Laplace-Beltrami operator along the surface of subcortical structures of the brain. The eigenfunctions are then used to construct the heat kernel, which is used to smooth out measurement noise along the surface. The proposed framework is applied to investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shape. We detected a significant age effect on the hippocampus, in accordance with previous studies. In addition, we also detected a significant gender effect on the amygdala. Since we did not find any such differences with traditional volumetric methods, our results demonstrate the benefit of the current framework over those methods.