15 results for Generalized spike-and-wave discharges

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises an extended character of representation. The human mind is not a passive receiver of external information, but actively construes intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible in the Cartesian Subject-Object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which, classically understood, is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way to express knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God.
The reaction against epistemological Cartesianism, metaphysics of substance and deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Relevance: 100.00%

Abstract:

Negative refractive index materials, and the propagation of electromagnetic waves in them, began to attract scientific attention only relatively recently. This review highlights historically important and recent papers on practical and theoretical aspects of the subject. Specifically, it considers the basic properties and peculiarities of such materials, relating both to their design and to wave propagation in them, the experimental verification of predictions made for them theoretically, and possible practical applications and prospects in this area.
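
As background for the "basic properties" mentioned above, a standard textbook relation (assumed here; it is not stated in the abstract): when both the relative permittivity and permeability are negative, the negative branch of the refractive index applies, and Snell's law then places the refracted ray on the same side of the normal as the incident ray.

```latex
n = -\sqrt{\varepsilon_r \mu_r}
\quad (\varepsilon_r < 0,\ \mu_r < 0),
\qquad
n_1 \sin\theta_1 = n_2 \sin\theta_2
\;\Rightarrow\; \theta_2 < 0 \ \text{when } n_2 < 0 .
```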

Relevance: 100.00%

Abstract:

This thesis studies the properties and usability of operators called t-norms, t-conorms and uninorms, as well as many-valued implications and equivalences. Weights and a generalized mean are embedded into these operators for aggregation, and the resulting operators are used for comparison tasks; for this reason they are referred to as comparison measures. The thesis illustrates how these operators can be weighted with differential evolution and aggregated with a generalized mean, and the kinds of comparison measures that can be obtained from this procedure. New operators suitable for comparison measures are suggested: combination measures based on the use of t-norms and t-conorms, the generalized 3_-uninorm, and pseudo-equivalence measures based on S-type implications. The empirical part of the thesis demonstrates how these new comparison measures work in classification, for example in the classification of medical data. The second application area is from the field of sports medicine and is an expert system for defining an athlete's aerobic and anaerobic thresholds. The core of the thesis offers definitions for comparison measures and shows that there is no real difference between the results achieved in comparison tasks by comparison measures based on distance and by comparison measures based on many-valued logical structures. The approach in this thesis has been highly practical, and all use of the measures has been validated mainly by practical testing. In general, many different types of operators suitable for comparison tasks have been presented in the fuzzy logic literature, but there has been little or no experimental work with these operators.
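
The following is a minimal, self-contained sketch (not code from the thesis) of the kind of comparison measure described above: feature-wise many-valued equivalences, here the Łukasiewicz equivalence 1 - |x - y|, combined through a weighted generalized mean. The function names, weights and parameter values are illustrative assumptions; in the thesis the weights are optimized with differential evolution, which is omitted here.

```python
import numpy as np

def generalized_mean(values, weights, m):
    """Weighted generalized (power) mean: (sum_i w_i * x_i**m) ** (1/m);
    m = 1 gives the arithmetic mean, m -> 0 the geometric mean, m = -1 the harmonic mean."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    if np.isclose(m, 0.0):
        return float(np.exp(np.sum(weights * np.log(values))))
    return float(np.sum(weights * values ** m) ** (1.0 / m))

def lukasiewicz_equivalence(x, y):
    """Many-valued equivalence from the Lukasiewicz structure: x <-> y = 1 - |x - y| on [0, 1]."""
    return 1.0 - abs(x - y)

def comparison_measure(a, b, weights, m=1.0):
    """Compare two [0, 1]-scaled feature vectors: feature-wise equivalences
    aggregated with a weighted generalized mean."""
    per_feature = [lukasiewicz_equivalence(ai, bi) for ai, bi in zip(a, b)]
    return generalized_mean(per_feature, weights, m)

# Example: similarity of a sample to a class "ideal vector". In a classifier,
# the weights and m would be tuned (e.g. by differential evolution); here they are fixed.
sample = [0.2, 0.8, 0.5]
ideal = [0.3, 0.9, 0.4]
print(comparison_measure(sample, ideal, weights=[0.5, 0.3, 0.2], m=2.0))
```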

Relevance: 100.00%

Abstract:

The uncertainty of any analytical determination depends on both the analysis and the sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy's sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work Gy's sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (Scanning Electron Microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy's sampling theory can be utilized in both of the above-mentioned cases and that the uncertainties obtained are reliable. Variographic experiments, introduced in Gy's sampling theory, can be applied beneficially to analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of these kinds of processes can be observed by variographic analysis as well as with the fast Fourier transform and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, since it is then easy to estimate how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources are used; on the other hand, if the frequency is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as Principal Component Analysis (PCA) are needed when the data are analyzed. Optimization of a sampling plan increases the reliability of the analytical process, which may in the end have beneficial effects on the economics of chemical analysis.
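
As an illustration of the variographic idea (a minimal sketch, not code from the work; the synthetic series and lag range are invented), the experimental variogram of an evenly spaced data series can be computed as follows and read as uncertainty versus sampling interval.

```python
import numpy as np

def experimental_variogram(x, max_lag):
    """Experimental variogram of an evenly spaced series x:
    V(j) = 1 / (2 * (N - j)) * sum_i (x[i + j] - x[i])**2, for j = 1 .. max_lag."""
    x = np.asarray(x, dtype=float)
    lags = np.arange(1, max_lag + 1)
    v = np.array([np.mean((x[j:] - x[:-j]) ** 2) / 2.0 for j in lags])
    return lags, v

# Synthetic auto-correlated "process" series with drift and a periodic component;
# in practice x would be measured process or discharge data.
rng = np.random.default_rng(0)
t = np.arange(500)
x = 10 + 0.05 * np.cumsum(rng.normal(0, 1, 500)) + 0.5 * np.sin(t / 20)
lags, v = experimental_variogram(x, max_lag=60)

# V(j) at lag j indicates how much variability a sampling interval of j units carries;
# a rising variogram means longer intervals give larger uncertainty, and a periodic
# variogram reveals cyclic process behaviour.
print(np.round(v[:5], 4))
```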

Relevance: 100.00%

Abstract:

The acceleration of solar energetic particles (SEPs) by flares and coronal mass ejections (CMEs) has been a major topic of research for the solar-terrestrial physics and geophysics communities for decades. This thesis discusses theories describing first-order Fermi acceleration of SEPs through repeated crossings of a CME-driven shock. We propose that particle trapping occurs through self-generated Alfvén waves, leading to a turbulent trapping region in front of the shock. Decelerating coronal shocks are shown to be capable of efficient SEP acceleration, provided seed particle injection is sufficient. Quasi-parallel shocks are found to inject thermal particles with good efficiency. The roles of minimum injection velocities, cross-field diffusion, downstream scattering efficiency and the cross-shock potential are investigated in detail, with downstream isotropisation timescales having a major effect on injection efficiency. Heavier elements up to iron are found to exhibit significantly harder accelerated spectra than protons. The cut-off energies of the accelerated spectra are found to scale in proportion to (Q/A)^1.5, which is explained through analysis of the spectral shape of the amplified Alfvénic turbulence. Acceleration times to different threshold energies are found to be non-linear, indicating that self-consistent time-dependent simulations are required in order to expose the full extent of the acceleration dynamics. The well-established quasilinear theory (QLT) of particle scattering is investigated by comparing QLT scattering coefficients with those found via full-orbit simulations. QLT is found to overemphasise resonance conditions. This finding supports the simplifications implemented in the presented coronal shock acceleration (CSA) simulation software. The CSA software package is used to simulate a range of acceleration scenarios. The results are found to be in agreement with well-established particle acceleration theory. At the same time, new spatial and temporal dynamics of particle population trapping and wave evolution are revealed.
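
A worked illustration of the quoted cut-off scaling (the iron charge state Q ≈ 14 and the per-nucleon comparison are assumed here for concreteness; they are not taken from the abstract):

```latex
\frac{E_{\mathrm{c}}(\mathrm{Fe})}{E_{\mathrm{c}}(\mathrm{p})}
  \approx \left(\frac{(Q/A)_{\mathrm{Fe}}}{(Q/A)_{\mathrm{p}}}\right)^{1.5}
  = \left(\frac{14/56}{1/1}\right)^{1.5}
  = 0.25^{1.5}
  \approx 0.13
```

so under these assumptions an iron ion would reach a cut-off energy per nucleon roughly an eighth of the proton value in the same event.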

Relevance: 40.00%

Abstract:

A rigorous unit operation model is developed for vapor membrane separation. The new model is able to describe temperature-, pressure- and concentration-dependent permeation, as well as real fluid effects, in vapor and gas separation with hydrocarbon-selective rubbery polymeric membranes. Permeation through the membrane is described by a separate treatment of sorption and diffusion within the membrane. Chemical engineering thermodynamics is used to describe the equilibrium sorption of vapors and gases in rubbery membranes with equation-of-state models for polymeric systems. A new modification of the UNIFAC model is also proposed for this purpose. Various thermodynamic models are extensively compared in order to verify their ability to predict and correlate experimental vapor-liquid equilibrium data. The penetrant transport through the selective layer of the membrane is described with the generalized Maxwell-Stefan equations, which are able to account for the bulk flux contribution as well as the diffusive coupling effect. A method is described to compute and correlate binary penetrant-membrane diffusion coefficients from experimental permeability coefficients at different temperatures and pressures. A fluid flow model for spiral-wound modules is derived from the conservation equations of mass, momentum and energy. The conservation equations are presented in discretized form using the control volume approach. A combination of the permeation model and the fluid flow model yields the desired rigorous model for vapor membrane separation. The model is implemented in an in-house process simulator, so vapor membrane separation may be evaluated as an integral part of a process flowsheet.
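
The module-level control-volume idea can be sketched in a few lines. This is an illustrative simplification, not the thesis model: constant permeances and a neglected permeate-side partial pressure stand in for the Maxwell-Stefan treatment, and all component names and parameter values are assumed. The feed is marched through a series of control volumes, each removing a permeated amount from the feed-side molar flows.

```python
# Minimal control-volume sketch of feed-side depletion along a membrane module.
n_cv = 50                                   # control volumes along the module
area_cv = 2.0 / n_cv                        # membrane area per control volume [m2], assumed
permeance = {"C4H10": 3e-7, "N2": 1e-8}     # [mol / (m2 s Pa)], assumed values
p_feed = 8e5                                # feed-side pressure [Pa]

flow = {"C4H10": 0.02, "N2": 0.18}          # feed molar flows entering the module [mol/s]
for _ in range(n_cv):
    total = sum(flow.values())
    for comp, q in permeance.items():
        y = flow[comp] / total              # feed-side mole fraction in this control volume
        # Permeate-side partial pressure neglected (vacuum / sweep assumption).
        permeated = q * area_cv * y * p_feed
        flow[comp] = max(flow[comp] - permeated, 0.0)

print({comp: round(f, 4) for comp, f in flow.items()})   # retentate flows leaving the module
```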

Relevance: 40.00%

Abstract:

An accidental burst of a pressure vessel is an uncontrollable and explosion-like batch process; in this study it is called an explosion. The destructive effect of a pressure vessel explosion is related to the amount of energy released in it. However, in the field of pressure vessel safety, a mutual understanding concerning the definition of explosion energy has not yet been achieved. In this study the concept of isentropic exergy is defined. Isentropic exergy is the greatest possible destructive energy that can be obtained from a pressure vessel explosion when its state changes in an isentropic way from the initial to the final state; after the change process, the gas has the same pressure and flow velocity as the environment. Isentropic exergy differs from common exergy in that the process is assumed to be isentropic and the final gas temperature usually differs from the ambient temperature. The explosion process is so fast that there is no time for the significant heat exchange presupposed by common exergy; therefore an explosion is better characterized by isentropic exergy. Isentropic exergy is a characteristic of a pressure vessel and is simple to calculate. Isentropic exergy can also be defined for any thermodynamic system, such as the shock wave system developing around an exploding pressure vessel. At the beginning of the explosion process the shock wave system has the same isentropic exergy as the pressure vessel. When the system expands to the environment, its isentropic exergy decreases because of the increase of entropy in the shock wave. The shock wave system contains the pressure vessel gas and a growing amount of ambient gas. The destructive effect of the shock wave on surrounding structures decreases as its distance from the starting point increases. This arises firstly from the fact that the shock wave system is distributed over a larger space and secondly because the increase of entropy in the shock waves reduces the amount of isentropic exergy. Equations concerning the change of isentropic exergy in shock waves are derived. By means of isentropic exergy and known flow theories, equations giving the pressure of the shock wave as a function of distance are derived. A method is proposed as an application of these equations; it is applicable to all shapes of pressure vessels in general use, such as spheres, cylinders and tubes. The results of this method are compared to measurements made by various researchers and to accident reports on pressure vessel explosions. The test measurements are found to agree with the proposed method, and the findings in the accident reports do not contradict it.
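
For an ideal gas, the quantity described above reduces to the familiar isentropic expansion energy. The following minimal sketch (an assumption for illustration, not necessarily the exact formulation used in the thesis) computes it for a compressed-air vessel from the initial pressure, vessel volume and ambient pressure.

```python
def isentropic_exergy_ideal_gas(p1, v1, p0=101_325.0, gamma=1.4):
    """Greatest work obtainable from an ideal gas expanding isentropically from
    vessel pressure p1 [Pa] and volume v1 [m3] down to ambient pressure p0 [Pa]:
        E = p1 * v1 / (gamma - 1) * (1 - (p0 / p1) ** ((gamma - 1) / gamma))
    i.e. the decrease of internal energy during the reversible adiabatic expansion."""
    return p1 * v1 / (gamma - 1.0) * (1.0 - (p0 / p1) ** ((gamma - 1.0) / gamma))

# Example: a 2 m3 air receiver at 10 bar absolute.
print(f"{isentropic_exergy_ideal_gas(10e5, 2.0) / 1e6:.2f} MJ")
```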

Relevance: 40.00%

Abstract:

In this thesis the interaction of an electromagnetic field and matter is studied from various aspects within the general framework of cold atoms. Our subjects cover a wide spectrum of phenomena ranging from semiclassical few-level models to fully quantum mechanical interaction with structured reservoirs leading to non-Markovian open quantum system dynamics. Within closed quantum systems, we propose a selective method to manipulate the motional state of atoms in a time-dependent double-well potential and interpret the method in terms of adiabatic processes. We also derive a simple wave-packet model, based on distributions of generalized eigenstates, explaining the finite visibility of interference in overlapping continuous-wave atom lasers. In the context of open quantum systems, we develop an unraveling of non-Markovian dynamics in terms of piecewise deterministic quantum jump processes confined to the Hilbert space of the reduced system - the non-Markovian quantum jump method. As examples, we apply it to simple two- and three-level systems interacting with a structured reservoir. Finally, in the context of ion-cavity QED we study entanglement generation based on collective Dicke modes under experimentally realistic conditions, including photonic losses and atomic spontaneous decay.
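
As background for the unraveling idea, the sketch below shows the standard Markovian Monte Carlo wave-function (quantum-jump) method for a decaying two-level atom, i.e. only the textbook baseline that the thesis's non-Markovian quantum jump method generalizes; all parameter values are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
gamma, dt, steps, n_traj = 1.0, 0.01, 400, 2000   # decay rate, time step, steps, trajectories
pop_e = np.zeros(steps)                            # ensemble-averaged excited-state population

for _ in range(n_traj):
    c_g, c_e = 1 / np.sqrt(2) + 0j, 1 / np.sqrt(2) + 0j   # start in an equal superposition
    for k in range(steps):
        pop_e[k] += abs(c_e) ** 2 / n_traj
        if rng.random() < gamma * dt * abs(c_e) ** 2:      # quantum jump: photon emitted
            c_g, c_e = 1.0 + 0j, 0.0 + 0j
        else:                                              # no-jump evolution under H_eff
            c_e *= np.exp(-gamma * dt / 2)
            norm = np.sqrt(abs(c_g) ** 2 + abs(c_e) ** 2)
            c_g, c_e = c_g / norm, c_e / norm

# The trajectory average should follow 0.5 * exp(-gamma * t); print a few points to check.
print(np.round(pop_e[::100], 3))
```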

Relevance: 40.00%

Abstract:

US patent no. US 7,908,854 B2

Relevance: 40.00%

Relevance: 40.00%

Abstract:

Background: Type 2 diabetes patients have a 2- to 4-fold risk of cardiovascular disease (CVD) compared to the general population. In type 2 diabetes, several CVD risk factors have been identified, including obesity, hypertension, hyperglycemia, proteinuria, a sedentary lifestyle and dyslipidemia. Although much of the excess CVD risk can be attributed to these risk factors, a significant proportion remains unexplained.

Aims: To assess, in middle-aged type 2 diabetic subjects, the joint relations of several conventional and non-conventional CVD risk factors to cardiovascular and total mortality.

Subjects and methods: This thesis is part of a large prospective, population-based East-West type 2 diabetes study launched in 1982-1984. It includes 1,059 middle-aged (45-64 years old) participants. At baseline, a thorough clinical examination and laboratory measurements were performed and an ECG was recorded. The latest follow-up study was performed 18 years later, in January 2001 (when the subjects were 63-81 years old). The study endpoints were total mortality and mortality due to CVD, coronary heart disease (CHD) and stroke.

Results: Physically more active patients had significantly reduced total, CVD and CHD mortality independent of high-sensitivity C-reactive protein (hs-CRP) levels, unless proteinuria was present. Among physically active patients with an hs-CRP level >3 mg/L, the prognosis for CVD mortality was similar to that of patients with hs-CRP levels ≤3 mg/L. The worst prognosis was among physically inactive patients with hs-CRP levels >3 mg/L. Physically active patients with proteinuria had significantly increased total and CVD mortality in multivariate analyses. After adjustment for confounding factors, patients with proteinuria and a systolic BP <130 mmHg had a significant increase in total and CVD mortality compared to those with a systolic BP between 130 and 160 mmHg. The prognosis was similar in patients with a systolic BP <130 mmHg and ≥160 mmHg. Among patients without proteinuria, a systolic BP <130 mmHg was associated with a non-significant reduction in mortality. A P wave duration ≥114 ms was associated with a 2.5-fold increase in stroke mortality among patients with prevalent CHD or claudication. This finding persisted in multivariable analyses. Among patients with no comorbidities, there was no relationship between P wave duration and stroke mortality.

Conclusions: Physical activity reduces total and CVD mortality in patients with type 2 diabetes without proteinuria or with elevated levels of hs-CRP, suggesting that the anti-inflammatory effect of physical activity can counteract the increased CVD morbidity and mortality associated with a high CRP level. In patients with proteinuria, however, this protective effect was not present. Among patients with proteinuria, a systolic BP <130 mmHg may increase mortality due to CVD. These results demonstrate the importance of early intervention to prevent CVD and to control all-cause mortality among patients with type 2 diabetes. The presence of proteinuria should be taken into account when defining the target systolic BP level for prevention of CVD deaths. A prolonged P wave duration was associated with increased stroke mortality among high-risk patients with type 2 diabetes. P wave duration is easy to measure and merits further examination to evaluate its importance for estimating the risk of stroke among patients with type 2 diabetes.