783 results for Fundamentals of computing theory
Abstract:
We report on the onset of fluid entrainment when a contact line is forced to advance over a dry solid of arbitrary wettability. We show that entrainment occurs at a critical advancing speed beyond which the balance between capillary, viscous, and contact-line forces sustaining the shape of the interface is no longer satisfied. Wetting couples to the hydrodynamics by setting both the morphology of the interface at small scales and the viscous friction of the front. We find that the critical deformation that the interface can sustain is controlled by the friction at the contact line and the viscosity contrast between the displacing and displaced fluids, leading to a rich variety of wetting-entrainment regimes. We discuss the potential use of our theory to measure contact-line forces using atomic force microscopy and to study entrainment under microfluidic conditions exploiting colloid-polymer fluids of ultralow surface tension.
Abstract:
By means of computer simulations and solution of the equations of the mode coupling theory (MCT), we investigate the role of the intramolecular barriers on several dynamic aspects of nonentangled polymers. The investigated dynamic range extends from the caging regime characteristic of glass-formers to the relaxation of the chain Rouse modes. We review our recent work on this question, provide new results, and critically discuss the limitations of the theory. Solutions of the MCT for the structural relaxation reproduce qualitative trends of the simulations for weak and moderate barriers. However, a progressive discrepancy is revealed as the limit of stiff chains is approached. This disagreement does not seem to be related to dynamic heterogeneities, which indeed are not enhanced by increasing the barrier strength. Nor is it connected with the breakdown of the convolution approximation for three-point static correlations, which retains its validity for stiff chains. These findings suggest the need for an improvement of the MCT equations for polymer melts. Concerning the relaxation of the chain degrees of freedom, MCT provides a microscopic basis for time scales from chain reorientation down to the caging regime. It rationalizes, from first principles, the observed deviations from the Rouse model on increasing the barrier strength. These include anomalous scaling of relaxation times, long-time plateaux, and a nonmonotonic wavelength dependence of the mode correlators.
Abstract:
The theory of language has occupied a special place in the history of Indian thought. Indian philosophers give particular attention to the analysis of the cognition obtained from language, known under the generic name of śābdabodha. This term is used to denote, among other things, the cognition episode of the hearer, the content of which is described in the form of a paraphrase of a sentence represented as a hierarchical structure. Philosophers submit the meaning of the component items of a sentence and their relationship to a thorough examination, and represent the content of the resulting cognition as a paraphrase centred on a meaning element that is taken as the principal qualificand (mukhyaviśesya) and is qualified by the other meaning elements. This analysis is the object of continuous debate over a period of more than a thousand years between the philosophers of the schools of Mīmāmsā, Nyāya (mainly in its Navya form) and Vyākarana. While these philosophers are in complete agreement on the idea that the cognition of sentence meaning has a hierarchical structure, and share the concept of a single principal qualificand (qualified by other meaning elements), they strongly disagree on the question of which meaning element has this role and by which morphological item it is expressed. This disagreement is the central point of their debate and gives rise to competing versions of the theory. The Mīmāmsakas argue that the principal qualificand is what they call bhāvanā, 'bringing into being', 'efficient force' or 'productive operation', expressed by the verbal affix and distinct from the specific procedures signified by the verbal root; the Naiyāyikas generally take it to be the meaning of the word with the first case ending, while the Vaiyākaranas take it to be the operation expressed by the verbal root.
All the participants rely on the Pāninian grammar, insofar as the Mīmāmsakas and Naiyāyikas do not compose a new grammar of Sanskrit, but they use different interpretive strategies to justify their views, which are often in overt contradiction with the interpretation of the Pāninian rules accepted by the Vaiyākaranas. In each of the three positions, weakness in one area is compensated by strength in another, and the cumulative force of the total argumentation shows that no position can be declared correct or overall superior to the others. This book is an attempt to understand this debate, and to show that, to make full sense of the irreconcilable positions of the three schools, one must go beyond linguistic factors and consider the very beginnings of each school's concern with the issue under scrutiny. The texts, and particularly the late texts of each school, present very complex versions of the theory, yet the key to understanding why these positions remain irreconcilable seems to lie elsewhere, in spite of extensive argumentation involving a great deal of linguistic and logical technicalities. Historically, the theory arises in Mīmāmsā (with Sabara and Kumārila), then in Nyāya (with Udayana), in a doctrinal and theological context, as a byproduct of the debate over Vedic authority. The Navya-Vaiyākaranas enter this debate last (with Bhattoji Dīksita and Kaunda Bhatta), with the declared aim of refuting the arguments of the Mīmāmsakas and Naiyāyikas by bringing to light the shortcomings in their understanding of Pāninian grammar. The central argument has focused on the capacity of the initial contexts, with the network of issues to which the principal qualificand theory is connected, to render intelligible the presuppositions and aims behind the complex linguistic justification of the classical and late stages of this debate.
Reading the debate in this light not only reveals the rationality and internal coherence of each position beyond the linguistic arguments, but makes it possible to understand why the thinkers of the three schools have continued to hold on to three mutually exclusive positions. They are defending not only their version of the principal qualificand theory, but (though not openly acknowledged) the entire network of arguments, linguistic and/or extra-linguistic, to which this theory is connected, as well as the presuppositions and aims underlying these arguments.
Abstract:
Performance measurements of computer system components and software yield information that can be used to improve performance and to support hardware purchasing decisions. This thesis examines performance measurement and measurement programs, i.e. so-called benchmark suites. Various freely available benchmark programs of different types, suitable for analyzing the performance of a Linux computing cluster, were sought and evaluated. The benchmarks were grouped and assessed by testing their features on a Linux cluster. The thesis also discusses the challenges of performing measurements and of parallel computing. Benchmarks were found for many purposes, and they proved to vary in quality and scope. They have also been collected into software packages in order to give a broader picture of hardware performance than a single program can provide. It is essential to understand the rate at which data can be transferred to the processor from main memory, from disk systems, and from other compute nodes. A typical benchmark program contains a computationally intensive mathematical algorithm of the kind used in scientific software. Depending on the benchmark, understanding and exploiting the results can be challenging.
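As a minimal illustration of the kind of measurement such tools make (not one of the benchmark suites surveyed in the thesis; the function name and buffer sizes are invented for this sketch), the memory-to-processor transfer rate highlighted above can be estimated by timing large buffer copies:

```python
# Illustrative microbenchmark: estimate sustained memory copy bandwidth
# by timing repeated copies of a large buffer. Real suites (e.g. STREAM-style
# kernels) do this in compiled code; a Python sketch only gives a rough figure.
import array
import time

def copy_bandwidth_mb_s(n_bytes: int = 64 * 1024 * 1024, repeats: int = 5) -> float:
    """Time repeated full copies of an n_bytes buffer and return MB/s."""
    src = array.array('b', bytes(n_bytes))
    best = float('inf')
    for _ in range(repeats):
        t0 = time.perf_counter()
        dst = src[:]              # full copy: reads n_bytes, writes n_bytes
        best = min(best, time.perf_counter() - t0)
    # 2 * n_bytes moved per copy (one read stream, one write stream)
    return (2 * n_bytes) / best / 1e6

if __name__ == "__main__":
    print(f"~{copy_bandwidth_mb_s():.0f} MB/s sustained copy bandwidth")
```

Taking the best of several repeats reduces the influence of scheduling noise, one of the measurement challenges the thesis discusses.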
Abstract:
The effect of the heat flux on the rate of chemical reaction in dilute gases is shown to be important for reactions characterized by high activation energies and in the presence of very large temperature gradients. This effect, obtained from the second-order terms in the distribution function (similar to those obtained in the Burnett approximation to the solution of the Boltzmann equation), is derived on the basis of information theory. It is shown that the analytical results describing the effect are simpler if the kinetic definition of the nonequilibrium temperature is introduced than if the thermodynamic definition is introduced. The numerical results are nearly the same for both definitions.
Abstract:
In his version of the theory of multicomponent systems, Friedman used the analogy that exists between the virial expansion for the osmotic pressure obtained from the McMillan-Mayer (MM) theory of solutions in the grand canonical ensemble and the virial expansion for the pressure of a real gas. For the calculation of the thermodynamic properties of the solution, Friedman proposed a definition for the "excess free energy" that is reminiscent of the ancient idea of the "osmotic work". However, the precise meaning to be attached to his free energy is, among other reasons, not well defined, because in osmotic equilibrium the solution is not a closed system, and for a given process the total amount of solvent in the solution varies. In this paper, an analysis based on thermodynamics is presented in order to obtain an exact and precise definition of Friedman's excess free energy and of its use in comparisons with experimental data.
Abstract:
Ultrasound image reconstruction from the echoes received by an ultrasound probe after the transmission of diverging waves is an active area of research because of its capacity to insonify large regions of interest at ultra-high frame rate using small phased arrays such as those used in echocardiography. Current state-of-the-art techniques are based on the emission of diverging waves and the use of delay-and-sum strategies applied to the received signals to reconstruct the desired image (DW/DAS). Recently, we introduced the Ultrasound Fourier Slice Imaging (UFSI) theory for the reconstruction of ultrafast images from linear acquisitions. In this study, we extend this theory to sectorial acquisition thanks to the introduction of an explicit and invertible spatial transform. Starting from a diverging wave, we show that the direct use of UFSI theory, along with the application of the proposed spatial transform, allows reconstruction of the insonified medium in the conventional Cartesian space. Simulations and experiments reveal the capacity of this new approach to obtain ultrafast images of a quality competitive with the current reference method.
Abstract:
New economic and enterprise needs have increased the interest in, and utility of, grouping methods based on the theory of uncertainty. A fuzzy grouping (clustering) process is a key phase of knowledge acquisition and of complexity reduction across different groups of objects. Here, we consider some elements of the theory of affinities and of uncertain pretopology, which form a significant support tool for a fuzzy clustering process. A Galois lattice is introduced in order to provide a clearer vision of the results. We carried out a homogeneous grouping of the economic regions of the Russian Federation and Ukraine. The results give a broad panorama of the regional economic situation of the two countries, as well as key guidelines for decision-making. The mathematical method is very sensitive to any changes in the regional economy. We thus provide an alternative method for grouping under uncertainty.
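The core alpha-cut idea behind such fuzzy grouping can be sketched as follows: regions whose pairwise fuzzy affinity meets a threshold alpha are linked, and the connected components of the resulting graph form the homogeneous groups. (The paper's full machinery of affinities, pretopology, and Galois lattices is much richer than this; the function, the toy matrix, and the threshold below are invented for illustration.)

```python
# Alpha-cut grouping sketch: link items whose affinity >= alpha, then
# return the connected components, found here with a small union-find.

def alpha_cut_groups(affinity, alpha):
    """Group item indices whose pairwise affinity >= alpha (transitively)."""
    n = len(affinity)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if affinity[i][j] >= alpha:
                parent[find(i)] = find(j)   # union the two components

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Toy symmetric affinity matrix for four hypothetical regions:
A = [[1.0, 0.9, 0.2, 0.1],
     [0.9, 1.0, 0.3, 0.2],
     [0.2, 0.3, 1.0, 0.8],
     [0.1, 0.2, 0.8, 1.0]]
print(alpha_cut_groups(A, 0.7))  # → [[0, 1], [2, 3]]
```

Varying alpha yields the nested families of groupings that a Galois lattice then organizes, which is one way to read the sensitivity to economic change noted above.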
Abstract:
Scientific studies focusing specifically on references do not seem to exist. However, the utilization of references is an important practice for many companies involved in industrial marketing. The purpose of this study is to increase understanding of the utilization of references in international industrial marketing, in order to contribute to the development of a theory of reference behavior. Specifically, the study explores the modes of reference usage in industry, the factors affecting a supplier's reference behavior, and the question of how references are actually utilized. Due to the explorative nature of the study, a research design was followed in which theory and empirical studies alternated. An Exploratory Framework was developed to guide a pilot case study that resulted in Framework 1. Results of the pilot study guided an expanded literature review, which was used to develop first a Structural Framework and a Process Framework; these were combined in Framework 2. Then, the second empirical phase of the case study was conducted in the same (pilot) case company. In this phase, Decision Systems Analysis (DSA) was used as the analysis method. The DSA procedure consists of three interviewing waves: initial interviews, reinterviews, and validating interviews. Four reference decision processes were identified, described, and analyzed in the form of flowchart descriptions. The flowchart descriptions were used to explore new constructs and to develop new propositions that refine Framework 2. The quality of the study was ascertained by many actions in both empirical parts of the study. The construct validity of the study was ascertained by using multiple sources of evidence and by asking the key informant to review the pilot case report. The DSA method itself includes procedures assuring validity. Because of the choice to conduct a single case study, external validity was not pursued.
High reliability was pursued through detailed documentation and thorough reporting of evidence. It was concluded that the core of the concept of reference is a customer relationship, regardless of the concrete forms a reference might take in its utilization. Depending on various contingencies, references may have various tasks inside four roles: increasing 1) the efficiency of sales and sales management, 2) the efficiency of the business, 3) the effectiveness of marketing activities, and 4) effectiveness in establishing, maintaining, and enhancing customer relationships. Thus, references have not only external but also internal tasks. A supplier's reference behavior may be affected by many hierarchical conditions. Additionally, the empirical study showed that the supplier can utilize its references as a continuous, all-pervasive decision-making process through various practices. The process includes both individual and unstructured decision-making subprocesses. The proposed concept of reference can be used to guide a reference policy recommendable for companies for which the utilization of references is important. The significance of the study is threefold: it proposes the concept of reference, develops a framework of a supplier's reference behavior and its short-term process of utilizing references, and conceptually structures a phenomenon that is unstructured, yet important in industrial marketing, into four roles.
Abstract:
In this paper, a systematic and quantitative view is presented of the application of the theory of constraints in manufacturing. This is done by employing the operational research technique of mathematical programming. The potential of the theory of constraints in automated manufacturing is demonstrated.
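A classic theory-of-constraints exercise of this kind is the product-mix problem: maximize throughput subject to the capacity of each resource, with the binding capacity identifying the bottleneck. The sketch below is a hypothetical illustration (all profit, usage, and capacity figures are invented; a real formulation would use an LP/MIP solver rather than brute force over integer quantities):

```python
# Toy product-mix model: choose quantities of two products to maximize
# throughput subject to per-resource capacity limits, solved by exhaustive
# search over integer quantities for transparency.

def best_mix(profit, usage, capacity, max_qty=100):
    """Maximize profit[0]*x0 + profit[1]*x1 s.t. resource usage <= capacity."""
    best = (0, (0, 0))
    for x0 in range(max_qty + 1):
        for x1 in range(max_qty + 1):
            feasible = all(usage[r][0] * x0 + usage[r][1] * x1 <= capacity[r]
                           for r in range(len(capacity)))
            if feasible:
                value = profit[0] * x0 + profit[1] * x1
                best = max(best, (value, (x0, x1)))
    return best

# Two products, two resources; resource 0 is the bottleneck (constraint).
profit = (45, 60)                  # throughput per unit sold
usage = [(15, 10), (10, 10)]       # minutes per unit on each resource
capacity = (2400, 2400)            # minutes available per week
value, mix = best_mix(profit, usage, capacity, max_qty=240)
print(value, mix)  # → 14400 (0, 240)
```

The optimum loads the bottleneck resource to exactly its capacity, which is the quantitative content of the theory-of-constraints prescription to exploit the constraint.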
Abstract:
Dignity is seen as important in the health care context but is considered a controversial and complex concept. In health care, it is described as being influenced by, for example, autonomy, respect, communication, privacy, and the hospital environment. Patient dignity is related to satisfaction with care, reduced stress, better confidence in health services, enhanced patient outcomes, and a shorter stay in hospital. Stroke patients may struggle for dignity, as being dependent on other people has an impact on the patients’ self-image. In all, stroke patients are a very specific patient group and are considered vulnerable from an emotional aspect. Therefore, findings from studies of ethical problems in other patient groups cannot be transferred to stroke patients. This master’s thesis consists of two parts. The first part is a literature review of patients’ dignity in hospital care. The literature defined dignity and described factors promoting and reducing it. The results were ambiguous, and thus a clear understanding could not be formed. That was the basis for the second part of the master’s thesis, the empirical study. This part aimed to develop a theoretical construction to explore the realization of stroke patients’ dignity in hospital care. The data for the second part were collected by interviewing 16 stroke patients and analyzed using the constant comparison method of Grounded Theory. The result was ‘The Theory of Realization of Stroke Patients’ Dignity in Hospital Care’, which is described not only in this master’s thesis but also in a scientific article. The theory consists of a core category, four generic elements, and five specific types of realization. The core category emerged as ‘dignity in a new situation’. After a stroke, dignity is defined in a new way, which is influenced by the generic elements: life history, health history, individuality, and the stroke.
A stroke patient’s dignity is realized through five specific types of realization: the person-related, control-related, independence-related, social-related, and care-related dignity types. The theory points out possible special characteristics of stroke patients’ dignity in the control-related and independence-related dignity types. Before implementing the theory, the relation between the core category, the generic elements, and the specific types of realization needs to be studied further.
Abstract:
The objectives of this master’s thesis were to understand the importance of bubbling fluidized bed (BFB) conditions and to find out how digital image processing and acoustic emission technology can help in monitoring bed quality. An acoustic emission (AE) measurement system and a bottom ash camera system were evaluated for acquiring information about bed conditions. The theory part of the study describes the fundamentals of a BFB boiler and evaluates the characteristics of a bubbling bed. Causes and effects of bed material coarsening are explained, and the ways and methods to monitor the behaviour of a BFB are determined. The study introduces the operating principles of AE technology and digital image processing. The empirical part describes the experimental arrangement and results of a case study at an industrial BFB boiler. Sand consumption of the boiler was reduced by optimizing bottom ash handling and sand feeding. Furthermore, data from the AE measurement system and the bottom ash camera system were collected, and the feasibility of these two systems was evaluated. The particle size of the bottom ash and changes in the particle size distribution were monitored during the test period. Neither of the systems evaluated was ready to serve in bed quality control accurately or quickly enough. Particle size distributions according to the bottom ash camera did not correspond to the results of manual sieving, and comprehensive interpretation of the collected AE data requires much experience. Both technologies do have potential, however, and with more research and development they may enable reliable, real-time information about bed conditions to be acquired. This information could help to maintain a disturbance-free combustion process and to optimize the bottom ash handling system.
Abstract:
We have calculated the thermodynamic properties of monatomic fcc crystals from the high-temperature limit of the Helmholtz free energy. This equation of state included the static and vibrational energy components; the latter contribution was calculated to order A4 of perturbation theory, for a range of crystal volumes, using a nearest-neighbour central force model. We have calculated the lattice constant, the coefficient of volume expansion, the specific heat at constant volume and at constant pressure, the adiabatic and isothermal bulk moduli, and the Gruneisen parameter for two of the rare gas solids, Xe and Kr, and for the fcc metals Cu, Ag, Au, Al, and Pb. The Lennard-Jones and Morse potentials were each used to represent the atomic interactions for the rare gas solids, and only the Morse potential was used for the fcc metals. The thermodynamic properties obtained from the A4 equation of state with the Lennard-Jones potential seem to be in reasonable agreement with experiment for temperatures up to about three-quarters of the melting temperature; for higher temperatures, however, the results are less than satisfactory. For Xe and Kr, the thermodynamic properties calculated from the A2 equation of state with the Morse potential are qualitatively similar to the A2 results obtained with the Lennard-Jones potential; the properties obtained from the A4 equation of state, however, are in good agreement with experiment, since the contribution from the A4 terms seems to be small. The lattice contribution to the thermal properties of the fcc metals was calculated from the A4 equation of state, and these results produced a slight improvement over the properties calculated from the A2 equation of state. In order to compare the calculated specific heats and bulk moduli with experiment, the electronic contribution to the thermal properties was taken into account by using the free electron model.
We found that the results varied significantly with the value chosen for the number of free electrons per atom.
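The static-energy piece of such an equation of state is simple to state for the nearest-neighbour central-force model used here: with z = 12 nearest neighbours in fcc, the static energy per atom is (z/2)·φ(r). The sketch below evaluates this for a Lennard-Jones pair potential; the parameter values are placeholders in reduced units, not the fitted constants used in the work:

```python
# Static lattice energy of an fcc crystal in a nearest-neighbour
# central-force model with a Lennard-Jones pair potential.

def lennard_jones(r, epsilon, sigma):
    """LJ pair potential: phi(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    s6 = (sigma / r) ** 6
    return 4.0 * epsilon * (s6 * s6 - s6)

def fcc_static_energy_per_atom(r, epsilon, sigma, z=12):
    """Static energy per atom, nearest neighbours only: (z/2) * phi(r)."""
    return 0.5 * z * lennard_jones(r, epsilon, sigma)

# At the LJ minimum r = 2**(1/6)*sigma, phi = -epsilon, so E/atom = -6*epsilon.
r_min = 2 ** (1 / 6)
print(fcc_static_energy_per_atom(r_min, epsilon=1.0, sigma=1.0))  # ≈ -6.0
```

The vibrational contribution to order A2 or A4 is a separate perturbation expansion about this static minimum and is not reproduced here.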
Abstract:
We provide an algorithm that automatically derives many provable theorems in the equational theory of allegories. This was accomplished by noticing properties of an existing decision algorithm that could be extended to provide a derivation in addition to a decision certificate. We also suggest improvements and corrections to previous research in order to motivate further work on a complete derivation mechanism. The results presented here are significant for those interested in relational theories, since we essentially have a subtheory where automatic proof-generation is possible. This is also relevant to program verification since relations are well-suited to describe the behaviour of computer programs. It is likely that extensions of the theory of allegories are also decidable and possibly suitable for further expansions of the algorithm presented here.