860 results for Nonsmooth Critical Point Theory


Relevance:

30.00%

Abstract:

The molecular weight dependence of the phase separation behavior of poly(ethylene oxide) (PEO)/poly(ethylene oxide-block-dimethylsiloxane) (P(EO-b-DMS)) blends was investigated by both experimental and theoretical methods. The cloud-point curves of the PEO/P(EO-b-DMS) blends were obtained by the turbidity method. Based on the Sanchez-Lacombe lattice fluid theory (SLLFT), the adjustable parameter epsilon*(12)/k (quantifying the interaction energy between different components) was evaluated by fitting the experimental data in the phase diagrams. To calculate the spinodals, binodals, and volume changes of mixing for these blends, three modified combining rules of the scaling parameters for the block copolymer were introduced.
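The fitting step described above, adjusting epsilon*(12)/k until the calculated cloud points match the measured ones, can be sketched as a least-squares search. The toy cloud-point model and all numbers below are hypothetical stand-ins, not the SLLFT itself.

```python
import numpy as np

def cloud_point_T(phi, eps_over_k):
    # Toy cloud-point curve (NOT the SLLFT): the peak temperature scales
    # with the interaction energy; the shape is a parabola in composition.
    return eps_over_k * (1.0 - 4.0 * (phi - 0.5) ** 2)

# Synthetic "experimental" cloud points generated with eps/k = 350 K.
phi_exp = np.array([0.2, 0.35, 0.5, 0.65, 0.8])
T_exp = cloud_point_T(phi_exp, 350.0)

# Grid search over candidate eps/k values, minimising the squared misfit
# between calculated and "measured" cloud-point temperatures.
candidates = np.linspace(300.0, 400.0, 201)
misfit = [np.sum((cloud_point_T(phi_exp, e) - T_exp) ** 2) for e in candidates]
best = candidates[int(np.argmin(misfit))]
print(best)  # recovers 350.0
```

In the abstract the model evaluated at each candidate parameter would be the full SLLFT phase-diagram calculation; the search loop is the same idea.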

Relevance:

30.00%

Abstract:

A conformational analysis of 2,2'-bithiophene (BT) under the influence of an electric field (EF) constructed from point charges has been performed using semi-empirical Austin Model 1 (AM1) and Parametric Model 3 (PM3) calculations. When an EF perpendicular to the molecular conjugation chain is applied, both AM1 and PM3 calculations show an energy increase for the anti-conformation. AM1 predicts that the global minimum shifts to the syn-conformation when the EF strength exceeds a critical value, and PM3 predicts that the local minimum at the anti-conformation vanishes. This EF effect has been ascribed to the interaction between the EF and the molecular dipole moment.
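The closing attribution, that the conformational shift comes from the field-dipole interaction, amounts to the energy U = -mu · E: the conformer whose net dipole aligns with the field is stabilised. A minimal numeric illustration with hypothetical dipole and field values (not the AM1/PM3 results):

```python
import numpy as np

# Hypothetical values for illustration only: anti-BT has no net dipole
# (the ring dipoles cancel), syn-BT has a net dipole along the field axis.
E = np.array([0.0, 0.0, 5.0e9])          # applied field, V/m (hypothetical)
mu_anti = np.array([0.0, 0.0, 0.0])      # anti conformer: zero net dipole
mu_syn = np.array([0.0, 0.0, 2.5e-30])   # syn conformer: net dipole, C*m

U_anti = -np.dot(mu_anti, E)   # 0: the field does not stabilise anti
U_syn = -np.dot(mu_syn, E)     # negative: the field stabilises syn

print(U_anti, U_syn)
```

With a strong enough field this stabilisation can outweigh the zero-field preference for anti, which is the qualitative trend the AM1 calculations predict.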

Relevance:

30.00%

Abstract:

With the aid of the Sanchez-Lacombe lattice fluid theory (SLLFT), phase diagrams were calculated for the cyclohexane (CH)/polystyrene (PS) system with different molecular weights at different pressures. The experimental data are in reasonable agreement with the SLLFT calculations. The total Gibbs interaction energy g*(12) for PS of different molecular weights at different pressures was expressed through a universal relationship, g(12)* = f(12)* + (P - P0)nu*(12). Demixing curves were then calculated at fixed (near-critical) compositions of the CH/PS systems for different molecular weights. The pressures of optimum miscibility obtained from the Gibbs interaction energy are close to those measured by Wolf and coworkers. Furthermore, a reasonable explanation was given for the earlier observation of Saeki et al. that the phase separation temperatures of this system increase with increasing pressure for the low molecular weight polymer, whereas they decrease for the higher molecular weight polymers. The effects of molecular weight, pressure, temperature, and composition on the Flory-Huggins interaction parameter can be described by a general equation obtained by fitting the interaction parameters by means of the Sanchez-Lacombe lattice fluid theory.
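The pressure dependence g(12)* = f(12)* + (P - P0)nu*(12) can be made concrete with a short numeric sketch; the parameter values here are hypothetical placeholders, not the fitted CH/PS values.

```python
# Hypothetical parameters for the linear-in-pressure relationship quoted
# in the abstract: g12* = f12* + (P - P0) * nu12*.
f12_star = -15.0    # J/mol, interaction energy at the reference pressure
nu12_star = 0.02    # J/(mol*bar), pressure coefficient
P0 = 1.0            # reference pressure, bar

def g12_star(P):
    return f12_star + (P - P0) * nu12_star

# With these signs, raising pressure makes g12* less negative, i.e. the
# effective attraction weakens and the demixing curve shifts.
print(g12_star(1.0))    # -15.0 at the reference pressure
print(g12_star(501.0))  # -5.0 at 500 bar above reference
```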

Relevance:

30.00%

Abstract:

Polymer blends of poly(methyl methacrylate) (PMMA) and poly(styrene-co-acrylonitrile) (SAN) with an acrylonitrile content of about 30 wt % were prepared by solution casting and characterized by pressure-volume-temperature (PVT) dilatometry. The Sanchez-Lacombe (SL) lattice fluid theory was used to calculate the spinodals, the binodals, the Flory-Huggins (FH) interaction parameter, the enthalpy of mixing, the volume change of mixing, and the combinatorial and vacancy entropies of mixing for the PMMA/SAN system. A new volume-combining rule was used to evaluate the close-packed volume per mer, upsilon*, of the PMMA/SAN blends. The calculated results showed that the choice between the new and the original volume-combining rules had only a slight influence on the FH interaction parameter, the enthalpy of mixing, and the combinatorial entropy of mixing. Moreover, the spinodals and binodals calculated with the SL theory using the new volume-combining rule coincided with the measured data for the PMMA/SAN system, which exhibits a lower critical solution temperature, whereas those obtained with the original rule did not.

Relevance:

30.00%

Abstract:

The effects of physical ageing on the crazing of polyphenylquinoxaline (PPQ-E) films were studied. The DSC endothermic peak in the glass transition region of the samples was interpreted in terms of the cohesional entanglement theory. The free volume cavity size and free volume intensity of the samples were characterized by positron annihilation lifetime spectroscopy. The differences in free volume cavity size and free volume intensity between two samples reflect the strength and density of cohesional entanglement points. The critical strain for craze initiation and craze stability depended on the physical ageing of the samples. A preliminary interpretation of the relationship between physical ageing and crazing is given.

Relevance:

30.00%

Abstract:

Cloud-point curves reported for the polyethersulfone (PES)/phenoxy system were calculated by means of the Sanchez-Lacombe (SL) lattice fluid theory. The single adjustable parameter epsilon(12)*/k (quantifying the interaction energy between mers of the different components) can be evaluated by comparing the theoretical and experimental phase diagrams. The Flory-Huggins (FH) interaction parameters computed from the evaluated epsilon(12)*/k are approximately a linear function of volume fraction and of inverse temperature. The calculated enthalpies of mixing of the PES/phenoxy blends for different compositions are consistent with the experimental values obtained previously by Singh and Walsh [1].
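The stated form of the FH interaction parameter, approximately linear in volume fraction and inverse temperature, can be sketched as a least-squares fit of chi = a + b*phi + c/T. The data below are synthetic, not the PES/phenoxy values.

```python
import numpy as np

# Synthetic chi values generated from a known linear law (hypothetical
# coefficients), then recovered by linear least squares.
phi = np.array([0.2, 0.4, 0.6, 0.8, 0.3, 0.5, 0.7, 0.9])
T = np.array([450.0, 450.0, 450.0, 450.0, 500.0, 500.0, 500.0, 500.0])
a_true, b_true, c_true = 0.1, 0.05, 20.0
chi = a_true + b_true * phi + c_true / T

# Design matrix in the basis [1, phi, 1/T]; solve for (a, b, c).
A = np.column_stack([np.ones_like(phi), phi, 1.0 / T])
coef, *_ = np.linalg.lstsq(A, chi, rcond=None)
print(coef)  # recovers approximately [0.1, 0.05, 20.0]
```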

Relevance:

30.00%

Abstract:

The modeling of petroleum flow paths (petroleum charging) and the corresponding software development are presented in this paper, covering the principles of petroleum charging, the quantitative method, and practical modeling in two oil fields. The modeling of petroleum flow paths builds on the results of basin modeling, following the principle that petroleum migrates along the shortest path from source to trap, together with Petroleum System Dynamics (Wu Chonglong, 1998) and the concept of Petroleum Migration and Dynamic Accumulation (Zhou Donyan, Li Honhui, 2002). The simulation combines all basin parameters and accounts for the flow potential and the non-uniformity of the source and porous layers. It extends basin modeling but does not belong to it. It is a powerful simulation tool for petroleum systems: it can quantitatively express every kind of geological element of a petroleum basin and can dynamically reconstruct the geological processes with 3D graphics. As a result, it shows that petroleum flow concentrates along main paths, without invoking special theories such as deflection flow in fractures (Tian Kaiming, 1989, 1994; Zhang Fawang, Hou Xingwei, 1998) or flow potential (England, 1987). The contour map of petroleum flow shows clearly where the flow divides are, which convergence regions form the main flow paths of petroleum, and where the favorable petroleum plays lie. Given sufficient structural information, prospective traps can be identified and evaluated in terms of entrapment extent, spill point, area, oil column thickness, and so on. Making full use of basin modeling results with this new tool, the critical moment and scheme of petroleum generation and expulsion can be shown clearly. It is a powerful analysis tool for geologists.
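The shortest-path principle the modeling rests on, petroleum migrating along the cheapest route from source to trap, can be sketched with Dijkstra's algorithm over a grid of migration costs. The cost grid and the source/trap positions below are hypothetical, not data from the two oil fields.

```python
import heapq

def shortest_path_cost(cost, start, goal):
    """Cheapest 4-connected path cost from start to goal on a cost grid.

    The start cell's own cost is included; high-cost cells (e.g. sealing
    ridges) are avoided automatically by the search.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Low-cost cells form the "main path"; the column of 9s is a high-cost
# ridge the migrating petroleum flows around.
grid = [
    [1, 9, 1],
    [1, 9, 1],
    [1, 1, 1],
]
print(shortest_path_cost(grid, (0, 0), (0, 2)))  # 7: around the ridge
```

In the described software the cost field would come from the basin model (flow potential, source and porous-layer heterogeneity) rather than a hand-written grid.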

Relevance:

30.00%

Abstract:

Using approximate high-frequency asymptotic methods to solve the scalar wave equation, we obtain the eikonal equation and the transport equation. Solving the eikonal equation by the method of characteristics provides a mathematical derivation of the ray tracing equations, so the ray tracing system is fully based on the approximate high-frequency asymptotic methods. If the eikonal is complex (more precisely, real on the rays and complex off them), we can derive the Gaussian beam. This article mainly concentrates on the theory of Gaussian beams. Compared with classical ray tracing theory, the Gaussian beam method (GBM) has many advantages. First, rays are no longer required to stop at the exact position of the receivers; thus time-consuming two-point ray tracing can be avoided. Second, the GBM yields stable results in regions of the wavefield where standard ray theory fails (e.g., caustics, shadow zones, and the critical distance). Third, unlike seismograms computed by conventional ray tracing techniques, the GBM synthetic data are less influenced by minor details of the model representation. Here, I implement the kinematic and dynamic ray tracing systems and, based on them, the GBM, and I give some numerical examples. These examples show the importance and feasibility of the ray tracing system. In addition, I have studied the reflection coefficient of an inhomogeneous S electromagnetic wave at the interface of conductive media. Based on the difference between the directions of the phase-shift constant and the attenuation constant when an electromagnetic wave propagates in a conductive medium, and using the boundary conditions of the electromagnetic wave at the interface of conductive media, we derive the reflection coefficient of the inhomogeneous S electromagnetic wave and plot its curves.
The curves show that quasi-total reflection occurs when the electromagnetic wave is incident from the medium with greater conductivity onto the medium with smaller conductivity. There are two peak values, at the critical angles of the phase-shift constant and of the attenuation constant, and the reflection coefficient is smaller than 1. This conclusion differs clearly from that for the total reflection of light.
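A standard form of the kinematic ray-tracing system obtained from the eikonal equation is the pair of coupled ODEs dx/dt = v^2 p, dp/dt = -grad(v)/v, with slowness vector p = grad(T) and |p| = 1/v. The Euler integrator and velocity model below are illustrative choices, not the thesis implementation.

```python
import numpy as np

def trace_ray(x0, p0, velocity, grad_v, dt=0.001, steps=1000):
    """Integrate dx/dt = v^2 p, dp/dt = -grad(v)/v with simple Euler steps."""
    x, p = np.asarray(x0, float), np.asarray(p0, float)
    for _ in range(steps):
        v = velocity(x)
        x = x + dt * v ** 2 * p
        p = p - dt * grad_v(x) / velocity(x)
    return x

# Sanity check in a homogeneous medium (v = 2 km/s): the ray is a straight
# line and covers v * t = 2 * 1.0 = 2 km in 1 s of travel time.
v0 = 2.0
x_end = trace_ray(x0=[0.0, 0.0],
                  p0=[1.0 / v0, 0.0],          # |p| = 1/v for a valid ray
                  velocity=lambda x: v0,
                  grad_v=lambda x: np.zeros(2))
print(x_end)  # approximately [2.0, 0.0]
```

The dynamic ray-tracing system and the complex-eikonal beam construction build on exactly this kinematic backbone.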

Relevance:

30.00%

Abstract:

Learning an input-output mapping from a set of examples, of the type that many neural networks have been constructed to perform, can be regarded as synthesizing an approximation of a multi-dimensional function, that is, as solving the problem of hypersurface reconstruction. From this point of view, this form of learning is closely related to classical approximation techniques, such as generalized splines and regularization theory. This paper considers the problems of an exact representation and, in more detail, of the approximation of linear and nonlinear mappings in terms of simpler functions of fewer variables. Kolmogorov's theorem concerning the representation of functions of several variables in terms of functions of one variable turns out to be almost irrelevant in the context of networks for learning. We develop a theoretical framework for approximation based on regularization techniques that leads to a class of three-layer networks that we call Generalized Radial Basis Functions (GRBF), since they are mathematically related to the well-known Radial Basis Functions, mainly used for strict interpolation tasks. GRBF networks are not only equivalent to generalized splines, but are also closely related to pattern recognition methods such as Parzen windows and potential functions and to several neural network algorithms, such as Kanerva's associative memory, backpropagation and Kohonen's topology preserving map. They also have an interesting interpretation in terms of prototypes that are synthesized and optimally combined during the learning stage. The paper introduces several extensions and applications of the technique and discusses intriguing analogies with neurobiological data.

Relevance:

30.00%

Abstract:

Learning an input-output mapping from a set of examples can be regarded as synthesizing an approximation of a multi-dimensional function. From this point of view, this form of learning is closely related to regularization theory. In this note, we extend the theory by introducing ways of dealing with two aspects of learning: learning in the presence of unreliable examples and learning from positive and negative examples. The first extension corresponds to dealing with outliers among the sparse data. The second one corresponds to exploiting information about points or regions in the range of the function that are forbidden.
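The first extension, learning in the presence of unreliable examples, can be illustrated with iteratively reweighted least squares: examples with large residuals are progressively down-weighted as outliers. The weighting rule and data below are illustrative, not the note's exact formulation.

```python
import numpy as np

# Clean data on the line y = 2x + 1, plus one grossly unreliable example.
x = np.linspace(0, 1, 20)
y = 2.0 * x + 1.0
y[5] += 10.0  # the outlier

A = np.column_stack([x, np.ones_like(x)])
w = np.ones_like(y)  # start by trusting every example equally
for _ in range(10):
    # Weighted least squares with the current trust weights.
    coef, *_ = np.linalg.lstsq(A * w[:, None], y * w, rcond=None)
    resid = np.abs(A @ coef - y)
    w = 1.0 / (1.0 + resid ** 2)  # large residual -> low trust

print(coef)  # close to the true slope 2.0 and intercept 1.0
```

After a few iterations the outlier's weight collapses toward zero and the fit recovers the underlying line, which is the effect a robust regularization functional achieves.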

Relevance:

30.00%

Abstract:

This report describes the implementation of a theory of edge detection, proposed by Marr and Hildreth (1979). According to this theory, the image is first processed independently through a set of different-size filters whose shape is the Laplacian of a Gaussian, ∇²G. Zero-crossings in the output of these filters mark the positions of intensity changes at different resolutions. Information about these zero-crossings is then used for deriving a full symbolic description of changes of intensity in the image, called the raw primal sketch. The theory is closely tied to early processing in the human visual system. In this report, we first examine the critical properties of the initial filters used in the edge detection process, from both a theoretical and a practical standpoint. The implementation is then used as a test bed for exploring aspects of the human visual system, in particular acuity and hyperacuity. Finally, we present some preliminary results concerning the relationship between zero-crossings detected at different resolutions, and some observations relevant to the process by which the human visual system integrates descriptions of intensity changes obtained at different resolutions.
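The pipeline described above, filtering with a Laplacian-of-Gaussian and marking zero-crossings, can be sketched directly. The test image, the single filter width, and the zero-suppression threshold are illustrative choices; the full scheme uses a bank of filter sizes.

```python
import numpy as np

def log_kernel(sigma, half):
    # Laplacian-of-Gaussian shape, made zero-sum so that flat image
    # regions produce exactly zero response.
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x ** 2 + y ** 2
    k = (r2 - 2 * sigma ** 2) * np.exp(-r2 / (2 * sigma ** 2))
    return k - k.mean()

def convolve2d(img, k):
    # Direct (slow) correlation with edge padding; fine for a sketch.
    half = k.shape[0] // 2
    pad = np.pad(img, half, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(pad[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

# A vertical step edge between columns 7 and 8.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
response = convolve2d(img, log_kernel(sigma=1.0, half=3))
response[np.abs(response) < 1e-6] = 0.0  # suppress float residue in flat regions

# A zero-crossing is a sign change between horizontally adjacent pixels.
crossings = (np.sign(response[:, :-1]) * np.sign(response[:, 1:])) < 0
print(np.unique(np.where(crossings)[1]))  # [7]: the crossing straddles the edge
```

Running the same image through kernels of several sigmas and comparing the resulting crossing maps is the multi-resolution step the report's experiments are built on.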

Relevance:

30.00%

Abstract:

The latest buzz phrase to enter the world of design research is "Design Thinking". But is this anything new, and does it really have any practical or theoretical relevance to the design world? Many sceptics believe the term has more to do with business strategy and little to do with the complex process of designing products, services and systems. Moreover, many view the term as misleading and a cheap attempt to piggyback the world of business management onto design. This paper asks: is design thinking anything new? Several authors have explicitly or implicitly articulated the term "Design Thinking" before, such as Peter Rowe's seminal book "Design Thinking" [1], first published in 1987, and Herbert Simon's "The Sciences of the Artificial" [2], first published in 1969. In Tim Brown's "Change by Design" [3], design thinking is thought of as a system of three overlapping spaces rather than a sequence of orderly steps: inspiration, the problem or opportunity that motivates the search for solutions; ideation, the process of generating, developing and testing ideas; and implementation, the path that leads from the design studio, lab and factory to the market. This paper examines and critically analyses the tenets of this new design thinking manifesto against three case studies of modern design practice. As such, the paper compares design thinking theory with the reality of design in practice.

Relevance:

30.00%

Abstract:

Aim and objectives: To examine how nurses collect and use cues from respiratory assessment to inform their decisions as they wean patients from ventilatory support. Background: Prompt and accurate identification of the patient's ability to sustain a reduction of ventilatory support has the potential to increase the likelihood of successful weaning. Nurses' information processing during weaning from mechanical ventilation has not been well described. Design: A descriptive ethnographic study exploring critical care nurses' decision-making processes when weaning mechanically ventilated patients from ventilatory support in the real setting. Methods: Novice and expert Scottish and Greek nurses from two tertiary intensive care units were observed in the real practice of weaning from mechanical ventilation and were invited to participate in reflective interviews near the end of their shift. Data were analysed thematically using concept maps based on information processing theory. Ethics approval and informed consent were obtained. Results: Scottish and Greek critical care nurses acquired patient-centred objective physiological and subjective information from respiratory assessment and previous knowledge of the patient, which they clustered around seven concepts descriptive of the patient's ability to wean. Less experienced nurses required more encounters with cues to attain the concepts with certainty. Subjective criteria were intuitively derived from previous knowledge of patients' responses to changes of ventilatory support. All nurses used focusing decision-making strategies to select and group cues in order to categorise information with certainty and reduce the mental strain of the decision task. Conclusions: Nurses used patient-centred information to make a judgement about the patients' ability to wean. Decision-making strategies that involve categorisation of patient-centred information can be taught in bespoke educational programmes for mechanical ventilation and weaning.
Relevance to clinical practice: Advanced clinical reasoning skills and accurate detection of cues in respiratory assessment by critical care nurses will ensure optimum patient management in weaning from mechanical ventilation.

Relevance:

30.00%

Abstract:

Gibbs, N., Getting Constitutional Theory into Proportion: A Matter of Interpretation?, Oxford Journal of Legal Studies, 27 (1), 175-191. RAE2008

Relevance:

30.00%

Abstract:

Booth, Ken, Critical Security Studies and World Politics (Boulder, CO: Lynne Rienner Publishers, 2005), pp. ix+321. RAE2008