905 results for "Two-point boundary value problems"


Relevance: 40.00%

Abstract:

Home Economics Classrooms as Part of Developing the Environment: Housing Activities and Curriculums Defining Change

The aim of the research project was to develop home economics classrooms into flexible and versatile learning environments where household activities can be practiced according to the curriculum in different social networking situations. The research is based on the socio-cultural approach, in which the functionality of the learning environment is studied specifically from an interactive learning viewpoint. The social framework is a natural starting point in home economics teaching because of the group work in classrooms; the social nature of learning thus becomes a significant part of the learning process. The study considers learning as experience-based, holistic and context-bound. The learning environment, i.e. the home economics classrooms and the material tools in them, plays a significant role in developing students' skills to manage everyday life.

The first research task was to analyze the historical development of household activities. The second research task was to develop and test criteria for functional home economics classrooms, for planning both the learning environment and the students' activities during lessons. The third research task was to evaluate how different professionals (commissioners, planners and teachers) use the criteria as a tool. The research consists of three parts. The first contains a historical analysis of how social changes have created tension between traditional household classrooms and new activities in homes. The historical analysis is based on housing research, regulations and instructions. For this purpose a new theoretical concept, the tension arch, was introduced. This helped in recognizing and solving problems in students' activities and in developing innovations. The functionality criteria for home economics classrooms were developed based on this concept. They include technical (health, safety and technical factors), functional (ergonomic, ecological, aesthetic and economic factors) and behavioural (cooperation and interaction skills and communication technologies) criteria.

The second part discusses how the criteria were used in renovating school buildings. Empirical data was collected from two separate schools where the activities during lessons were recorded both before and after the classrooms were renovated. An analysis of both environments based on video recordings was conducted. The previously created criteria were applied, and problematic points in functionality were identified, particularly from a social interactive viewpoint. The results show that the criteria were used as a planning tool. The criteria facilitated layout and equipment solutions that support both the curriculum and learning in home economics classrooms, taking into consideration cooperation and interaction in the classroom. With the help of the criteria, the home economics classrooms changed from closed and complicated spaces into integrated and open spaces where the flexibility and versatility of the learning environment were emphasized. The teacher became a facilitator and counselor instead of a classroom controller.

The third part analyses the discussions in planning meetings. These were recorded, and an analysis was conducted of how the criteria and research results were used in the planning process for new home economics classrooms. The planning process was multivoiced, i.e. actors from different interest groups took part. All the previously created criteria (technical, functional and behavioural) emerged in the discussions, and some of them were used as planning tools. Planning meetings turned into planning studios where boundaries between organizations were ignored and the physical learning environments were developed together with experts. The planning studios resulted in multivoiced planning, which showed characteristics of collaborative and participatory planning as well as producing common knowledge and shared expertise.

KEY WORDS: physical learning environment, socio-cultural approach, tension arch, boundary crossing, collaborative planning.

Relevance: 40.00%

Abstract:

The effect of surface mass transfer velocities having normal, principal and transverse direction components ("vectored" suction and injection) on the steady, laminar, compressible boundary layer at a three-dimensional stagnation point has been investigated for both nodal and saddle points of attachment. The similarity solutions of the boundary layer equations were obtained numerically by the method of parametric differentiation. The principal- and transverse-direction surface mass transfer velocities significantly affect the skin friction (in both the principal and transverse directions) and the heat transfer. The inadequacy of assuming a linear viscosity-temperature relation at low wall temperatures is also shown.

Relevance: 40.00%

Abstract:

A new formulation is suggested for the fixed end-point regulator problem, which, in conjunction with the recently developed integration-free algorithms, provides an efficient means of obtaining numerical solutions to such problems.
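
The abstract does not spell out the formulation, but the fixed end-point regulator leads to a linear two-point boundary value problem in the state and costate. Below is a minimal, hedged sketch (not the paper's integration-free algorithm) that solves such a fixed-endpoint LQR problem for an assumed double-integrator plant with scipy.integrate.solve_bvp; all matrices and horizons are illustrative assumptions.

# Hedged sketch: fixed end-point LQR as a two-point BVP (assumed example, not the paper's method).
# Dynamics x' = A x + B u, cost (1/2) int u^T R u dt, optimal u = -R^{-1} B^T p,
# costate p' = -A^T p, with x(0) = x0 and x(T) = xT both prescribed.
import numpy as np
from scipy.integrate import solve_bvp

A = np.array([[0.0, 1.0], [0.0, 0.0]])                     # assumed double-integrator plant
B = np.array([[0.0], [1.0]])
Rinv = np.array([[1.0]])                                   # R = I for simplicity
x0, xT, T = np.array([0.0, 0.0]), np.array([1.0, 0.0]), 1.0

def odes(t, y):
    x, p = y[:2], y[2:]
    u = -Rinv @ B.T @ p                                    # stationarity condition
    return np.vstack([A @ x + B @ u, -A.T @ p])

def bc(ya, yb):
    return np.concatenate([ya[:2] - x0, yb[:2] - xT])      # both end points fixed

t = np.linspace(0.0, T, 50)
sol = solve_bvp(odes, bc, t, np.zeros((4, t.size)))
print("terminal state:", sol.sol(T)[:2])                   # should be close to xT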

Relevance: 40.00%

Abstract:

The long-wavelength hydrodynamics of the Renn-Lubensky twist grain boundary phase with grain boundary angle 2πα, α irrational, is studied. We find three propagating sound modes, with two of the three sound speeds vanishing for propagation orthogonal to the grains, and one vanishing for propagation parallel to the grains as well. In addition, we find that the viscosities η_1, η_2, η_4, and η_5 diverge like 1/|ω| as the frequency ω → 0, with the divergent parts Δη_i satisfying Δη_1 Δη_4 = (Δη_5)² exactly. Our results should also apply to the predicted decoupled lamellar phase.

Relevance: 40.00%

Abstract:

Load-deflection curves for a notched beam under three-point loading are determined using the Fictitious Crack Model (FCM) and the Blunt Crack Model (BCM). Two values of the fracture energy GF are used in this analysis: (i) GF obtained from the size effect law and (ii) GF obtained independently of the size effect. The predicted load-deflection diagrams are compared with the experimental ones obtained for the beams tested by Jenq and Shah. In addition, the values of maximum load (Pmax) obtained from the analyses are compared with the experimental ones for the beams tested by Jenq and Shah and by Bažant and Pfeiffer. The results indicate that the descending portion of the load-deflection curve is very sensitive to the GF value used.
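
The abstract refers to GF determined from the size effect law; for readers unfamiliar with that procedure, a hedged sketch of the usual linear-regression form of Bažant's size effect law is given below. The test data are made up for illustration (they are not the Jenq-Shah or Bažant-Pfeiffer measurements), and the final step from (B*ft, D0) to GF, which requires the LEFM energy release function of the specimen geometry, is only indicated in a comment.

# Hedged sketch: fitting Bazant's size effect law sigma_N = B*ft / sqrt(1 + D/D0)
# by linear regression, Y = (1/sigma_N)^2 = C + A*D, so D0 = C/A and B*ft = 1/sqrt(C).
import numpy as np

D = np.array([38.1, 76.2, 152.4, 304.8])        # specimen depths (mm), assumed values
sigma_N = np.array([4.1, 3.6, 3.0, 2.4])        # nominal strengths (MPa), assumed values

Y = 1.0 / sigma_N**2
A, C = np.polyfit(D, Y, 1)                      # slope A, intercept C
D0 = C / A
Bft = 1.0 / np.sqrt(C)
print(f"D0 = {D0:.1f} mm, B*ft = {Bft:.2f} MPa")
# GF determined from the size effect then follows from (B*ft)^2 * D0 together with the
# dimensionless LEFM energy release function g(alpha0) of the three-point-bend geometry
# and the elastic modulus (not reproduced here).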

Relevance: 40.00%

Abstract:

Our study concerns an important current problem, that of diffusion of information in social networks. This problem has received significant attention from the Internet research community in recent times, driven by many potential applications such as viral marketing and sales promotions. In this paper, we focus on the target set selection problem, which involves discovering a small subset of influential players in a given social network to perform a certain task of information diffusion. The target set selection problem manifests in two forms: 1) the top-k nodes problem and 2) the lambda-coverage problem. In the top-k nodes problem, we are required to find a set of k key nodes that maximizes the number of nodes being influenced in the network. The lambda-coverage problem is concerned with finding a minimal-size set of key nodes that can influence a given percentage lambda of the nodes in the entire network. We propose a new way of solving these problems using the concept of the Shapley value, a well-known solution concept in cooperative game theory. Our approach leads to algorithms which we call the ShaPley value-based Influential Nodes (SPIN) algorithms for solving the top-k nodes problem and the lambda-coverage problem. We compare the performance of the proposed SPIN algorithms with well-known algorithms in the literature. Through extensive experimentation on four synthetically generated random graphs and six real-world data sets (Celegans, Jazz, the NIPS coauthorship data set, the Netscience data set, the High-Energy Physics data set, and the Political Books data set), we show that the proposed SPIN approach is more powerful and computationally efficient.

Note to Practitioners: In recent times, social networks have received a high level of attention due to their proven ability to improve the performance of web search, recommendations in collaborative filtering systems, the spreading of a technology in the market using viral marketing techniques, etc. It is well known that the interpersonal relationships (or ties or links) between individuals cause change or improvement in the social system, because the decisions made by individuals are influenced heavily by the behavior of their neighbors. An interesting and key problem in social networks is to discover the most influential nodes in the social network, which can influence other nodes in a strong and deep way. This problem is called the target set selection problem and has two variants: 1) the top-k nodes problem, where we are required to identify a set of k influential nodes that maximizes the number of nodes being influenced in the network, and 2) the lambda-coverage problem, which involves finding a minimum-size set of influential nodes that can influence a given percentage lambda of the nodes in the entire network. There are many existing algorithms in the literature for solving these problems. In this paper, we propose a new algorithm which is based on a novel interpretation of information diffusion in a social network as a cooperative game. Using this analogy, we develop an algorithm based on the Shapley value of the underlying cooperative game. The proposed algorithm outperforms the existing algorithms in terms of generality, computational complexity, or both. Our results are validated through extensive experimentation on both synthetically generated and real-world data sets.
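
The abstract does not give the SPIN algorithms in detail, so the following is only a hedged illustration of the underlying idea: rank nodes by an approximate Shapley value computed over random permutations, here with a simple one-hop coverage payoff standing in for the paper's diffusion-based characteristic function. The toy graph and the coverage payoff are assumptions made purely for illustration.

# Hedged sketch of Shapley-value-based node ranking (illustrative only; the actual
# SPIN algorithms use a diffusion-based characteristic function not reproduced here).
import random
from collections import defaultdict

# Assumed toy undirected graph: node -> set of neighbours.
graph = {
    0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {1, 2, 4},
    4: {3, 5}, 5: {4, 6}, 6: {5},
}

def coverage(nodes):
    """Payoff of a coalition: number of nodes it reaches within one hop."""
    covered = set(nodes)
    for n in nodes:
        covered |= graph[n]
    return len(covered)

def shapley(num_permutations=2000, seed=0):
    rng = random.Random(seed)
    players = list(graph)
    value = defaultdict(float)
    for _ in range(num_permutations):
        rng.shuffle(players)
        coalition, prev = [], 0
        for p in players:
            coalition.append(p)
            cur = coverage(coalition)
            value[p] += cur - prev          # marginal contribution of p in this ordering
            prev = cur
    return {p: v / num_permutations for p, v in value.items()}

ranking = sorted(shapley().items(), key=lambda kv: -kv[1])
print("top-3 influential nodes:", [p for p, _ in ranking[:3]])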

Relevance: 40.00%

Abstract:

We present an explicit solution of the problem of two coupled spin-1/2 impurities interacting with a band of conduction electrons. We obtain an exact effective bosonized Hamiltonian, which is then treated by two different methods (a low-energy theory and a mean-field approach). Scale invariance is explicitly shown at the quantum critical point. The staggered susceptibility behaves like ln(T_K/T) at low T, whereas the magnetic susceptibility and ⟨S1·S2⟩ are well behaved at the transition. The divergence of C(T)/T when approaching the transition point is also studied. The non-Fermi-liquid (actually marginal-Fermi-liquid) critical point is shown to arise because of the existence of anomalous correlations, which lead to degeneracies between bosonic and fermionic states of the system. The methods developed in this paper are of interest for studying more physically relevant models, for instance, for the high-T_c cuprates.

Relevance: 40.00%

Abstract:

A hybrid technique for modeling two-dimensional fracture problems, which makes use of the displacement discontinuity method and the direct boundary element method, is presented. The direct boundary element method is used to model the finite domain of the body, while displacement discontinuity elements are used to represent the cracks. Thus the advantages of the component methods are effectively combined. The method has been implemented in a computer program, and numerical results which show the accuracy of the present method are presented. The cases of bodies containing edge cracks as well as multiple cracks are considered. A direct method and an iterative technique are described. The present hybrid method is most suitable for modeling problems involving crack propagation.

Relevance: 40.00%

Abstract:

Indian logic has a long history. It roughly covers the domains of two of the six schools (darsanas) of Indian philosophy, namely Nyaya and Vaisesika. The generally accepted definition of Indian logic over the ages is the science which ascertains valid knowledge either by means of the six senses or by means of the five members of the syllogism; in other words, perception and inference constitute the subject matter of logic. The science of logic evolved in India through three ages, the ancient, the medieval and the modern, spanning almost thirty centuries. Advances in Computer Science, and in Artificial Intelligence in particular, have got researchers in these areas interested in the basic problems of language, logic and cognition over the past three decades. In the 1980s, Artificial Intelligence evolved into knowledge-based and intelligent system design, and the knowledge base and inference engine became standard subsystems of an intelligent system. One of the important issues in the design of such systems is knowledge acquisition from humans who are experts in a branch of learning (such as medicine or law) and the transfer of that knowledge to a computing system. The second important issue in such systems is the validation of the knowledge base of the system, i.e. ensuring that the knowledge is complete and consistent. It is in this context that a comparative study of Indian logic with recent theories of logic, language and knowledge engineering will help the computer scientist understand the deeper implications of the terms and concepts he is currently using and attempting to develop.
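
As a purely illustrative aside on the knowledge-base validation issue mentioned above (the abstract does not describe any such procedure), the sketch below checks a toy propositional rule base for one simple kind of inconsistency: rules that derive both a proposition and its negation from the same premise. The rule base and the check are assumptions made only for illustration.

# Hedged illustration: detecting directly contradictory rules in a toy rule base.
# A rule is a (premise, conclusion) pair of literals; "~p" denotes the negation of "p".
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def contradictions(rules):
    """Return pairs of rules that conclude a literal and its negation from the same premise."""
    found = []
    for i, (p1, c1) in enumerate(rules):
        for p2, c2 in rules[i + 1:]:
            if p1 == p2 and c2 == negate(c1):
                found.append(((p1, c1), (p2, c2)))
    return found

rules = [("fever", "infection"), ("fever", "~infection"), ("cough", "cold")]
print(contradictions(rules))   # flags the two conflicting 'fever' rules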

Relevance: 40.00%

Abstract:

Steady two-dimensional and axisymmetric compressible nonsimilar laminar boundary-layer flows with non-uniform slot injection (or suction) and non-uniform wall enthalpy have been studied from the starting point of the streamwise co-ordinate to the exact point of separation. The effect of different free stream Mach numbers has also been considered. The finite discontinuities arising at the leading and trailing edges of the slot for uniform slot injection (suction) or wall enthalpy are removed by choosing appropriate non-uniform slot injection (suction) or wall enthalpy distributions. The difficulties arising at the starting point of the streamwise co-ordinate, at the edges of the slot and at the point of separation are overcome by applying a quasilinear implicit finite-difference scheme with an appropriately finer step size along the streamwise direction. It is observed that non-uniform slot injection moves the point of separation downstream, whereas non-uniform slot suction has the reverse effect. An increase of the Mach number shifts the point of separation upstream due to the adverse pressure gradient. An increase of the total enthalpy at the wall causes separation to occur earlier, while cooling delays it. The non-uniform total enthalpy at the wall (i.e., the cooling or heating of the wall in a slot) along the streamwise co-ordinate has very little effect on the skin friction and thus on the point of separation.
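
The abstract notes that the discontinuities at the slot edges are removed by choosing a non-uniform injection distribution; the exact distribution used in the paper is not given here, so the sketch below merely contrasts a uniform (step) slot profile with one smooth choice, a sine-squared distribution that vanishes at both slot edges while injecting the same total mass. The slot location and injection level are illustrative assumptions.

# Hedged illustration: uniform vs. smooth non-uniform slot injection profiles.
# The sine-squared form is one common smooth choice; it is an assumption here,
# not necessarily the distribution used in the paper.
import numpy as np

xi1, xi2, A = 0.2, 0.4, 1.0          # assumed slot edges and uniform injection level
xi = np.linspace(0.0, 1.0, 1001)
in_slot = (xi >= xi1) & (xi <= xi2)

uniform = np.where(in_slot, A, 0.0)  # discontinuous at xi1 and xi2
smooth = np.where(in_slot,
                  2.0 * A * np.sin(np.pi * (xi - xi1) / (xi2 - xi1)) ** 2,
                  0.0)               # vanishes at both slot edges

# Both profiles inject the same total mass over the slot:
print(np.trapz(uniform, xi), np.trapz(smooth, xi))  # ~0.2 and ~0.2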

Relevance: 40.00%

Abstract:

The influence of temperature-dependent viscosity and Prandtl number on unsteady, laminar, nonsimilar forced convection flow over two-dimensional and axisymmetric bodies has been examined, where the unsteadiness and/or the nonsimilarity are due to the free stream velocity, mass transfer, and transverse curvature. The partial differential equations governing the flow, which involve three independent variables, have been solved numerically using an implicit finite-difference scheme along with a quasilinearization technique. It is found that both the skin friction and the heat transfer respond strongly to the unsteady free stream velocity distributions. The unsteadiness and injection cause the location of zero skin friction to move upstream, whereas the effect of variable viscosity and Prandtl number is to move it downstream. The heat transfer is found to depend strongly on viscous dissipation, but the skin friction is little affected by it. In general, the results for variable fluid properties differ significantly from those for constant fluid properties.
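
The abstract mentions an implicit finite-difference scheme combined with quasilinearization; since the governing equations themselves are not reproduced here, the sketch below demonstrates the same idea on a much simpler assumed model problem, the Bratu two-point boundary value problem y'' + exp(y) = 0, y(0) = y(1) = 0: the nonlinear term is linearized about the current iterate and the resulting linear system is solved implicitly at each iteration.

# Hedged sketch: quasilinearization + implicit central differences on a model
# two-point BVP, y'' + exp(y) = 0, y(0) = y(1) = 0 (assumed illustration only).
import numpy as np

N = 101
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]
y = np.zeros(N)                              # initial iterate

for it in range(20):
    # Linearize exp(y) about the current iterate: exp(yk) * (1 + y_new - yk).
    ey = np.exp(y)
    J = np.zeros((N, N))
    r = np.zeros(N)
    J[0, 0] = J[-1, -1] = 1.0                # Dirichlet boundary conditions
    for i in range(1, N - 1):
        J[i, i - 1] = J[i, i + 1] = 1.0 / h**2
        J[i, i] = -2.0 / h**2 + ey[i]
        r[i] = -ey[i] * (1.0 - y[i])
    y_new = np.linalg.solve(J, r)            # implicit solve at this iteration
    delta = np.max(np.abs(y_new - y))
    y = y_new
    if delta < 1e-10:
        break

print("max deflection:", y.max())            # ~0.14 on the lower Bratu branch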

Relevance: 40.00%

Abstract:

An attempt has been made here to study the sensitivity of the mean and the turbulence structure of the monsoon trough boundary layer to the choice of the constants in the dissipation equation for two stations, Delhi and Calcutta, using a one-dimensional atmospheric boundary layer model with e-epsilon turbulence closure. An analytical discussion of the problems associated with the constants of the dissipation equation is presented. It is shown that the choice of the constants in the dissipation equation is quite crucial and that the turbulence structure is very sensitive to these constants. The modification of the dissipation equation adopted by earlier studies, that is, approximating the TKE generation (due to shear and buoyancy production) in the epsilon-equation by max(shear production, shear + buoyancy production), can be avoided by a suitable choice of the constants suggested here. The observed turbulence structure is better simulated with these constants. The turbulence structure simulated with the constants recommended by Aupoix et al. (1989) (which are interactive in time) for the monsoon region is qualitatively similar to the simulation obtained with the constants suggested here, implying that no universal constants exist to regulate the dissipation rate. Simulations of the mean structure show little sensitivity to the choice between the e-l and e-epsilon closures. However, the turbulence structure is simulated far better with the e-epsilon closure than with the e-l model. The model simulations of temperature profiles compare quite well with the observations whenever the boundary layer is well mixed (neutral) or unstable; however, the models are not able to simulate the nocturnal (stable) boundary layer temperature profiles. Moisture profiles are simulated reasonably well. The one-dimensional models do not capture the observed wind variations satisfactorily.
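
For reference, the dissipation equation that these constants enter has, in a one-dimensional column model, the familiar form below. This is only background: it is the standard high-Reynolds-number form with the widely used Launder-Spalding values, and it is not necessarily the exact form or the constant set recommended in the paper.

\frac{\partial \varepsilon}{\partial t} =
  \frac{\partial}{\partial z}\!\left(\frac{\nu_t}{\sigma_\varepsilon}\,\frac{\partial \varepsilon}{\partial z}\right)
  + C_{\varepsilon 1}\,\frac{\varepsilon}{e}\,(P_s + P_b)
  - C_{\varepsilon 2}\,\frac{\varepsilon^{2}}{e},
\qquad C_{\varepsilon 1} \approx 1.44,\; C_{\varepsilon 2} \approx 1.92,\; \sigma_\varepsilon \approx 1.3,

where e is the turbulent kinetic energy and P_s and P_b are the shear and buoyancy production terms. The modification referred to in the abstract replaces (P_s + P_b) in the generation term by max(P_s, P_s + P_b).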

Relevance: 40.00%

Abstract:

An important tool in signal processing is the use of eigenvalue and singular value decompositions for extracting information from time-series/sensor array data. These tools are used in the so-called subspace methods that underlie solutions to the harmonic retrieval problem in time series and the directions-of-arrival (DOA) estimation problem in array processing. The subspace methods require the knowledge of eigenvectors of the underlying covariance matrix to estimate the parameters of interest. Eigenstructure estimation in signal processing has two important classes: (i) estimating the eigenstructure of the given covariance matrix and (ii) updating the eigenstructure estimates given the current estimate and new data. In this paper, we survey some algorithms for both these classes useful for harmonic retrieval and DOA estimation problems. We begin by surveying key results in the literature and then describe, in some detail, energy function minimization approaches that underlie a class of feedback neural networks. Our approaches estimate some or all of the eigenvectors corresponding to the repeated minimum eigenvalue and also multiple orthogonal eigenvectors corresponding to the ordered eigenvalues of the covariance matrix. Our presentation includes some supporting analysis and simulation results. We may point out here that eigensubspace estimation is a vast area and all aspects of this cannot be fully covered in a single paper. (C) 1995 Academic Press, Inc.
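
As background to the subspace methods the abstract surveys, the following is a minimal batch sketch of the classical eigendecomposition route to DOA estimation, a MUSIC-style pseudospectrum on an assumed uniform linear array. It uses a plain eigendecomposition of the sample covariance rather than the adaptive, neural-network-based estimators developed in the paper, and all array and source parameters are illustrative assumptions.

# Hedged sketch: batch subspace (MUSIC-style) DOA estimation on an assumed 8-element
# uniform linear array with two sources; textbook eigendecomposition route only.
import numpy as np

rng = np.random.default_rng(0)
M, N, K, d = 8, 400, 2, 0.5                        # sensors, snapshots, sources, spacing (wavelengths)
true_deg = np.array([-20.0, 30.0])                 # assumed source directions

def steering(deg):
    theta = np.deg2rad(np.atleast_1d(deg))
    return np.exp(-2j * np.pi * d * np.arange(M)[:, None] * np.sin(theta)[None, :])

A = steering(true_deg)
S = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = A @ S + noise

R = X @ X.conj().T / N                             # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(R)               # eigenvalues in ascending order
En = eigvecs[:, : M - K]                           # noise subspace (smallest eigenvalues)

grid = np.linspace(-90.0, 90.0, 1801)
G = steering(grid)
pseudo = 1.0 / np.sum(np.abs(En.conj().T @ G) ** 2, axis=0)   # MUSIC pseudospectrum

peaks = np.where((pseudo[1:-1] > pseudo[:-2]) & (pseudo[1:-1] > pseudo[2:]))[0] + 1
top = peaks[np.argsort(pseudo[peaks])[-K:]]
print("estimated DOAs (deg):", np.sort(grid[top]))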