970 results for Practical problems


Relevance: 60.00%

Abstract:

Science and the Scientist's Social Responsibility. Joseph Ben-David's, Roger Sperry's and Knut Erik Tranøy's Views of Science and the Scientist's Social Responsibility. The aim of the study was to investigate whether or not there is any connection between the Jewish sociologist Joseph Ben-David's, the American neuroscientist Roger Sperry's and the Norwegian philosopher Knut Erik Tranøy's views of science and their views of the scientist's social responsibility. The sources of information were their writings concerning this topic. Ben-David has a classical view of science. He thinks that the Mertonian norms of scientific activity, first formulated in 1942, are still valid in modern science. With the help of these norms Ben-David defends the view that science is morally neutral. Ben-David thinks that a scientist has only a limited social responsibility: a scientist reports new results but is not responsible for how those results are applied. In any case, Ben-David's ideas are no longer valid. Sperry has a scientistic view of science. According to Sperry, science is the source of moral norms and also the best guide for moral action: the methods of the natural sciences "show" how to solve moral problems. A scientist's personal views of science and social responsibility are not important. However, Sperry's view is very problematic on the ethical side. Tranøy stresses the scientist's social responsibility. A scientist shares common norms with the society from which he or she comes. This is why a scientist has the right, and also the responsibility, to discuss social and ethical questions between science and society. Tranøy's view has some ethical and practical problems, but it is valid in principle. Finally, Ben-David's, Sperry's and Tranøy's views of science and of the scientist's social responsibility are connected: each view of science corresponds to a certain view of the scientist's social responsibility. The result of this study is that Ben-David's, Sperry's and Tranøy's views of science have an ethical starting point as their fundamental presupposition, which includes certain views of scientific knowledge, of the good and of the scientist's ethical responsibilities. The connection between their views of science and their views of the scientist's social responsibility means that their views of epistemology, meta-ethics and the scientist's ethical responsibilities are connected to their views of the scientist's social responsibility. The results of this study can help the scientific community to organise the scientist's social responsibility and to deepen the conversation concerning it.

Relevance: 60.00%

Abstract:

A numerical method is suggested for the separation of stresses in photo-orthotropic elasticity using a numerical solution of the compatibility equation for the orthotropic case. The compatibility equation is written in terms of a stress parameter S analogous to the sum of the principal stresses in the two-dimensional isotropic case. The solution of this equation provides a relation between the normal stresses. The photoelastic data give the shear stress and another relation between the two normal stresses. The accuracy of the numerical method and its application to practical problems are illustrated with examples.
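For illustration only, the sketch below works through the isotropic analogue of the separation procedure described above (the orthotropic compatibility equation involves material constants not given here, so it is not reproduced): a Laplace-type equation for the stress parameter S is solved by Jacobi iteration on a square grid, and the normal stresses are then recovered from S together with a photoelastically obtained difference D = sigma_x - sigma_y. The boundary values of S and the field D are hypothetical placeholders for measured data.

```python
import numpy as np

# Minimal sketch (isotropic analogue): Jacobi solution of the Laplace-type
# equation for the stress parameter S, followed by separation of the normal
# stresses using a photoelastic difference D = sigma_x - sigma_y.
# Boundary values of S and the field D are assumed placeholder data.
n = 51                                                        # grid points per side
S = np.zeros((n, n))
S[0, :], S[-1, :], S[:, 0], S[:, -1] = 10.0, 0.0, 5.0, 5.0    # assumed boundary values of S

for _ in range(5000):                                         # Jacobi iteration for Laplace(S) = 0
    S_new = S.copy()
    S_new[1:-1, 1:-1] = 0.25 * (S[:-2, 1:-1] + S[2:, 1:-1] +
                                S[1:-1, :-2] + S[1:-1, 2:])
    if np.max(np.abs(S_new - S)) < 1e-8:
        S = S_new
        break
    S = S_new

D = np.full((n, n), 2.0)          # sigma_x - sigma_y from the photoelastic data (assumed constant here)
sigma_x = 0.5 * (S + D)           # separation of the two normal stresses
sigma_y = 0.5 * (S - D)
```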

Relevance: 60.00%

Abstract:

Women and Marital Breakdown in South India: Reconstructing Homes, Bonds and Persons is an ethnographic analysis of the situation of divorced and separated women and their families in the South Indian city of Bangalore. The study is based on 16 months of anthropological fieldwork, i.e., participant observation and life history interviews among 50 divorced and separated women from different socio-religious backgrounds in their homes, in women's organisations and in the Family Court. The study follows the divorced and separated women from their natal homes to their affinal homes, through homelessness and legal battles, to their reconstructed natal, affinal or single homes, in order to find out what it means to be a person within hierarchical gender and kinship relations in South India. Marital breakdown impacts on kin relations and, through its consequences, discloses the existing gender relations and power structure. It makes visible the transformability of relational personhood as well as the transformability of relational society and culture. Although the study reveals the painful history of women's ill-treatment in marriage, family and kinship systems, it also demonstrates the women's rejection of domination and shows their ability to re-negotiate and promote changes, not only to their own positions but to the whole hierarchical system as well. The study explores the divorced and separated women's manifold dilemmas, complicated legal battles and endless arrangements as they struggle with the very practical problems of supporting themselves financially, finding and making a new home for themselves, and re-arranging relationships with their kin and friends. As marital breakdown fundamentally transforms the women's relational field, it forces them to recreate substitutive relations in a flexible way and, simultaneously, to re-construct themselves and their lives without a ready or positive cultural or behavioural template. This process reveals the agency of the divorced and separated women as well as shedding light on issues of gender and the cultural construction of the person in South India. This topical study explores the previously neglected subject of marital breakdown in India and shows the new meaning of kinship in South India.

Relevance: 60.00%

Abstract:

Today the finite element method is a well-established tool in engineering analysis and design. Though there are many two- and three-dimensional finite elements available, it is rare that a single element performs satisfactorily in the majority of practical problems. The present work deals with the development of a 4-node quadrilateral element using extended Lagrange interpolation functions. Classical univariate Lagrange interpolation is well developed for 1-D and is used for obtaining shape functions. We propose a new approach to extend Lagrange interpolation to several variables. When there is more than one variable, the method also gives the set of feasible bubble functions. We use the two to generate shape functions for the 4-node arbitrary quadrilateral. This requires incorporating the conditions of rigid body motion, constant strain and the Navier equation by imposing the necessary constraints. The procedure obviates the need for an isoparametric transformation, since the interpolation functions are generated for arbitrary quadrilateral shapes. While generating the element stiffness matrix, integration can be carried out to the desired accuracy by dividing the quadrilateral into triangles. To validate the performance of the element, which we call EXLQUAD4, we conduct several pathological tests available in the literature. EXLQUAD4 predicts both stresses and displacements accurately at every point in the element in all constant stress fields. In tests involving higher-order stress fields the element is assured to converge in the limit of discretisation. A method thus becomes available to generate shape functions directly for an arbitrary quadrilateral. The method is also applicable to hexahedra. The approach should also find use in the development of finite elements for other field equations.
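For reference, the sketch below shows the classical starting point mentioned in the abstract: univariate Lagrange interpolation on [-1, 1] and the familiar product-form shape functions of the standard isoparametric 4-node quadrilateral. The EXLQUAD4 construction itself (shape functions generated directly on the arbitrary quadrilateral, with bubble functions and rigid-body/constant-strain constraints) is not reproduced here.

```python
import numpy as np

def lagrange_1d(x, nodes):
    """Classical univariate Lagrange basis functions evaluated at x."""
    L = np.ones(len(nodes))
    for i, xi in enumerate(nodes):
        for j, xj in enumerate(nodes):
            if j != i:
                L[i] *= (x - xj) / (xi - xj)
    return L

def q4_shape_functions(xi, eta):
    """Standard isoparametric Q4 shape functions built as products of 1-D
    linear Lagrange polynomials on [-1, 1].  (EXLQUAD4 instead generates its
    interpolation directly on the physical, arbitrary quadrilateral.)"""
    lx = lagrange_1d(xi,  np.array([-1.0, 1.0]))
    ly = lagrange_1d(eta, np.array([-1.0, 1.0]))
    # node order: (-1,-1), (1,-1), (1,1), (-1,1)
    return np.array([lx[0]*ly[0], lx[1]*ly[0], lx[1]*ly[1], lx[0]*ly[1]])

# partition-of-unity check at an arbitrary point of the reference square
assert abs(q4_shape_functions(0.3, -0.7).sum() - 1.0) < 1e-12
```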

Relevance: 60.00%

Abstract:

The advent of large and fast digital computers and the development of numerical techniques suited to them have made it possible to revisit the analysis of important fundamental and practical problems and phenomena of engineering which have remained intractable for a long time. The understanding of load transfer between pin and plate is one such problem. In spite of continuous attack on these problems for over half a century, classical solutions have remained limited in their approach and in their value to the understanding of the phenomena and the generation of design data. On the other hand, the finite element methods that have grown alongside the recent development of computers have been helpful in analysing specific problems and answering specific questions, but they are yet to be harnessed to assist in obtaining, with economy, a clearer understanding of the phenomena of partial separation and contact, friction and slip, and fretting and fatigue in pin joints. Against this background, it is useful to explore the application of classical simple differential equation methods with the aid of computer power to open up this very important area. In this paper we describe some of the recent and current work at the Indian Institute of Science in this direction.

Relevance: 60.00%

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census had developed a sampling design for the Current Population Survey (CPS) in the 1940s. Another significant factor was that digital computers became available to statisticians. In the beginning of the 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774, which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he expressed by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. In 1894, at the International Statistical Institute meeting, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples, the idea being that the sample should be a miniature of the population. It is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realised the potential of Kiaer's method and, in the beginning of the 20th century, carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionised the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential idea is to draw repeated samples from the same population, with the assumption that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design for the CPS. An important criterion was to have a method whose data collection costs were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.

Relevance: 60.00%

Abstract:

In recent times, computational algorithms inspired by biological processes and evolution have been gaining popularity for solving science and engineering problems. These algorithms are broadly classified into evolutionary computation and swarm intelligence algorithms, derived by analogy with natural evolution and biological activities. They include genetic algorithms, genetic programming, differential evolution, particle swarm optimization, ant colony optimization, artificial neural networks, etc. Being random-search techniques, the algorithms use heuristics to guide the search towards the optimal solution and to speed up convergence towards global optima. The bio-inspired methods have several attractive features and advantages compared to conventional optimization solvers. They also allow simulation and optimization to be combined in one environment to solve real-world problems that are hard to define in simple expressions. These biologically inspired methods have provided novel ways of solving practical problems in traffic routing, networking, games, industry, robotics, economics, mechanical, chemical, electrical, civil and water resources engineering, and other fields. This article discusses the key features and development of bio-inspired computational algorithms and their scope for application in science and engineering.
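As a concrete illustration of the random-search-with-heuristics idea described above, here is a minimal particle swarm optimization sketch (one representative of the swarm intelligence family the article discusses) minimizing a simple test function; the swarm size, inertia and acceleration coefficients are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                        # simple test objective to minimise
    return np.sum(x**2, axis=-1)

dim, n_particles, iters = 5, 30, 200
w, c1, c2 = 0.7, 1.5, 1.5             # inertia and acceleration weights (illustrative)

x = rng.uniform(-5.0, 5.0, (n_particles, dim))     # particle positions
v = np.zeros_like(x)                               # particle velocities
pbest, pbest_f = x.copy(), sphere(x)               # personal bests
gbest = pbest[np.argmin(pbest_f)]                  # global best

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)   # heuristic velocity update
    x = x + v
    f = sphere(x)
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print("best objective value found:", pbest_f.min())
```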

Relevance: 60.00%

Abstract:

Given the significant gains that relay-based cooperation promises, the practical problems of acquiring channel state information (CSI) and of characterizing and optimizing performance with imperfect CSI are receiving increasing attention. We develop novel and accurate expressions for the symbol error probability (SEP) of fixed-gain amplify-and-forward relaying when the destination acquires CSI using the time-efficient cascaded channel estimation (CCE) protocol. The CCE protocol saves time by making the destination directly estimate the product of the source-relay and relay-destination channel gains. For a single-relay system, we first develop a novel SEP expression and a tight SEP upper bound. We then similarly analyze an opportunistic multi-relay system, in which both selection and coherent demodulation use imperfect estimates. A distinctive aspect of our approach is the use of as few simplifying approximations as possible, which yields results that are accurate at signal-to-noise ratios as low as 1 dB for single- and multi-relay systems. Using insights gleaned from an asymptotic analysis, we also present a simple, closed-form, nearly-optimal solution for the allocation of energy between pilot and data symbols at the source and relay(s).
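The sketch below is a minimal Monte Carlo illustration, not the paper's analysis: BPSK over Rayleigh-fading source-relay and relay-destination links with fixed-gain amplify-and-forward, in which the destination estimates the cascaded (product) channel from a single amplified pilot, in the spirit of the CCE protocol, and then uses that imperfect estimate for coherent demodulation. The relay gain normalisation and pilot structure are assumptions made for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def rayleigh(n):
    """Unit-variance complex Gaussian (Rayleigh-amplitude) channel gains."""
    return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

n_sym, snr_db = 200_000, 10.0
sigma2 = 10 ** (-snr_db / 10)               # noise variance per link (unit symbol energy)
G = 1.0 / np.sqrt(1.0 + sigma2)             # fixed relay gain (assumed normalisation)
noise = lambda: np.sqrt(sigma2 / 2) * (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym))

h_sr, h_rd = rayleigh(n_sym), rayleigh(n_sym)

# cascaded channel estimation: one amplified pilot symbol (+1) per channel realisation
y_pilot = h_rd * G * (h_sr * 1.0 + noise()) + noise()
h_hat = y_pilot                              # noisy estimate of the product channel G*h_sr*h_rd

# data transmission over the same channel realisations
bits = rng.integers(0, 2, n_sym)
x = 2.0 * bits - 1.0                         # BPSK symbols
y = h_rd * G * (h_sr * x + noise()) + noise()

bits_hat = (np.real(np.conj(h_hat) * y) > 0).astype(int)   # coherent detection with imperfect CSI
print("simulated symbol error rate:", np.mean(bits_hat != bits))
```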

Relevance: 60.00%

Abstract:

The miniaturization of electronic and ionic devices with thermionic cathodes and the improvement of their vacuum properties are questions of very great interest to the electronic engineer. However, there have been no proposals so far to analyse the problem of miniaturization of such devices in a fundamental way. The present work suggests a choice of the geometrical shape of the cathode, the anode and the envelope of the device that may help towards such a fundamental approach. It is shown that a design in which the cathode and the envelope of the tube are made in a thin prismatic shape and the anode coincides with the envelope offers a striking advantage over the conventional cylindrical design in respect of overall size. The use of the prismatic shape will lead to considerable economy in materials and may facilitate simpler production techniques. In respect of the main criteria of vacuum, namely the grade of vacuum, the internal volume occupied by residual gases, the evolution of gases in the internal space and the diffusion of gases from outside into the device, it is shown that the prismatic form is at least as good as, if not somewhat superior to, the cylindrical form. In the actual construction of thin prismatic tubes, many practical problems will arise, the most important being the mechanical strength and stability of the structure. But the changeover from the conventional cylindrical to the new prismatic form, with its basic advantages, is a development that merits close attention.

Relevance: 60.00%

Abstract:

In this paper, a pressure correction algorithm for computing incompressible flows is modified and implemented on an unstructured Chimera grid. The Schwarz method is used to couple the solutions of the different sub-domains. A new interpolation is proposed to ensure consistency between the primary variables and the auxiliary variables. Other important issues, such as global mass conservation and the order of accuracy of the interpolations, are also discussed. Two numerical simulations are performed successfully: one steady case, the lid-driven cavity, and one unsteady case, the flow around a circular cylinder. The results demonstrate very good performance of the proposed scheme on unstructured Chimera grids. It prevents decoupling of the pressure field in the overlapping region and requires only little modification to the existing unstructured Navier-Stokes (NS) solver. The numerical experiments show the reliability and potential of this method for application to practical problems.
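The Schwarz coupling used to join the sub-domain solutions can be illustrated on a much simpler problem. The sketch below is a generic illustration, not the paper's unstructured Chimera implementation: the alternating Schwarz method is applied to -u'' = 1 on (0, 1) with two overlapping sub-intervals, exchanging Dirichlet interface values until the composite solution converges to the exact parabola.

```python
import numpy as np

# Alternating Schwarz sketch for -u'' = 1 on (0, 1), u(0) = u(1) = 0,
# with two overlapping sub-intervals [0, 0.6] and [0.4, 1].
h = 0.01
x = np.arange(0.0, 1.0 + h / 2, h)
u = np.zeros_like(x)                                   # composite solution
i_a, i_b = int(round(0.4 / h)), int(round(0.6 / h))    # interface indices

def solve_dirichlet(n_int, left, right, h):
    """Finite-difference solve of -u'' = 1 with prescribed end values."""
    A = (np.diag(2.0 * np.ones(n_int)) - np.diag(np.ones(n_int - 1), 1)
         - np.diag(np.ones(n_int - 1), -1)) / h**2
    b = np.ones(n_int)
    b[0]  += left / h**2
    b[-1] += right / h**2
    return np.linalg.solve(A, b)

for _ in range(30):                                    # Schwarz iterations
    # sub-domain 1: [0, 0.6], right interface value taken from the composite solution
    u[1:i_b] = solve_dirichlet(i_b - 1, u[0], u[i_b], h)
    # sub-domain 2: [0.4, 1], left interface value taken from the just-updated solution
    u[i_a + 1:-1] = solve_dirichlet(len(x) - i_a - 2, u[i_a], u[-1], h)

print("max error vs exact solution:", np.max(np.abs(u - 0.5 * x * (1.0 - x))))
```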

Relevance: 60.00%

Abstract:

Modelling free-surface flow has very important applications in many engineering areas, such as oil transportation and offshore structures. The current research focuses on modelling free-surface flow in a tank by solving the Navier-Stokes equations. An unstructured finite volume method is used to discretize the governing equations. The free surface is tracked by dynamically adapting the mesh so that it always conforms to the surface. A mesh-smoothing scheme based on the spring analogy is also implemented to ensure mesh quality throughout the computation. Studies are performed on the sloshing response of a liquid in an elastic container subjected to various excitation frequencies, and further investigations are carried out on the critical frequency that leads to large deformation of the tank walls. Another numerical simulation involves free-surface flow past a submerged obstacle placed in the tank, showing flow separation and vortices. All these cases demonstrate the capability of this numerical method in modelling complicated practical problems.
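The spring-analogy smoothing mentioned above can be sketched independently of the flow solver: each mesh edge is modelled as a linear spring whose stiffness is taken here as the inverse edge length (a common choice), interior nodes are relaxed towards the equilibrium of their incident springs, and boundary nodes are held fixed. The tiny mesh below is an illustrative placeholder, not the tank mesh of the study.

```python
import numpy as np

def spring_smooth(nodes, edges, is_boundary, n_iter=50):
    """Spring-analogy smoothing: edge stiffness k = 1/length, interior nodes
    moved towards the stiffness-weighted average of their neighbours."""
    nodes = nodes.copy()
    for _ in range(n_iter):
        num = np.zeros_like(nodes)
        den = np.zeros(len(nodes))
        for i, j in edges:
            k = 1.0 / (np.linalg.norm(nodes[i] - nodes[j]) + 1e-12)
            num[i] += k * nodes[j]; den[i] += k
            num[j] += k * nodes[i]; den[j] += k
        interior = ~is_boundary
        nodes[interior] = num[interior] / den[interior][:, None]
    return nodes

# illustrative 3x3 grid of nodes with the single interior node perturbed
nodes = np.array([[x, y] for y in range(3) for x in range(3)], dtype=float)
nodes[4] += [0.4, 0.3]                                      # distort the interior node
edges = [(0, 1), (1, 2), (3, 4), (4, 5), (6, 7), (7, 8),    # horizontal grid edges
         (0, 3), (3, 6), (1, 4), (4, 7), (2, 5), (5, 8)]    # vertical grid edges
is_boundary = np.array([True] * 9); is_boundary[4] = False  # only node 4 is interior
print(spring_smooth(nodes, edges, is_boundary)[4])          # relaxes back towards (1, 1)
```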

Relevance: 60.00%

Abstract:

For simulating multi-scale complex flow fields such as turbulent flows, high-order accurate schemes are preferred. In this paper, a scheme construction with numerical flux residual correction (NFRC) is presented. A difference approximation of any order of accuracy can be obtained with the NFRC. To improve the resolution of shocks, the constructed schemes are modified with group velocity control (GVC) and weighted group velocity control (WGVC). The method of scheme construction is simple, and it is used to solve practical problems.

Relevance: 60.00%

Abstract:

In the short time that bioethics has been developing, new goals have steadily been proposed. Progress has been very rapid, and the schematic procedures of the early years have been widely superseded and made more complex. Yet each of them offers a distinct perspective to be taken into account which, if we have the necessary openness, helps us to uncover a little more of reality. The aim of deliberation is not to make certain or exclusive decisions, but prudent ones. Different people may make different decisions when faced with the same fact, and all of them may be prudent. This poses a challenge for the future: adopting a kind of rationality that allows all those involved to participate in the process of deliberating on practical problems, in our case moral ones. The arguments put forward may not completely rule out other perspectives and, therefore, other arguments on the same issue or problem. Hence we need the perspectives and reasons of others. As a result, others become a condition of possibility for my own development as a rational being.

Relevance: 60.00%

Abstract:

This article outlines the outcome of work that set out to provide one of the specified integral contributions to the overarching objectives of the EU-sponsored LIFE98 project described in this volume. Among others, these included a requirement to marry automatic monitoring and dynamic modelling approaches in the interests of securing better management of water quality in lakes and reservoirs. The particular task given to us was to devise the elements of an active management strategy for the Queen Elizabeth II Reservoir. This is one of the larger reservoirs supplying the population of the London area: after purification and disinfection, its water goes directly to the distribution network and to the consumers. The quality of the water in the reservoir is of primary concern, for the greater the content of biogenic materials, including phytoplankton, the more prolonged is the purification and the more expensive the treatment. Whatever good phytoplankton may do by way of oxygenation and oxidative purification, it is eventually relegated to an impurity that has to be removed from the final product. Indeed, it has been estimated that the cost of removing algae and microorganisms from water represents about one quarter of its price at the tap. In chemically fertile waters, such as those typifying the resources of the Thames Valley, there is thus a powerful and ongoing incentive to be able to minimise plankton growth in storage reservoirs. Indeed, the Thames Water company and its predecessor undertakings have a long and impressive history of confronting and quantifying the fundamentals of phytoplankton growth in their reservoirs and of developing strategies for operation and design to combat them. The work described here follows in this tradition. However, the use of the model PROTECH-D to investigate present phytoplankton growth patterns in the Queen Elizabeth II Reservoir questioned the interpretation of some of the recent observations. On the other hand, it has reinforced the theories underpinning the original design of this and those Thames Valley storage reservoirs constructed subsequently. The authors recount these experiences as an example of how simulation models can hone the theoretical base and its application to the practical problems of supplying water of good quality at economic cost, before the engineering is initiated.

Relevance: 60.00%

Abstract:

When salmonid redds are disrupted by spates, the displaced eggs will drift downstream. The mean distance of travel, the types of locations in which the eggs resettle and the depth of reburial of displaced eggs are not known. Investigation of these topics under field conditions presents considerable practical problems, though the use of artificial eggs might help to overcome some of them. Attempts to assess the similarities and/or differences in performance between real and artificial eggs are essential before artificial eggs can validly be used to simulate real eggs. The present report first compares the two types of egg in terms of their measurable physical characteristics (e.g. dimensions and density). The rate at which eggs fall in still water will relate to the rate at which they are likely to resettle in flowing water in the field. As the rate of fall will be influenced by a number of additional factors (e.g. shape and surface texture) which are not easily measured directly, the rates of fall of the two types of egg have been compared directly under controlled conditions. Finally, comparisons of the pattern of settlement of the two types of egg in flowing water in an experimental channel have been made. Although the work was primarily aimed at testing the value of artificial eggs as a simulation of real eggs, several side issues more directly concerned with the properties of real eggs and the likely distance of drift in natural streams have also been explored. This is the first of three reports made on this topic by the author in 1984.