29 results for turbulence modelling theory

in Deakin Research Online - Australia


Relevance: 40.00%

Abstract:

Results are presented from a series of model studies of the transient exchange flow resulting from the steady descent of an impermeable barrier separating initially quiescent fresh and saline water bodies having density ρ_0 and ρ_0 + (Δρ)_0, respectively. A set of parametric laboratory experiments has been carried out (i) to determine the characteristic features of the time-dependent exchange flow over the barrier crest and (ii) to quantify the temporal increase in the thickness and spatial extent of the brackish water reservoir formed behind the barrier by the outflowing, partly-mixed saline water. The results of the laboratory experiments have been compared with the predictions of a theoretical model adapted from the steady, so-called maximal exchange flow case, and good qualitative agreement between theory and experiment has been demonstrated. The comparisons indicate that head losses of between 7% and 3% are applicable to the flow over the ridge crest in the early and late stages, respectively, of the barrier descent phase, with these losses being attributed to mixing processes associated with the counterflowing layers of fresh and saline water in the vicinity of the ridge crest. The experimental data show (and the theoretical model predictions confirm) that (i) the dimensionless time of detection t_det (g′/H_b)^(1/2) of the brackish water pool fed by the dense outflow increases (at a given distance from the barrier) with increasing values of the descent rate parameter g′H_b/(dh_b/dt)^2 and (ii) the normalised thickness δ(x,t)/H_b of the pool at a given reference station increases monotonically with increasing values of the modified time (t − t_det)/(H_b/g′)^(1/2), with the rate of thickening decreasing with increasing values of the descent rate parameter g′H_b/(dh_b/dt)^2. Here, g′ = (g/ρ_0)(Δρ)_0 is the modified gravitational acceleration, H_b is the mean depth of the water and dh_b/dt denotes the rate of descent of the barrier height h_b with elapsed time t after the two water bodies are first brought into contact. © 2004 Kluwer Academic Publishers.
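As an illustration of the dimensionless groups defined in the abstract, the short sketch below computes g′, the descent-rate parameter and the dimensionless detection time. All numerical values are arbitrary placeholders, not data from the study.

```python
# Illustrative calculation of the dimensionless groups used in the abstract.
# Numerical values are placeholders chosen only to show the arithmetic.

g = 9.81            # gravitational acceleration (m/s^2)
rho_0 = 1000.0      # fresh-water density (kg/m^3)
delta_rho_0 = 20.0  # initial density difference (kg/m^3)
H_b = 0.3           # mean water depth (m)
dh_b_dt = 0.002     # barrier descent rate (m/s)
t_det = 45.0        # detection time of the brackish pool (s), placeholder

# Modified (reduced) gravitational acceleration: g' = (g / rho_0) * (delta_rho)_0
g_prime = (g / rho_0) * delta_rho_0

# Descent-rate parameter: g' * H_b / (dh_b/dt)^2
descent_rate_param = g_prime * H_b / dh_b_dt**2

# Dimensionless detection time: t_det * (g' / H_b)^(1/2)
t_det_dimensionless = t_det * (g_prime / H_b) ** 0.5

print(f"g'                     = {g_prime:.4f} m/s^2")
print(f"descent-rate parameter = {descent_rate_param:.1f}")
print(f"dimensionless t_det    = {t_det_dimensionless:.2f}")
```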

Relevance: 30.00%

Abstract:

Kanban Control Systems (KCS) have become a widely accepted form of inventory and production control. The creation of realistic Discrete Event Simulation (DES) models of KCS requires specification of both information and material flow. There are several commercially available simulation packages able to model these systems, although the use of an application-specific modelling language provides a means for rapid model development. A new Kanban-specific simulation language, together with a high-speed execution engine, is verified in this paper through the simulation of a single-stage, single-part-type production line. A single-stage, single-part-type KCS is modelled with exhaustive enumeration of the decision variables of container size and number of Kanbans. Several performance measures were used to determine the robustness of the control system: the 95% Confidence Interval (CI) of container Flow Time (FT), mean line throughput, and the Coefficient of Variation (CV) of FT and Cycle Time.
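The paper's Kanban-specific language and execution engine are not reproduced here; as a rough sketch of the same scenario, the example below simulates a single-stage, single-part-type kanban loop with the general-purpose SimPy DES library. The number of kanbans, container size and the processing and demand distributions are illustrative assumptions.

```python
# A minimal sketch (not the paper's Kanban-specific language or engine) of a
# single-stage, single-part-type kanban loop using the SimPy DES library.
import random
import simpy

NUM_KANBANS = 4        # kanban cards (containers circulating), placeholder
CONTAINER_SIZE = 5     # parts per container, placeholder
PROC_TIME_MEAN = 4.0   # mean time to fill one container, placeholder
DEMAND_MEAN = 1.0      # mean inter-arrival time of single-part demands, placeholder

flow_times = []        # time from kanban authorisation to container delivery
parts_shipped = 0

def stage(env, kanbans, finished):
    """Production stage: each free kanban card authorises filling one container."""
    while True:
        yield kanbans.get(1)                          # wait for a free kanban card
        start = env.now
        yield env.timeout(random.expovariate(1.0 / PROC_TIME_MEAN))
        yield finished.put(CONTAINER_SIZE)            # full container to output store
        flow_times.append(env.now - start)

def demand(env, kanbans, finished):
    """Customer demand: withdraws parts; an emptied container frees its kanban."""
    global parts_shipped
    while True:
        yield env.timeout(random.expovariate(1.0 / DEMAND_MEAN))
        yield finished.get(1)                         # take one part (may wait)
        parts_shipped += 1
        if parts_shipped % CONTAINER_SIZE == 0:
            yield kanbans.put(1)                      # container emptied: release card

random.seed(0)
env = simpy.Environment()
kanbans = simpy.Container(env, capacity=NUM_KANBANS, init=NUM_KANBANS)
finished = simpy.Container(env, capacity=NUM_KANBANS * CONTAINER_SIZE, init=0)
env.process(stage(env, kanbans, finished))
env.process(demand(env, kanbans, finished))
env.run(until=10_000)

mean_ft = sum(flow_times) / len(flow_times)
print(f"containers produced: {len(flow_times)}, mean flow time: {mean_ft:.2f}")
print(f"throughput: {parts_shipped / 10_000:.3f} parts per time unit")
```

Sweeping NUM_KANBANS and CONTAINER_SIZE over a grid and recording the FT statistics for each combination mirrors, in spirit, the exhaustive enumeration of decision variables described in the abstract.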

Relevance: 30.00%

Abstract:

To be useful for policy simulation in the current climate of rapid structural change, inverse demand systems must remain regular over substantial variations in quantities. The distance function is a convenient vehicle for generating such systems. It also allows convenient imposition of prior ideas about the structure of preferences required for realistic policy work. While the distance function directly yields Hicksian inverse demand functions via the Shephard-Hanoch lemma, these are usually explicit in the unobservable level of utility (u) but lack a closed-form representation in terms of the observable variables. Note, however, that the unobservability of u need not hinder estimation. A simple one-dimensional numerical inversion allows the estimation of the distance function via the parameters of the implied Marshallian inverse demand functions. This paper develops the formal theory for using distance functions in this context, and reports on initial trials of the operational feasibility of the method.
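As a concrete illustration of the one-dimensional numerical inversion, the sketch below solves D(q, u) = 1 for u by bisection. The Cobb-Douglas-style distance function and all numbers are illustrative assumptions, not the parametric form estimated in the paper.

```python
# A minimal sketch of the one-dimensional inversion described above: recover the
# utility level u implied by an observed quantity bundle by solving D(q, u) = 1.
# The functional form below is purely illustrative.

def distance(q, u, alpha=0.4):
    """Illustrative distance function: D(q, u) = (q1^alpha * q2^(1-alpha)) / u."""
    q1, q2 = q
    return (q1 ** alpha) * (q2 ** (1.0 - alpha)) / u

def invert_for_u(q, lo=1e-6, hi=1e6, tol=1e-10):
    """Bisection on u: D(q, u) is decreasing in u, so solve D(q, u) = 1."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if distance(q, mid) > 1.0:
            lo = mid          # distance too large -> trial utility level too low
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

q_observed = (2.0, 3.0)
u_hat = invert_for_u(q_observed)
print(f"implied utility level: {u_hat:.4f}")
print(f"check D(q, u_hat) = {distance(q_observed, u_hat):.6f}")  # approximately 1
```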

Relevance: 30.00%

Abstract:

This paper serves to specify and ground research into interfunctional integration in a wider theoretical context, with particular reference to the interaction between technology and marketing in the biotechnology sphere. The general and specific problem areas are specified as those of interfunctional relations and, in particular, the dyadic relationship between the marketing and biotechnical managerial functions. The contextual/organisational generative mechanisms that are likely to keep interfunctional relations at the centre of scholarly attention for some time are explored from the perspective of cybernetic theory. The law of requisite variety states that in an effective open system, environmental variety is matched by internal structural variety. As organisations face ever more turbulent and complex environments, this external variety must be matched by increased internal complexity within the organisation. The two modes of response, namely holographic and mechanistic, both highlight the need to further our understanding of interfunctional differences. Having established the problem and its genesis, a specific research agenda is outlined: the exploration of interfunctional differences from a decision-making perspective.

Relevance: 30.00%

Abstract:

This paper presents results from a survey of organizing forms in Australia's largest public companies between 2000 and 2004. The study sought to identify trends in forms of organizing and the extent to which the uptake of new forms led to a decrease in traditional forms of organizing. The analysis revealed changes across the organizational dimensions of structures, processes and boundaries. While Australian firms were clearly interested in exploring new forms of organizing, uptake was not universal, nor at the expense of traditional forms of organizing. An admixture of traditional and new, or dual, forms of organizing emerged as the preferred response to environmental turbulence. This paper employs and extends duality theory to explain the changes that occurred in Australian public companies over the four-year period. Duality theory is operationalized in terms of five duality characteristics, which are employed to assess the composition and balance of traditional and new forms of organizing. The paper proposes that a dualities-aware perspective offers a potential way forward in managing the balance between the ostensibly contradictory forces of continuity and change.

Relevance: 30.00%

Abstract:

Using the theory of reasoned action, this study proposes a structural equation model that tests the relationships among carbon and environmental knowledge, attitude and behaviour. We found that carbon-related knowledge is unrelated to attitudes, but general environmental attitudes drive both general and carbon-related behaviours. The results suggest that specific environmental behaviour may therefore be driven more by general attitudes and knowledge than by issue-specific knowledge.

Relevance: 30.00%

Abstract:

In this article, we estimate money demand functions for a panel of eight transitional economies, using quarterly data for the period 1995:01 to 2005:03. We find that real M1 and real M2 and their determinants, namely real income and the short-term domestic interest rate, are cointegrated, both for individual countries and for the panel. Long-run elasticities suggest that, consistent with theory, real income positively and the nominal interest rate negatively impact real money demand. Our test for panel Granger causality suggests short-run bidirectional causality between M1 and M2 and their determinants. Finally, our tests for stability of the money demand functions reveal more cases of unstable money demand functions when M2 is used as a proxy for money demand.
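To make the long-run specification concrete, the sketch below regresses log real money balances on log real income and the short-term interest rate. The data are synthetic placeholders and the panel cointegration, Granger-causality and stability tests reported in the article are not reproduced.

```python
# A minimal sketch of the long-run money demand relation described above:
# log(real M) on log(real income) and the interest rate, estimated by least
# squares on synthetic placeholder data.
import numpy as np

rng = np.random.default_rng(0)
T = 43                                    # e.g. quarterly observations, 1995:01-2005:03

log_y = np.linspace(4.0, 4.5, T) + rng.normal(0, 0.05, T)        # log real income
r = 8.0 - 3.0 * np.linspace(0, 1, T) + rng.normal(0, 0.5, T)     # interest rate (%)
# Synthetic relation: income elasticity near 1, negative interest semi-elasticity.
log_m = 0.5 + 1.0 * log_y - 0.05 * r + rng.normal(0, 0.03, T)    # log real money

X = np.column_stack([np.ones(T), log_y, r])
beta, *_ = np.linalg.lstsq(X, log_m, rcond=None)
const, income_elasticity, interest_semi_elasticity = beta

print(f"income elasticity:             {income_elasticity:.3f}")          # expect positive
print(f"interest-rate semi-elasticity: {interest_semi_elasticity:.3f}")   # expect negative
```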

Relevance: 30.00%

Abstract:

Adolescents of low socio-economic position (SEP) are less likely than those of higher SEP to consume diets in line with current dietary recommendations. The reasons for these SEP variations remain poorly understood. We investigated the mechanisms underlying socio-economic variations in adolescents’ eating behaviours using a theoretically derived explanatory model. Data were obtained from a community-based sample of 2529 adolescents aged 12–15 years, from 37 secondary schools in Victoria, Australia. Adolescents completed a web-based survey assessing their eating behaviours, self-efficacy for healthy eating, perceived importance of nutrition and health, social modelling and support, and the availability of foods in the home. Parents provided details of maternal education level, which was used as an indicator of SEP. All social cognitive constructs assessed mediated socio-economic variations in at least one indicator of adolescents’ diet. Cognitive factors were the strongest mediators of socio-economic variations in fruit intake, while for energy-dense snack foods and fast foods, the availability of energy-dense snacks at home tended to be a strong mediator. Social cognitive theory provides a useful framework for understanding socio-economic variations in adolescents’ diet and might guide public health programmes and policies focused on improving adolescent nutrition among those experiencing socio-economic disadvantage.
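To illustrate the mediation logic in its simplest form, the sketch below runs a single-mediator, product-of-coefficients analysis on synthetic placeholder data. The variable names loosely mirror the study's constructs, but none of the numbers come from the survey and the study's full multi-mediator model is not reproduced.

```python
# A minimal single-mediator sketch: does a social-cognitive construct mediate
# the association between maternal education (SEP) and a dietary indicator?
# All data below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n = 2529
sep = rng.integers(1, 4, n).astype(float)                    # maternal education, 3 levels
mediator = 0.4 * sep + rng.normal(0, 1, n)                   # e.g. self-efficacy score
intake = 0.3 * mediator + 0.1 * sep + rng.normal(0, 1, n)    # e.g. fruit intake

def ols(y, X):
    """Least-squares coefficients with an intercept prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(mediator, sep)[1]                                    # SEP -> mediator
b = ols(intake, np.column_stack([sep, mediator]))[2]         # mediator -> intake | SEP
indirect = a * b                                             # product-of-coefficients
total = ols(intake, sep)[1]                                  # total SEP effect
print(f"indirect (mediated) effect: {indirect:.3f}")
print(f"proportion mediated:        {indirect / total:.2%}")
```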

Relevance: 30.00%

Abstract:

Electrochemical synthesis of a tri-layer polypyrrole-based actuator optimized for performance was reported. A solution of 0.05 M pyrrole and 0.05 M tetrabutylammonium hexafluorophosphate in propylene carbonate (PC) yielded the optimum performance and stability. The force produced ranged from 0.2 to 0.4 mN. Cyclic deflection tests on PC-based actuators for 3 hours indicated that the displacement decreased by 60%. The PC-based actuator had a longer operating time, exceeding 3 hours, than acetonitrile-based actuators. A triple-layer model of the polymer actuator was developed based on classic bending beam theory by considering the strain in the electrode material. A tri-layer actuator was fabricated [4, 6] by initially sputter-coating a PVDF film with an approximately 100 nm gold layer, resulting in a conductive film with a surface resistance of 8-10 Ω. The PVDF film was approximately 145 µm thick and had an approximate pore size of 45 µm. A solution containing 0.05 M distilled pyrrole monomer, 0.05 M TBAPF6 and 1% (w/w) distilled water in PC was purged with nitrogen for 15 minutes. Continuity between the PPy and PVDF layers was maintained. Results predicted by the model were in good agreement with the experimental data.
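For intuition about the bending-beam treatment, the sketch below applies classic Euler-Bernoulli assumptions to a PPy/PVDF/PPy stack: one PPy layer expands and the other contracts, and force and moment balance give the curvature. The PVDF thickness follows the abstract, but the PPy thickness, moduli, actuation strain and beam length are illustrative placeholders, not the paper's fitted model.

```python
# A minimal Euler-Bernoulli sketch of a trilayer (PPy | PVDF | PPy) bender.
# Strain field eps(y) = eps0 + kappa*y; stress in layer i is E_i*(eps(y) - eps_a,i).
# With no external load, force and moment balance give two linear equations
# in eps0 (mid-plane strain) and kappa (curvature).
import numpy as np

h_ppy = 30e-6                  # PPy layer thickness (m), placeholder
h_pvdf = 145e-6                # PVDF thickness (m), from the abstract
E_ppy, E_pvdf = 80e6, 2.5e9    # Young's moduli (Pa), placeholders
eps_a = 0.01                   # induced actuation strain in the PPy layers, placeholder

# Layer boundaries measured from the mid-plane of the PVDF film (bottom to top).
y = np.array([-h_pvdf/2 - h_ppy, -h_pvdf/2, h_pvdf/2, h_pvdf/2 + h_ppy])
E = np.array([E_ppy, E_pvdf, E_ppy])
eps_actuation = np.array([-eps_a, 0.0, +eps_a])   # one PPy layer shrinks, one swells

A = np.zeros((2, 2))
b = np.zeros(2)
for Ei, ea, y0, y1 in zip(E, eps_actuation, y[:-1], y[1:]):
    A += Ei * np.array([[y1 - y0,            (y1**2 - y0**2) / 2],
                        [(y1**2 - y0**2) / 2, (y1**3 - y0**3) / 3]])
    b += Ei * ea * np.array([y1 - y0, (y1**2 - y0**2) / 2])
eps0, kappa = np.linalg.solve(A, b)

L = 20e-3                                    # free cantilever length (m), placeholder
tip = (1 - np.cos(kappa * L)) / kappa        # tip displacement for a circular arc
print(f"curvature: {kappa:.2f} 1/m, tip displacement: {tip * 1e3:.2f} mm")
```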

Relevance: 30.00%

Abstract:

The Subjective Wellbeing (SWB) literature is replete with competing theories detailing the mechanisms underlying the construction and maintenance of SWB. The current study aimed to compare and contrast two of these approaches: multiple discrepancies theory (MDT) and an affective-cognitive theory of SWB. MDT posits SWB to be the result of perceived discrepancies between multiple standards of comparison. By contrast, affective-cognitive theory asserts that SWB is primarily influenced by trait affect, and indirectly influenced by personality and cognition through trait affect. Participants comprised 387 individuals who responded to the 5th longitudinal survey of the Australian Unity Wellbeing Index. Results of Structural Equation Modelling (SEM) indicated the poorest fit to the data for the MDT model. The affective-cognitive model also did not provide a good fit to the data. A purely affective model provided the best fit to the data, was the most parsimonious, and explained 66% of variance in SWB.

Relevance: 30.00%

Abstract:

The Commission on Graduate Education in Economics had raised several concerns regarding the role of mathematics in graduate training in economics (Krueger, 1991; Colander, 1998, 2005). This paper undertakes a detailed scrutiny of the notion of a utility function to motivate and describe the common patterns across mathematical concepts and results that are used by economists. In the process one arrives at a classification of mathematical terms which is used to state mathematical results in economics. The usefulness of the classification scheme is illustrated with the help of a discussion of Arrow's impossibility theorem. Common knowledge of the patterns in mathematical concepts and results could be effective in enhancing communication between students, teachers and researchers specializing in different sub-fields of economics.

Relevance: 30.00%

Abstract:

Any attempt to model an economy requires foundational assumptions about the relations between prices, values and the distribution of wealth. These assumptions exert a profound influence over the results of any model. Unfortunately, there are few areas in economics as vexed as the theory of value. I argue in this paper that the fundamental problem with past theories of value is that it is simply not possible to model the determination of value, the formation of prices and the distribution of income in a real economy with analytic mathematical models. All such attempts leave out crucial processes or make unrealistic assumptions which significantly affect the results.

There have been two primary approaches to the theory of value. The first, associated with classical economists such as Ricardo and Marx, comprises substance theories of value, which view value as a substance inherent in an object and conserved in exchange. For Marxists, the value of a commodity derives solely from the value of the labour power used to produce it, and therefore any profit is due to the exploitation of the workers. The labour theory of value has been discredited because of its assumption that labour was the only ‘factor’ that contributed to the creation of value, and because of its fundamentally circular argument. Neoclassical theorists argued that price was identical with value and was determined purely by the interaction of supply and demand; value, then, was completely subjective. Returns to labour (wages) and capital (profits) were determined solely by their marginal contribution to production, so that each factor received its just reward by definition. Problems with the neoclassical approach include assumptions concerning representative agents, perfect competition, perfect and costless information and contract enforcement, complete markets for credit and risk, aggregate production functions and infinite, smooth substitution between factors, distribution according to marginal products, firms always on the production possibility frontier and firms’ pricing decisions, the neglect of money and credit, and perfectly rational agents with infinite computational capacity.

Two critical areas stand out. The first is the underappreciated Sonnenschein-Mantel-Debreu results, which showed that the foundational assumptions of the Walrasian general-equilibrium model imply arbitrary excess demand functions and therefore arbitrary equilibrium price sets. The second is that, in real economies, there is no equilibrium, only continuous change. Equilibrium is never reached because of constant changes in preferences and tastes; technological and organisational innovations; discoveries of new resources and new markets; inaccurate and evolving expectations of businesses, consumers, governments and speculators; changing demand for credit; the entry and exit of firms; the birth, learning, and death of citizens; changes in laws and government policies; imperfect information; generalized increasing returns to scale; random acts of impulse; weather and climate events; changes in disease patterns, and so on.

The problem is not the use of mathematical modelling, but the kind of mathematical modelling used. Agent-based models (ABMs), object-oriented programming and greatly increased computer power, however, are opening up a new frontier. Here a dynamic bargaining ABM is outlined as a basis for an alternative theory of value. A large but finite number of heterogeneous commodities and agents with differing degrees of market power are set in a spatial network. Returns to buyers and sellers are decided at each step in the value chain, and in each factor market, through the process of bargaining. Market power and its potential abuse against the poor and vulnerable are fundamental to how the bargaining dynamics play out. Ethics therefore lie at the very heart of economic analysis, the determination of prices and the distribution of wealth. The neoclassicals are right, then, that price is the enumeration of value at a particular time and place, but wrong to downplay the critical roles of bargaining, power and ethics in determining those same prices.
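As a toy illustration of bargaining-based price formation of the kind the abstract describes, the sketch below lets heterogeneous buyers and sellers with differing market power split the surplus between their reservation prices. The agents, values and power-weighting rule are illustrative assumptions, not the paper's ABM.

```python
# A minimal sketch of one bargaining round: agreed prices depend on relative
# market power, not on marginal products or a market-clearing auctioneer.
import random
from dataclasses import dataclass

@dataclass
class Agent:
    reservation: float   # seller: minimum acceptable price; buyer: maximum
    power: float         # bargaining power in (0, 1]

def bargain(seller: Agent, buyer: Agent):
    """Return the agreed price, or None if no mutually beneficial trade exists."""
    if buyer.reservation < seller.reservation:
        return None                       # no surplus to split
    surplus = buyer.reservation - seller.reservation
    # Relative market power decides each side's share of the surplus.
    seller_share = seller.power / (seller.power + buyer.power)
    return seller.reservation + seller_share * surplus

random.seed(0)
sellers = [Agent(random.uniform(5, 10), random.uniform(0.1, 1.0)) for _ in range(50)]
buyers = [Agent(random.uniform(8, 15), random.uniform(0.1, 1.0)) for _ in range(50)]

# Random pairwise matching; repeat over many rounds (and over a spatial network)
# to study how the price distribution evolves with the distribution of power.
prices = [p for s, b in zip(sellers, random.sample(buyers, len(buyers)))
          if (p := bargain(s, b)) is not None]
print(f"trades: {len(prices)}, mean price: {sum(prices) / len(prices):.2f}")
```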

Relevance: 30.00%

Abstract:

The relational aspects of critical infrastructure systems are not readily quantifiable, as there are numerous variabilities and system dynamics that lack uniformity and are difficult to quantify. Notwithstanding this, there is a large body of existing research grounded in the quantitative analysis of critical infrastructure networks, their system relationships and the resilience of these networks. The focus of this research, however, is to investigate a different, more generalised and holistic systems-perspective approach. The suggestion is that applying network theory and taking a ‘soft’, system-like modelling approach offers an alternative way of viewing and modelling the relational aspects of critical infrastructure systems, one that warrants further enquiry.
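To show the flavour of such a network-theoretic, qualitative view, the sketch below represents infrastructure systems as nodes and dependency relations as directed edges, then asks simple relational questions. The systems and links are illustrative placeholders, not a real infrastructure network or the authors' model.

```python
# A minimal sketch of a 'soft' network view of infrastructure interdependencies
# using the networkx library; nodes are systems, edges are dependency relations.
import networkx as nx

G = nx.DiGraph()
# Edge (a, b) means "b depends on a".
dependencies = [
    ("power grid", "water treatment"),
    ("power grid", "telecommunications"),
    ("telecommunications", "banking"),
    ("water treatment", "hospitals"),
    ("power grid", "hospitals"),
]
G.add_edges_from(dependencies)

# A simple relational question: which systems are affected, directly or
# indirectly, if one system fails?
failed = "power grid"
affected = nx.descendants(G, failed)
print(f"failure of '{failed}' propagates to: {sorted(affected)}")

# Crude holistic indicators: how many systems each node depends on and supports.
for node in G.nodes:
    print(f"{node}: depends on {G.in_degree(node)} system(s), "
          f"supports {G.out_degree(node)} system(s)")
```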