960 results for Constructivist theory
Abstract:
A persistent question in the development of models for macroeconomic policy analysis has been the relative roles of economic theory and evidence in their construction. This paper looks at some popular strategies that involve setting up a theoretical or conceptual model (CM), which is transformed to match the data and then made operational for policy analysis. A dynamic general equilibrium model is constructed that is similar to standard CMs. After calibration to UK data, it is used to examine the utility of formal econometric methods in assessing the match of the CM to the data, and also to evaluate some standard model-building strategies.
Keywords: Policy-oriented economic modeling; Model evaluation; VAR models
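The paper's own models are not reproduced in the abstract, but its econometric benchmark is a VAR. As a minimal sketch of fitting one with the statsmodels API (the series names and data below are synthetic placeholders, not the paper's UK dataset):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical quarterly macro series; purely illustrative stand-ins
# for whatever data the CM would be matched against.
rng = np.random.default_rng(0)
data = pd.DataFrame(
    rng.normal(size=(120, 3)).cumsum(axis=0),
    columns=["output_gap", "inflation", "interest_rate"],
)

model = VAR(data.diff().dropna())          # work with first differences
results = model.fit(maxlags=8, ic="aic")   # lag order chosen by AIC
print(results.summary())
```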
Abstract:
Differential axial shortening in vertical members of reinforced concrete high-rise buildings occurs due to shrinkage, creep and elastic shortening, which are time-dependent effects of concrete. This shortening has to be quantified in order to make adequate provisions and mitigate its adverse effects. This paper presents a novel procedure for quantifying the axial shortening of vertical members using variations in the vibration characteristics of the structure, in lieu of gauges, which can pose problems during and after construction. The procedure is based on changes in the modal flexibility matrix, which is expressed as a function of the mode shapes and the reciprocals of the natural frequencies. This paper will present the development of this novel procedure.
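The abstract does not spell out the matrix construction, but the modal flexibility matrix is conventionally assembled from mass-normalised mode shapes φᵢ and natural frequencies ωᵢ as F ≈ Σᵢ φᵢφᵢᵀ/ωᵢ². A minimal numerical sketch under that assumption (function and argument names are illustrative):

```python
import numpy as np

def modal_flexibility(mode_shapes, natural_freqs_hz):
    """Approximate the modal flexibility matrix from the first m modes.

    mode_shapes:      (n_dof, m) array of mass-normalised mode shapes.
    natural_freqs_hz: (m,) array of natural frequencies in Hz.
    """
    omegas = 2.0 * np.pi * np.asarray(natural_freqs_hz)   # rad/s
    n_dof = mode_shapes.shape[0]
    F = np.zeros((n_dof, n_dof))
    for phi, w in zip(mode_shapes.T, omegas):
        F += np.outer(phi, phi) / w**2   # higher modes contribute ~1/w^2, so a few modes suffice
    return F
```

Because each mode's contribution decays as 1/ω², the first few measured modes dominate F, which is what makes tracking its changes over construction stages practical.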
Abstract:
This paper argues, somewhat along a Simmelian line, that political theory may produce practical and universal theories like those developed in theoretical physics. The aim of this paper is to show that the Element of Democracy Theory may be true by comparing it to Einstein's Special Relativity, specifically against the parameters of symmetry, unification, simplicity, and utility. These parameters are what validate a theory in physics: meeting them not only fits with current knowledge but also produces paths towards testing (application). As the Element of Democracy Theory meets these same parameters, it could settle the debate concerning the definition of democracy. This will be shown firstly by discussing why no one has yet achieved a universal definition of democracy; secondly by explaining the parameters chosen (why these and not others confirm or scuttle theories); and thirdly by comparing how Special Relativity and the Element of Democracy Theory match the parameters.
Abstract:
The thrust towards constructivist learning and critical thinking in the National Curricular Framework (2005) of India implies shifts in pedagogical practices. In this context, drawing on grounded theory, focus group interviews were conducted with 40 preservice teachers to ascertain the contextual situation and the likely outcomes of applying critical literacy across the curriculum. Central themes that emerged in the discussions were teacher-centred versus learner-centred approaches, and conformity versus autonomy in teaching and learning. The paper argues that, within the present Indian context, while there is scope for changes to pedagogy and learning styles, these must be adequately contextualised.
Abstract:
Theory-of-Mind has been defined as the ability to explain and predict human behaviour by imputing mental states, such as attention, intention, desire, emotion, perception and belief, to the self and others (Astington & Barriault, 2001). Theory-of-Mind research began with Piaget and continued through a tradition of meta-cognitive research projects (Flavell, 2004). A study by Baron-Cohen, Leslie and Frith (1985) of Theory-of-Mind abilities in atypically developing children reported major difficulties experienced by children with autism spectrum disorder (ASD) in imputing mental states to others. Since then, a wide range of follow-up research has been conducted to confirm these results. Traditional Theory-of-Mind research on ASD has been based on an either-or assumption: that Theory-of-Mind is something one either possesses or does not. However, this approach fails to take account of how the ASD population themselves experience Theory-of-Mind. This paper suggests an alternative approach, the Theory-of-Mind continuum model, to understand the Theory-of-Mind experience of people with ASD. The Theory-of-Mind continuum model will be developed through a comparison of subjective and objective aspects of mind, and of phenomenal and psychological concepts of mind. This paper will demonstrate the importance of balancing qualitative and quantitative research methods in investigating the minds of people with ASD. It will enrich our theoretical understanding of Theory-of-Mind, as well as carry methodological implications for further studies of Theory-of-Mind.
Abstract:
Over recent years, Unmanned Air Vehicles (UAVs) have become a powerful tool for reconnaissance and surveillance tasks. These vehicles are now available in a broad range of sizes and capabilities and are intended to fly in regions where the presence of onboard human pilots is either too risky or unnecessary. This paper describes the formulation and application of a design framework that supports the complex task of multidisciplinary design optimisation of UAV systems via evolutionary computation. The framework includes a Graphical User Interface (GUI), a robust Evolutionary Algorithm optimiser named HAPEA, several design modules, mesh generators and post-processing capabilities in an integrated platform. Population-based algorithms such as EAs are well suited to problems where the search space can be multi-modal, non-convex or discontinuous, with multiple local minima and with noise, and also to problems where we look for multiple solutions via Game Theory, namely a Nash equilibrium point or a Pareto set of non-dominated solutions. The application of the methodology is illustrated on conceptual and detailed multi-criteria and multidisciplinary shape design problems. Results indicate the practicality and robustness of the framework in finding optimal shapes and trade-offs between the disciplinary analyses, and in producing a set of non-dominated solutions along an optimal Pareto front for the designer.
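HAPEA itself is not specified in the abstract, but the Pareto set it reports rests on a standard non-dominance test over candidate designs. A minimal sketch of extracting the non-dominated set from a matrix of objective values (function and argument names, and the minimisation convention, are assumptions):

```python
import numpy as np

def pareto_front(costs):
    """Boolean mask of non-dominated points, assuming all objectives are minimised.

    costs: (n_points, n_objectives) array, e.g. columns [drag, structural_weight].
    A point is dominated if some other point is <= in every objective
    and strictly < in at least one.
    """
    n = costs.shape[0]
    nondominated = np.ones(n, dtype=bool)
    for i in range(n):
        # A point dominates i if it is no worse everywhere and better somewhere.
        dominators = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if np.any(dominators):
            nondominated[i] = False
    return nondominated

# Example: keep only the Pareto-optimal designs from a random population.
population_costs = np.random.default_rng(1).random((50, 2))
front = population_costs[pareto_front(population_costs)]
```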
Abstract:
Technology and Nursing Practice explains and critically engages with the practice implications of technology for nursing. It takes a broad view of technology, covering not only health informatics, but also 'tele-nursing' and the use of equipment in clinical practice.
Abstract:
The popularity of social networking sites (SNSs) among adolescents has grown exponentially, with little accompanying research to understand the influences on adolescent engagement with this technology. The current study tested the validity of an extended theory of planned behaviour (TPB) model, incorporating the additions of group norm and self-esteem influences, to predict frequent SNS use. Adolescents (N = 160) completed measures assessing the standard TPB constructs of attitude, subjective norm, perceived behavioural control (PBC), and intention, as well as group norm and self-esteem. One week later, participants reported their SNS use during the previous week. Support was found for the standard TPB variables of attitude and PBC, as well as group norm, in predicting intentions to use SNSs frequently, with intention, in turn, predicting behaviour. These findings provide an understanding of the factors influencing frequent engagement in what is emerging as a primary tool for adolescent socialisation.
Abstract:
This thesis discusses various aspects of the integrity monitoring of GPS applied to civil aircraft navigation in different phases of flight. These flight phases include en route, terminal, non-precision approach and precision approach. The thesis includes four major topics: the probability problem of the GPS navigation service, risk analysis of aircraft precision approach and landing, theoretical analysis of Receiver Autonomous Integrity Monitoring (RAIM) techniques and RAIM availability, and GPS integrity monitoring at a ground reference station. Particular attention is paid to the mathematical aspects of the GPS integrity monitoring system. The research has been built upon the stringent integrity requirements defined by the civil aviation community, and concentrates on investigating the capability and performance of practical integrity monitoring systems with rigorous mathematical and statistical concepts and approaches. Major contributions of this research are:
• Rigorous integrity and continuity risk analysis for aircraft precision approach. Based on the joint probability density function of the affecting components, the integrity and continuity risks of aircraft precision approach with DGPS were computed. This advances the conventional method of allocating the risk probability.
• A theoretical study of RAIM test power. This is the first time a theoretical study of RAIM test power based on probability and statistical theory has been presented, resulting in a new set of RAIM criteria.
• Development of a GPS integrity monitoring and DGPS quality control system based on a GPS reference station. A prototype GPS integrity monitoring and DGPS correction prediction system has been developed and tested, based on the AUSNAV GPS base station on the roof of the QUT ITE Building.
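The thesis's new RAIM criteria are not given in the abstract; the classical snapshot least-squares-residual RAIM test it builds on, however, is standard. A minimal sketch under textbook assumptions (parameter names, noise model, and the false-alarm probability are illustrative):

```python
import numpy as np
from scipy.stats import chi2

def raim_residual_test(G, y, sigma, p_fa=1e-5):
    """Snapshot RAIM fault detection via the least-squares residual method.

    G:     (n, 4) linearised geometry matrix (n satellites, 4 unknowns).
    y:     (n,) pseudorange residual vector (measured minus predicted).
    sigma: assumed pseudorange noise standard deviation (metres).
    Returns (test_statistic, threshold, fault_detected).
    """
    n, m = G.shape
    # Least-squares navigation solution and post-fit measurement residuals.
    x_hat, *_ = np.linalg.lstsq(G, y, rcond=None)
    r = y - G @ x_hat
    # Fault-free, the normalised sum of squared residuals is chi-square
    # with n - m degrees of freedom; the threshold fixes the false-alarm rate.
    t = float(r @ r) / sigma**2
    threshold = chi2.ppf(1.0 - p_fa, df=n - m)
    return t, threshold, t > threshold
```

The test power analysed in the thesis then concerns the same statistic under a faulted (non-central chi-square) hypothesis, i.e. the probability that a bias of given size pushes t over the threshold.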
Abstract:
The concept of radar was developed for the estimation of the distance (range) and velocity of a target from a receiver. The distance measurement is obtained by measuring the time taken for the transmitted signal to propagate to the target and return to the receiver. The target's velocity is determined by measuring the Doppler-induced frequency shift of the returned signal caused by the rate of change of the time-delay from the target. As researchers further developed conventional radar systems it became apparent that additional information was contained in the backscattered signal and that this information could in fact be used to describe the shape of the target itself. This is due to the fact that a target can be considered to be a collection of individual point scatterers, each of which has its own velocity and time-delay. Delay-Doppler parameter estimation of each of these point scatterers thus corresponds to a mapping of the target's range and cross-range, producing an image of the target. Much research has been done in this area since the early radar imaging work of the 1960s. At present there are two main categories into which radar imaging falls. The first of these is related to the case where the backscattered signal is considered to be deterministic. The second is related to the case where the backscattered signal is of a stochastic nature. In both cases the information which describes the target's scattering function is extracted by the use of the ambiguity function, a function which correlates the backscattered signal in time and frequency with the transmitted signal. In practical situations, it is often necessary to have the transmitter and the receiver of the radar system sited at different locations. The problem in these situations is that a reference signal must then be present in order to calculate the ambiguity function. This causes an additional problem in that detailed phase information about the transmitted signal is then required at the receiver. It is this latter problem which has led to the investigation of radar imaging using time-frequency distributions. As will be shown in this thesis, the phase information about the transmitted signal can be extracted from the backscattered signal using time-frequency distributions. The principal aim of this thesis was the development of, and subsequent discussion of, the theory of radar imaging using time-frequency distributions. Consideration is first given to the case where the target is diffuse, i.e. where the backscattered signal has temporal stationarity and a spatially white power spectral density. The complementary situation is also investigated, i.e. where the target is no longer diffuse, but some degree of correlation exists between the time-frequency points. Computer simulations are presented to demonstrate the concepts and theories developed in the thesis. For the proposed radar system to be practically realisable, both the time-frequency distributions and the associated algorithms developed must be able to be implemented in a timely manner. For this reason an optical architecture is proposed. This architecture is specifically designed to obtain the required time and frequency resolution when using laser radar imaging. The complex light amplitude distributions produced by this architecture have been computer simulated using an optical compiler.
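The ambiguity function invoked here is standard, though sign and normalisation conventions vary between texts. A minimal sketch of a discrete auto-ambiguity surface (delay lags by Doppler bins), assuming uniformly sampled complex baseband data:

```python
import numpy as np

def ambiguity_function(s):
    """Magnitude of the discrete narrowband auto-ambiguity function of s.

    For each delay lag k, the signal is correlated against its conjugated,
    linearly shifted copy, and an FFT across time scans the Doppler axis.
    Returns a (2n - 1, n) array indexed by (lag, Doppler bin).
    """
    s = np.asarray(s, dtype=complex)
    n = len(s)
    A = np.zeros((2 * n - 1, n))
    for i, k in enumerate(range(-(n - 1), n)):
        shifted = np.roll(s, k)
        # Zero the wrapped-around samples so the shift is linear, not circular.
        if k > 0:
            shifted[:k] = 0
        elif k < 0:
            shifted[k:] = 0
        A[i] = np.abs(np.fft.fft(s * np.conj(shifted)))
    return A

# Example: the ambiguity surface of a linear FM (chirp) pulse.
t = np.linspace(0.0, 1.0, 256, endpoint=False)
chirp = np.exp(1j * np.pi * 100.0 * t**2)
surface = ambiguity_function(chirp)
```

The bistatic problem described in the abstract arises because computing this correlation needs the transmitted reference s at the receiver, which is what the time-frequency-distribution approach is introduced to avoid.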
Abstract:
Throughout the twentieth century, increased interest in the training of actors resulted in the emergence of a plethora of acting theories and innovative theatrical movements in Europe, the UK and the USA. The individuals or groups involved with the formulation of these theories and movements developed specific terminologies, or languages of acting, in an attempt to clearly articulate the nature and the practice of acting according to their particular pedagogy or theatrical aesthetic. Now, at the dawn of the twenty-first century, Australia boasts quite a number of schools and university courses professing to train actors. This research aims to discover the language used in actor training on the east coast of Australia today. Using interviews with staff of the National Institute of Dramatic Art, the Victorian College of the Arts, and the Queensland University of Technology as the primary source of data, a constructivist grounded theory has emerged to assess the influence of the last century's theatrical theorists and practitioners on Australian training and to ascertain the possibility of a distinctly Australian language of acting.