861 results for Potential theory (Mathematics).
Abstract:
One of the central themes of András Bródy's research was the study of economic motion. In this paper we briefly review and summarize Bródy's theory. His multi-sector description of production also frames his price theory (theory of value and measurement). Within this framework the complex fluctuations of economic motion can be analysed on a technological basis. In Bródy's approach the economic cycle is explained not by external shocks but by the internal proportions and interrelationships of the production system. The structure of production is shaped equally by prices and volumes; neither is a privileged or dominant factor, and the two emerge in the dual relationship between them. The economy's equations of motion are given by technological balance relations, together with the use of the products redistributed (reproduced) in the economy through market exchange and the change in tied-up assets. The equations of motion so defined describe the natural motion of the economy as cyclical movement. Changes in technology or in value relations (shocks) are reflected in changes in this cyclical motion. In Bródy's work, many of the characteristic economic cycles known from history thus receive a technological foundation. / === / Economic motion and dynamics are at the heart of András Bródy's creative output. This paper attempts a bird's-eye view of his theory of economic cycles. Bródy's multi-sector modelling of production has provided a framework for price theory (the theory of value and measurement). His theory of economic motion with cyclical characteristics is technology driven. It argues that the complex web of economic cycles is determined by the proportions and interrelationships of the system of production, not by arbitrary external shocks. The structure's behaviour is driven by prices and proportions, with the duality of prices and proportions as a dominant feature. These are features in common with the Leontief models, which Bródy extended to economic cycles. Bródy saw economic cycles as natural motions of economic systems with accumulated assets (time lags) and market exchange of goods (demand and supply adjustment). Changes in technology or valuations (shocks) are reflected in changing patterns of motion. His model of the economy is a fine instrument that enabled him to show how the technological parameters of the system determine the frequency and other characteristics of various economic cycles identified in economic history.
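For readers unfamiliar with the framework invoked above, here is a minimal sketch of the closed dynamic Leontief system and its price dual, in the common textbook form that Bródy's cycle models build on (the notation is ours; the abstract itself contains no formulas). Let A be the matrix of current input coefficients, B the matrix of capital (stock) coefficients, x(t) the output (volume) vector and p(t) the price (row) vector:

\[
  x(t) = A\,x(t) + B\,\dot{x}(t), \qquad p(t) = p(t)\,A - \dot{p}(t)\,B .
\]

With exponential trial solutions x(t) = e^{\lambda t} x_0 and p(t) = e^{-\lambda t} p_0, both equations reduce to the dual eigenproblems (A + \lambda B)\,x_0 = x_0 and p_0\,(A + \lambda B) = p_0; complex values of \lambda correspond to oscillatory components of the motion, which is how technological parameters fix the frequencies of the cycles.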
Abstract:
Many problems in economics have been solved with the help of analogous models from physics. Opinions among economists differ sharply over how far economic models can be reduced to the results of physics or other natural sciences. Some argue that this is precisely why today's mainstream economic theory has turned into applied mathematics, able to examine economic questions only in abstraction from their social-science aspects. Others, among them the author of this study, hold that those economic problems where measurement is possible can be modelled well with the technical arsenal of the natural sciences, whereas problems where measurement is not possible, as is typical of social-science questions, will require far more complex techniques. The aim of this study is to outline the newest stochastic mathematical relations of physics, namely irreversible dynamics, relativity theory and quantum mechanics, from which economists may draw when formulating and solving particular problems. For example, a precise interpretation of time operators could bring a major turn in macroeconomic theory, and the static equilibrium reference points used so far could be replaced by dynamic, time-varying stochastic equilibrium reference functions, placing many social-science and especially non-equilibrium economic questions in a radically new light. Paul A. Samuelson (1947) already adapted the concepts and definitions of thermodynamics and biological evolution to economics, but he did not touch on the newest results of quantum mechanics, such as time operators. This article summarizes the newest mathematical relations of physics, chemistry and biology that may be useful for the more complex formulation of economic models. ___________________ The aim of this paper is to outline the newest results of physics, i.e., the stochastic mathematical relations of relativity theory and quantum mechanics as well as irreversible dynamics, which can be applied to some economic problems. For example, the correct interpretation of time operators used in macroeconomic theories may provide a serious improvement in the approach to reality. Stochastic dynamic equilibrium reference functions will take over the role of the current static equilibrium reference points, which may also reveal some non-equilibrium questions of macroeconomics. The concepts and definitions of thermodynamics and biological evolution were adopted in economics by Paul A. Samuelson, but he did not address the newest results of quantum mechanics, e.g., time operators; here we do. In addition, following Samuelson, we show that the von Neumann growth model cannot be explained as a peculiar extension of thermodynamic irreversibility.
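The closing claim above concerns the von Neumann growth model; for reference, a minimal statement of that model in its classical form (standard notation, not taken from the article). With non-negative input and output matrices A and B, a non-negative intensity vector x, a non-negative price vector p, an expansion factor \alpha and an interest factor \beta, an equilibrium requires

\[
  x B \ge \alpha\, x A, \qquad B p \le \beta\, A p, \qquad x B p > 0,
\]

with \alpha = \beta at equilibrium: every good can be produced at the uniform expansion rate, no activity earns more than the uniform interest rate, and the activities actually operated break even. It is this saddle-point structure that the article argues cannot be read as a mere extension of thermodynamic irreversibility.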
Abstract:
Savings and investments in the American money market by emerging countries, primarily China, financed the excessive consumption of the United States in the early 2000s, which indirectly led to a global financial crisis. The crisis started in the real estate mortgage market. Balance-disrupting processes began on the American financial market that contradicted all previously known equilibrium theories of every school of economics. Economics has yet to come up with models or empirical theories for this new disequilibrium, which is why the outbreak of the crisis could not be prevented or at least predicted. The question is to what extent existing market theories, calculation methods and the latest financial products can be held responsible for the new situation. This paper studies the influence of the efficient market and modern portfolio theories, as well as Li's copula function, on the American investment market. Naturally, the issues of moral hazard and greed, credit ratings and shareholder control, limited liability and market regulation cannot be ignored. In summary, the author outlines the potential alternative measures that could be applied to prevent a new crisis, defines new directions for economic research and draws conclusions for Hungarian economic policy.
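For context on "Li's copula function" mentioned above: the Gaussian copula that Li (2000) used to couple default times has the standard form (a sketch of the well-known formula, not of this paper's own analysis)

\[
  C_\rho(u, v) = \Phi_2\!\left(\Phi^{-1}(u),\, \Phi^{-1}(v);\, \rho\right),
\]

where \Phi is the standard normal distribution function, \Phi_2(\cdot,\cdot;\rho) is the bivariate normal distribution function with correlation \rho, and u = F_A(t_A), v = F_B(t_B) are the marginal distributions of the two default times. Compressing all dependence into the single parameter \rho is what made the formula easy to calibrate, and it is one reason the paper asks how much such products contributed to the mispricing of joint risks.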
Abstract:
The aim of this paper is to survey game-theoretic modelling of the behaviour of global players in mitigation of and adaptation to climate change. Three main classes of games are applied to specific aspects of temperature rise: behavioural games, the common-pool resource (CPR) problem and negotiation games. Game-theoretic instruments are useful for analysing strategies under uncertainty, such as the occurrence and impacts of climate change. To analyse the international players' relations, actions, attitudes toward carbon emissions, negotiating power and motives, several games are applied to climate change in this paper. Proposed solutions to the externality problem are also surveyed.
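As an illustration of the strategic structure such surveys analyse, below is a minimal two-country abatement game in Python; the payoff numbers are hypothetical, chosen only to reproduce the familiar prisoner's-dilemma pattern, and are not taken from the paper.

# Two-country abatement game (hypothetical payoffs giving a prisoner's-dilemma structure).
# Strategies: "abate" or "pollute"; each country prefers the other to abate while it pollutes.
PAYOFFS = {  # (row_action, col_action) -> (row_payoff, col_payoff)
    ("abate", "abate"): (3, 3),
    ("abate", "pollute"): (0, 4),
    ("pollute", "abate"): (4, 0),
    ("pollute", "pollute"): (1, 1),
}
ACTIONS = ("abate", "pollute")

def best_response(opponent_action, player):
    """Return the action maximizing this player's payoff against a fixed opponent action."""
    if player == 0:  # row player
        return max(ACTIONS, key=lambda a: PAYOFFS[(a, opponent_action)][0])
    return max(ACTIONS, key=lambda a: PAYOFFS[(opponent_action, a)][1])

def pure_nash_equilibria():
    """Enumerate pure-strategy Nash equilibria by checking mutual best responses."""
    return [(r, c) for r in ACTIONS for c in ACTIONS
            if r == best_response(c, 0) and c == best_response(r, 1)]

print(pure_nash_equilibria())  # [('pollute', 'pollute')]

The unique equilibrium is mutual pollution even though mutual abatement pays both players more; this free-rider outcome is precisely what the CPR and negotiation games in the survey try to overcome.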
Abstract:
The study provides an overview of the possibilities for applying game theory to climate change. The characteristics of the games are adapted to the topics of climate and carbon. The importance of uncertainty, probability, the marginal value of adaptation, common-pool resources and related concepts is examined in the context of international relations and the challenge of global warming.
Abstract:
This study examined the effectiveness of intelligent tutoring system instruction, grounded in John Anderson's ACT theory of cognition, on the achievement and attitude of developmental mathematics students in the community college setting. The quasi-experimental research used a pretest-posttest control group design. The dependent variables were problem-solving achievement, overall achievement, and attitude towards mathematics. The independent variable was instructional method. Four intact classes and two instructors participated in the study for one semester. Two classes (n = 35) served as experimental groups; they received six lessons with real-world problems using intelligent tutoring system instruction. The other two classes (n = 24) served as control groups; they received six lessons with real-world problems using traditional instruction, including graphing calculator support. It was hypothesized that students taught problem solving using the intelligent tutoring system would achieve more on the dependent variables than students taught without it. Posttest mean scores for one teacher produced a significant difference in overall achievement for the experimental group. The same teacher had higher, though not significantly higher, means for the experimental group in problem-solving achievement. The study did not indicate a significant difference in attitude mean scores. It was concluded that using an intelligent tutoring system in problem-solving instruction may affect students' overall mathematics achievement and problem-solving achievement. Other factors must be considered, such as the teacher's classroom experience, the teacher's experience with the intelligent tutoring system, trained technical support, and trained student support, as well as student learning styles, motivation, and overall mathematics ability.
Abstract:
The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amidst a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have on schools of nursing and graduates, this research study tests the effectiveness of a predictor model. The model is based on the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students found to be at risk of NCLEX-RN failure can be identified and remediated before taking the actual licensure examination. To date no theory-based predictor model has been identified that predicts success on the NCLEX-RN. The model was tested using prerequisite course grades, nursing course grades and scores on standardized examinations for the 2003 associate degree nursing graduates at an urban community college (N = 235). Success was determined through the reporting of a pass on the NCLEX-RN examination by the Florida Board of Nursing. Point-biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power. Correlations among variables were significant, and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were found to be significant to NCLEX-RN success, the overall model was most predictive at the conclusion of the academic program of study. The inclusion of the RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
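Below is a minimal sketch of the kind of logistic-regression predictor the study describes, written in Python with statsmodels; the file name and column names are placeholders, not the study's data, and the actual predictors are the prerequisite grades, nursing course grades and standardized-examination scores listed above.

# Hedged sketch only: "graduates.csv" and the column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("graduates.csv")                                   # hypothetical data file
predictors = ["prereq_gpa", "nursing_gpa", "rn_assessment_score"]   # hypothetical column names
X = sm.add_constant(df[predictors])                                 # add intercept term
y = df["nclex_pass"]                                                # 1 = passed NCLEX-RN, 0 = failed

model = sm.Logit(y, X).fit()
print(model.summary())        # per-predictor coefficients and significance
print(model.prsquared)        # pseudo R-squared, analogous to the variance-explained figure reported

predicted = (model.predict(X) >= 0.5).astype(int)                   # classify at a 0.5 threshold
print("accuracy:", (predicted == y).mean())

Point-biserial correlations between each continuous predictor and the pass/fail outcome, as used in the study, can be obtained with scipy.stats.pointbiserialr.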
Abstract:
English has been taught as a core and compulsory subject in China for decades. Recently, the demand for English in China has increased dramatically; China now has the world's largest English-learning population. The traditional English-teaching method cannot continue to be the only approach, because it focuses merely on reading, grammar and translation, which cannot meet English learners' and users' needs (i.e., communicative competence and skills in speaking and writing). This study was conducted to investigate whether the Picture-Word Inductive Model (PWIM), a pedagogical method using pictures and inductive thinking, would benefit English learners in China in terms of potentially higher output in speaking and writing. Using Cognitive Load Theory (CLT), specifically its redundancy effect, I investigated whether processing words and a picture concurrently would present a cognitive overload for English learners in China. I conducted a mixed-methods research study. A quasi-experiment (pretest, intervention for seven weeks, and posttest) was conducted with 234 students in four groups in Lianyungang, China (58 fourth graders and 57 seventh graders as an experimental group taught with PWIM, and 59 fourth graders and 60 seventh graders as a control group taught with the traditional method). No significant difference in the effects of PWIM on vocabulary acquisition was found across grade levels. Observations, questionnaires with open-ended questions, and interviews were deployed to answer the three remaining research questions. A few students felt cognitively overloaded when they encountered too many writing samples, too many new words at one time, repeated words, mismatches between words and pictures, and so on. Many students listed and exemplified numerous strengths of PWIM, while a few mentioned weaknesses. The students expressed the idea that PWIM had a positive effect on their English learning. As integrated inferences, qualitative findings were used to explain the quantitative result that there were no significant differences in the effects of PWIM between the experimental and control groups at either grade level, in terms of four contextual aspects: time constraints on PWIM implementation, teachers' resistance, how PWIM was used, and PWIM being implemented in classrooms of over 55 students.
Abstract:
The purpose of this study was to examine the effects of the use of technology on students' mathematics achievement, particularly their results on the Florida Comprehensive Assessment Test (FCAT) in mathematics. Eleven schools within the Miami-Dade County Public School System participated in a pilot program on the use of Geometer's Sketchpad (GSP). Three of these schools were randomly selected for this study. Each school sent a teacher to a summer in-service training program on how to use GSP to teach geometry. In each school, the GSP class and a traditional geometry class taught by the same teacher were the study participants. Students' FCAT mathematics results were examined to determine whether GSP produced any effects. Students' scores were compared based on assignment to the control or experimental group as well as on gender and SES, with SES measured by whether students qualified for free lunch. The findings revealed a significant difference in the FCAT mathematics scores of students taught geometry using GSP compared to those taught with the traditional method. No significant differences existed in FCAT mathematics scores based on SES, and likewise none based on gender. In conclusion, the use of technology (particularly GSP) is likely to boost students' FCAT mathematics test scores, and the findings suggest that GSP may help close known gender- and SES-related achievement gaps. The results of this study support policy changes in the way geometry is taught to 10th-grade students in Florida's public schools.
Abstract:
Lettres à une Princesse d'Allemagne sur divers sujets de physique et de philosophie (Letters to a German Princess on various subjects in physics and philosophy) is the work taken as the object of study of this thesis. It is a literary success written in the eighteenth century by the Swiss mathematician and physicist Leonhard Paul Euler (1707-1783), who accepted a request from the Prussian king Frederick II the Great (1712-1786) to guide the intellectual education of his niece, the young princess of Anhalt-Dessau (1745-1808). The method of teaching and learning through letters chosen for the education of the young German princess resulted in a collection of 234 letters in which Euler theorizes about music, philosophy, mechanics, optics, astronomy, theology and ethics, among other subjects. The research seeks to point out the mathematical content of this reference work, based on the exploration and adaptation of original historical sources as a vehicle for developing activities for teaching mathematics in basic education, in accordance with the National Curriculum Parameters for Mathematics (NCP). The general objective is to point out the limits and the didactic potential of Lettres à une Princesse d'Allemagne sur divers sujets de physique et de philosophie as a source of support for basic-education teachers in developing activities for teaching mathematics. The discussions raised point to concrete possibilities of linking the mathematical content extracted from the body of the work with current teaching methodologies, by rethinking the use of letters in line with Freire's pedagogical perspective on correspondence and, especially, by using the new communication channels of the twenty-first century, both aimed at dialogue and closeness between those who write and those who read.
Abstract:
Restorative Justice (RJ), a distinctive philosophical approach that seeks to replace punitive, managerial structures of schooling with ones that emphasize the building and repairing of relationships (Hopkins 2004), has been embraced in the past two decades by a variety of school systems worldwide in an effort to build safe school communities. Early studies indicate that RJ holds significant promise; however, proponents in the field point out that theoretical and evidence-based research is falling behind practice. They call for further research to deepen the current understanding of RJ, to support its sustainability and transformative potential, and to allow it to move from the margins to the mainstream of schooling (Braithwaite 2006; Morrison & Ahmed 2006; Sherman & Strang 2007).
Abstract:
This paper reports on an investigation with first-year undergraduate Product Design and Management students within a School of Engineering. At the time of this investigation the students had studied fundamental engineering science and mathematics for one semester. They were given an open-ended, ill-formed problem that involved designing a simple bridge to cross a river. They were given a talk on problem solving and a rubric to follow, if they chose to do so, but no formulae or procedures needed to resolve the problem. In theory, they possessed the knowledge to ask the right questions and make appropriate assumptions; in practice, it turned out they were unable to link their a priori knowledge to this problem. They were able to solve simple beam problems when given closed questions, but the results show they were unable to visualise a simple bridge as an augmented beam problem, ask pertinent questions and hence formulate appropriate assumptions in order to offer resolutions.
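For reference, the "simple beam problem" that the bridge task augments is the textbook simply supported beam of span L under a uniformly distributed load w per unit length (standard results, not material from the paper):

\[
  R_A = R_B = \frac{wL}{2}, \qquad M_{\max} = \frac{wL^2}{8} \ \text{(at midspan)}, \qquad \delta_{\max} = \frac{5wL^4}{384\,EI},
\]

with bending stress \sigma = Mc/I for a cross-section of second moment of area I and extreme-fibre distance c. Recognising the bridge deck as such a beam, and then asking what values of w, L and the section properties are reasonable, is the chain of assumptions the students were unable to construct.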
Abstract:
In this work, we introduce the periodic nonlinear Fourier transform (PNFT) method as an alternative and efficacious tool for compensation of nonlinear transmission effects in optical fiber links. In Part I, we introduce the algorithmic platform of the technique, describing in detail the direct and inverse PNFT operations, also known as the inverse scattering transform for the periodic (in the time variable) nonlinear Schrödinger equation (NLSE). We pay special attention to explaining the potential advantages of PNFT-based processing over the previously studied nonlinear Fourier transform (NFT) based methods. Further, we elucidate the issue of numerical PNFT computation: we compare the performance of four known numerical methods applicable to the calculation of the nonlinear spectral data (the direct PNFT), in particular taking the main spectrum (utilized further in Part II for modulation and transmission) associated with some simple example waveforms as the quality indicator for each method. We show that the Ablowitz-Ladik discretization approach for the direct PNFT provides the best performance in terms of accuracy and computational time.
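For readers who want the objects named above spelled out, the standard definitions follow (in our notation, not necessarily that of the paper). The normalized focusing NLSE with a field periodic in time,

\[
  i\,\frac{\partial q}{\partial z} + \frac{1}{2}\,\frac{\partial^2 q}{\partial t^2} + |q|^2 q = 0, \qquad q(t+T, z) = q(t, z),
\]

has the associated Zakharov-Shabat scattering problem

\[
  \frac{\partial}{\partial t}\begin{pmatrix} v_1 \\ v_2 \end{pmatrix}
  = \begin{pmatrix} -i\lambda & q \\ -q^* & i\lambda \end{pmatrix}
    \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}.
\]

If M(\lambda) denotes the monodromy matrix that transports solutions of this system over one period T, the main spectrum consists of the values of \lambda at which the Floquet discriminant \Delta(\lambda) = \tfrac{1}{2}\,\mathrm{tr}\, M(\lambda) equals \pm 1; these are the points that the four numerical methods compared in the paper compute, with the Ablowitz-Ladik scheme approximating M(\lambda) as a product of per-sample transfer matrices.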
Abstract:
The semantic model developed in this research was a response to the difficulty a group of mathematics learners had with conventional mathematical language and their interpretation of mathematical constructs. In order to develop the model, ideas from linguistics, psycholinguistics, cognitive psychology, formal languages and natural language processing were investigated. This investigation led to the identification of four main processes: the parsing process, syntactic processing, semantic processing and conceptual processing. The model showed the complex interdependency between these four processes and provided a theoretical framework in which the behaviour of the mathematics learner could be analysed. The model was then extended to incorporate the use of technological artefacts in the learning process; to facilitate this aspect of the research, the theory of instrumentation was incorporated into the semantic model. The conclusion of this research was that although the cognitive processes were interdependent, they could develop at different rates until mastery of a topic was achieved. It also found that the introduction of a technological artefact into the learning environment introduced another layer of complexity, both in terms of the learning process and in terms of the underlying relationship between the four cognitive processes.