780 results for Theories of Liability
Abstract:
Any theory of thinking, teaching or learning rests on an underlying philosophy of knowledge. Mathematics education is situated at the nexus of two fields of inquiry, namely mathematics and education. However, numerous other disciplines interact with these two fields, which compounds the complexity of developing theories that define mathematics education. We first address the issue of clarifying a philosophy of mathematics education before attempting to answer whether theories of mathematics education are constructible. In doing so we draw on the foundational writings of Lincoln and Guba (1994), in which they clearly posit that any discipline within education, in our case mathematics education, needs to clarify for itself the following questions: (1) What is reality, or what is the nature of the world around us? (2) How do we go about knowing the world around us? [the methodological question, which presents possibilities to various disciplines to develop methodological paradigms] and (3) How can we be certain of the “truth” of what we know? [the epistemological question]
Abstract:
Electronic Blocks are a new programming environment, designed specifically for children aged between three and eight years. As such, the design of the Electronic Block environment is firmly based on principles of developmentally appropriate practices in early childhood education. The Electronic Blocks are physical, stackable blocks that include sensor blocks, action blocks and logic blocks. Evaluation of the Electronic Blocks with both preschool and primary school children shows that the blocks' ease of use and power of engagement have created a compelling tool for the introduction of meaningful technology education in an early childhood setting. The key to the effectiveness of the Electronic Blocks lies in an adherence to theories of development and learning throughout the Electronic Blocks design process.
Abstract:
While the justice implications of climate change are well understood by the international climate regime, solutions that meaningfully address climate injustice are still emerging. This article explores how a number of different theories of justice have influenced the development of international climate regime policies and measures. Such analysis is undertaken by examining the theories of remedial justice, environmental justice, energy justice, social justice and international justice. This article demonstrates how each of these theories has influenced the development of international climate policies or measures. No one theory of justice can respond to the multifaceted justice implications that arise as a result of climate change; it is argued that a variety of lenses of justice are useful when examining issues of injustice in the climate context. Articulating the justice implications of climate change by reference to theories of justice assists in clarifying the key issues giving rise to injustice. This article finds that while the regime has made some progress in recognising the injustices associated with climate change, such recognition is piecemeal, and the implementation of many of the policies and measures discussed within this article needs to be either scaled up or extended into more far-reaching policies and measures to overcome climate justice concerns. Overall, it is suggested that climate justice concerns need to be clearly enunciated within key adaptation instruments so as to provide a legal and legitimate basis upon which to leverage action.
Abstract:
This paper continues the conversation from recent articles examining potential remedies available for incorrect decisions by sports officials. In particular, this article focuses on bringing an action against an official in negligence for pure economic loss. Using precedent cases, it determines that such an action would have a low chance of success, as a duty of care would be difficult to establish. Even if that could be overcome, an aggrieved player or team would still face further hurdles at the stages of breach, causation and defences. The article concludes by proposing some options to further reduce the small risk of liability to officials.
Abstract:
A novel test of recent theories of the origin of optical activity has been designed based on the inclusion of certain alkyl 2-methylhexanoates into urea channels.
Abstract:
The problem of an infinite circular sandwich shell subjected to an axisymmetric radial line load is investigated using three-dimensional elasticity theory, the shell core method, and the sandwich shell theory due to Fulton and Schmidt. A comparison of the stresses and displacements with an exact elasticity solution is carried out for various shell parameters in order to bring out clearly the limitations of the sandwich shell theories of Fulton and Schmidt as well as of the shell core solution.
Abstract:
This study examines different ways in which the concept of media pluralism has been theorized and used in contemporary media policy debates. Access to a broad range of different political views and cultural expressions is often regarded as a self-evident value in both theoretical and political debates on media and democracy. Opinions on the meaning and nature of media pluralism as a theoretical, political or empirical concept, however, are many, and it can easily be adjusted to different political purposes. The study aims to analyse the ambiguities surrounding the concept of media pluralism in two ways: by deconstructing its normative roots from the perspective of democratic theory, and by examining its different uses, definitions and underlying rationalities in current European media policy debates. The first part of the study examines the values and assumptions behind the notion of media pluralism in the context of different theories of democracy and the public sphere. The second part then analyses and assesses the deployment of the concept in contemporary European policy debates on media ownership and public service media. Finally, the study critically evaluates various attempts to create empirical indicators for measuring media pluralism and discusses their normative implications and underlying rationalities. The analysis of contemporary policy debates indicates that the notion of media pluralism has been too readily reduced to an empty catchphrase or conflated with consumer choice and market competition. In this narrow technocratic logic, pluralism is often unreflectively associated with quantitative data in a way that leaves unexamined key questions about social and political values, democracy, and citizenship. The basic argument advanced in the study is that media pluralism needs to be rescued from its depoliticized uses and re-imagined more broadly as a normative value that refers to the distribution of communicative power in the public sphere. Instead of something that could simply be measured through the number of media outlets available, the study argues that media pluralism should be understood in terms of its ability to challenge inequalities in communicative power and create a more democratic public sphere.
Abstract:
We have shown that the general theories of metals and semiconductors can be employed to understand the diameter and voltage dependence of the current through metallic and semiconducting carbon nanotubes, respectively. The current through a semiconducting multiwalled carbon nanotube (MWCNT) is associated with the energy gap, which is different for different shells. The contribution of the outermost shell is larger than that of the inner shells. The general theories can also explain the diameter dependence of the maximum current through nanotubes. We have also compared the current-carrying ability of a MWCNT and an array of single-wall carbon nanotubes (SWCNTs) of the same overall diameter, and found that MWCNTs are better suited, and deserve further investigation, for possible applications as interconnects.
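As an illustration of the diameter dependence invoked here (a standard tight-binding estimate, not a result quoted from this abstract), the band gap of a semiconducting nanotube shell scales inversely with its diameter:

\[
E_g \approx \frac{2\,\gamma_0\, a_{\mathrm{C\text{-}C}}}{d} \approx \frac{0.8\ \mathrm{eV}}{d\,[\mathrm{nm}]},
\]

where \(\gamma_0 \approx 2.9\ \mathrm{eV}\) is the carbon-carbon hopping energy and \(a_{\mathrm{C\text{-}C}} \approx 0.142\ \mathrm{nm}\) the bond length; larger-diameter outer shells therefore have smaller gaps, consistent with the outermost shell carrying most of the current.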
Abstract:
In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and the more ambitiously we extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.
We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles, or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the choice of the next test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
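The core of such an adaptive procedure can be sketched as follows (illustrative Python only; expected information gain stands in for the EC2 objective, and the function and variable names are ours, not the thesis's):

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete distribution (in nats)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_information_gain(prior, likelihoods):
    """Expected entropy reduction over hypotheses from running one test.

    likelihoods[h, y] = P(response y | hypothesis h) for this test.
    """
    prior_H = entropy(prior)
    p_y = prior @ likelihoods                      # marginal response probabilities
    posterior_H = 0.0
    for y, py in enumerate(p_y):
        if py > 0:
            post = prior * likelihoods[:, y] / py  # Bayes update given response y
            posterior_H += py * entropy(post)
    return prior_H - posterior_H

def adaptive_design(prior, tests, ask_subject, n_rounds=20):
    """Greedy adaptive experiment: pick the most informative test, observe, update.

    tests: dict test_id -> likelihood matrix of shape (n_hypotheses, n_responses)
    ask_subject: callable test_id -> observed response index
    """
    posterior = prior.copy()
    remaining = dict(tests)
    for _ in range(n_rounds):
        if not remaining:
            break
        # choose the test with the largest expected information gain
        best = max(remaining, key=lambda t: expected_information_gain(posterior, remaining[t]))
        y = ask_subject(best)
        lik = remaining.pop(best)[:, y]
        posterior = posterior * lik
        posterior /= posterior.sum()               # renormalized beliefs over theories
    return posterior
```

Replacing the information-gain score with an EC2-style edge-cutting objective, and adding lazy (accelerated) greedy evaluation, would bring this sketch closer to BROAD as described above.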
We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and also since we do not find any signatures of it in our data.
In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present bias models and hyperbolic discounting; most subjects were classified as generalized hyperbolic discounting types, followed by exponential discounting.
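For reference, standard textbook forms of these discount functions (notation ours; the thesis's parameterizations may differ) are:

\[
D_{\mathrm{exp}}(t) = \delta^{t}, \qquad
D_{\mathrm{hyp}}(t) = \frac{1}{1 + k t}, \qquad
D_{\mathrm{quasi\text{-}hyp}}(t) = \begin{cases} 1, & t = 0,\\ \beta\,\delta^{t}, & t > 0, \end{cases} \qquad
D_{\mathrm{gen\text{-}hyp}}(t) = (1 + \alpha t)^{-\beta/\alpha}.
\]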
In these models, the passage of time is linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.
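As an illustrative construction (not the thesis's actual proof), suppose the decision maker discounts exponentially in subjective time \(\tau(t)\) and subjective time is logarithmically compressed, \(\tau(t) = \tfrac{1}{k}\ln(1 + k t)\); then

\[
D(t) = e^{-\rho\,\tau(t)} = (1 + k t)^{-\rho/k},
\]

which is the generalized hyperbolic form above and reduces to \(1/(1 + k t)\) when \(\rho = k\). Because the effective discount rate declines with calendar time, preferences evaluated at different dates can reverse, producing temporal choice inconsistency.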
We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than its price elasticity alone would explain. Even more importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
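A minimal sketch of how a loss-averse utility can sit inside a discrete choice (logit) model is given below (illustrative Python; the coefficient lam = 2.25 is the classic Tversky-Kahneman loss-aversion estimate, and all names and numbers are ours, not the retailer study's):

```python
import numpy as np

def loss_averse_utility(price, reference_price, beta_price=1.0, lam=2.25):
    """Reference-dependent price utility: prices above the reference are felt
    as losses and weighted more heavily (by lam) than equivalent gains."""
    gain = np.maximum(reference_price - price, 0.0)
    loss = np.maximum(price - reference_price, 0.0)
    return beta_price * gain - lam * beta_price * loss

def choice_probabilities(prices, reference_prices, base_utilities, lam=2.25):
    """Multinomial logit choice probabilities over substitute items."""
    v = base_utilities + loss_averse_utility(prices, reference_prices, lam=lam)
    expv = np.exp(v - v.max())          # subtract max for numerical stability
    return expv / expv.sum()

# Example: item 0 returns from a discount (reference 8) to its regular price 10.
# The regular price now reads as a loss, shifting demand toward substitute item 1.
probs = choice_probabilities(
    prices=np.array([10.0, 10.0]),
    reference_prices=np.array([8.0, 10.0]),
    base_utilities=np.array([0.5, 0.4]),
)
print(probs)
```

Estimating lam from observed choices, and letting it vary with consumer experience, is the kind of exercise the retailer analysis describes; the snippet only shows the direction of the predicted substitution effect.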
In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, could be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.
Abstract:
We investigate the generalized second law of thermodynamics (GSL) in generalized theories of gravity. We examine the evolution with time of the total entropy, including the horizon entropy, the non-equilibrium entropy production, and the entropy of all matter, field and energy components. We derive a universal condition that protects the generalized second law and study its validity in different gravity theories. In Einstein gravity (even in a phantom-dominated universe with a Schwarzschild black hole), Lovelock gravity and braneworld gravity, we show that the condition to keep the GSL can always be satisfied. In f(R) gravity and scalar-tensor gravity, the condition to protect the GSL can also hold because the temperature should be positive, gravity is always attractive, and the effective Newton constant should be approximately constant, satisfying the experimental bounds.
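Schematically (notation ours, following the abstract's description rather than the paper's exact expressions), the GSL requires the total entropy to be non-decreasing:

\[
\frac{d}{dt}\Bigl(S_h + S_{\mathrm{prod}} + \sum_i S_i\Bigr) \;\ge\; 0,
\]

where \(S_h\) is the horizon entropy, \(S_{\mathrm{prod}}\) the non-equilibrium entropy production, and \(S_i\) the entropy of each matter, field and energy component; the universal condition mentioned above is what guarantees this inequality in each gravity theory considered.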
Abstract:
The costs of inter- and intra-regional diversification have been widely discussed in the existing international business literature, but the findings are mixed. Explanations for the mixed findings have important managerial implications, because business managers have to estimate accurately the costs of doing business within and across regions before they make their internationalization decisions. To explain the existing mixed findings, this study differentiates between liabilities of foreignness at the country and regional levels, and explores the joint effects of liability of country foreignness (LCF) and liability of regional foreignness (LRF) on the performance of internationalizing firms. Using data from 167 Canadian firms, we find that LCF may not necessarily be negatively correlated with intra-regional diversification, but LRF is positively correlated with inter-regional diversification. LCF moderates the relationship between LRF and inter-regional diversification, and also mediates the relationship between intra-regional diversification and firm performance. LRF mediates the relationship between inter-regional diversification and firm performance. Missing one or more of these variables may result in different cost estimates. Identification of the relationships between these variables helps to improve the accuracy of estimating the costs of doing business abroad.
Abstract:
We study the local properties of a class of codimension-2 defects of the 6d N = (2, 0) theories of type J = A, D, E labeled by nilpotent orbits of a Lie algebra g, where g is determined by J and the outer-automorphism twist around the defect. This class is a natural generalization of the defects of the six-dimensional (6d) theory of type SU(N) labeled by a Young diagram with N boxes. For any of these defects, we determine its contribution to the dimension of the Higgs branch, to the Coulomb branch operators and their scaling dimensions, to the four-dimensional (4d) central charges a and c, and to the flavor central charge k. © 2013 World Scientific Publishing Company.
Abstract:
The focus of this study is on questioning whether the traditional theories of internationalization are adequate to explain the international expansion of multinationals from emerging countries. To explore this issue, we investigate the internationalization strategies adopted by JBS, a Brazilian multinational in the beef industry. The results show that the company adopted two of the five generic strategies specific to the context of emerging countries suggested by Ramamurti and Singh (2009): global consolidator and vertical integrator. Moreover, the analysis highlights the speed of the company's internationalization process compared with that of traditional multinationals. It is concluded that the main mode of entry enabling the international expansion was acquisition, and that this strategy offers advantages to the company, such as access to strategic resources, rapid growth (possibly overcoming the liability of foreignness), the opportunity to compete globally, and diversification into segments that generate synergies with the company's activities.
Abstract:
This paper presents a review of the financial economics literature and offers a comprehensive discussion and systematisation of the determinants of financial capital use. In line with the modern finance literature, it is acknowledged here that real and financial capital decisions are interdependent. While the fundamental role of the (unconstrained) demand for real capital in the demand for finance is acknowledged, the deliverable focuses on three complementary categories of determinants of financial capital use: (i) capital market imperfections; (ii) factors mitigating these imperfections or their impacts; and (iii) firm- and sector-related factors, which alter the severity of financial constraints and their effects. To address the question of the optimal choice of financial instruments, theories of firm capital structure are reviewed. The deliverable concludes with theory-derived implications for the financing of agricultural and non-agricultural rural businesses.