837 results for Theory of Business
Abstract:
This paper introduces a new model of exchange: networks, rather than markets, of buyers and sellers. It begins with the empirically motivated premise that a buyer and seller must have a relationship, a "link," to exchange goods. Networks - buyers, sellers, and the pattern of links connecting them - are common exchange environments. This paper develops a methodology to study network structures and explains why agents may form networks. In a model that captures characteristics of a variety of industries, the paper shows that buyers and sellers, acting strategically in their own self-interest, can form the network structures that maximize overall welfare.
A mathematical theory of stochastic microlensing. II. Random images, shear, and the Kac-Rice formula
Abstract:
Continuing our development of a mathematical theory of stochastic microlensing, we study the random shear and expected number of random lensed images of different types. In particular, we characterize the first three leading terms in the asymptotic expression of the joint probability density function (pdf) of the random shear tensor due to point masses in the limit of an infinite number of stars. Up to this order, the pdf depends on the magnitude of the shear tensor, the optical depth, and the mean number of stars through a combination of radial position and the star's mass. As a consequence, the pdf's of the shear components are seen to converge, in the limit of an infinite number of stars, to shifted Cauchy distributions, which shows that the shear components have heavy tails in that limit. The asymptotic pdf of the shear magnitude in the limit of an infinite number of stars is also presented. All the results on the random microlensing shear are given for a general point in the lens plane. Extending to the general random distributions (not necessarily uniform) of the lenses, we employ the Kac-Rice formula and Morse theory to deduce general formulas for the expected total number of images and the expected number of saddle images. We further generalize these results by considering random sources defined on a countable compact covering of the light source plane. This is done to introduce the notion of global expected number of positive parity images due to a general lensing map. Applying the result to microlensing, we calculate the asymptotic global expected number of minimum images in the limit of an infinite number of stars, where the stars are uniformly distributed. This global expectation is bounded, while the global expected number of images and the global expected number of saddle images diverge as the order of the number of stars. © 2009 American Institute of Physics.
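As a side illustration of the heavy-tailed behaviour described in this abstract (not taken from the paper: the lensing units, sign conventions, and all numbers below are simplifying assumptions), here is a minimal Monte Carlo sketch in Python of one shear component produced by uniformly scattered point masses:

```python
import numpy as np

rng = np.random.default_rng(0)

def shear_at_centre(n_stars, radius, mass=1.0):
    """Total gamma_1 shear component at the field centre produced by
    point masses placed uniformly at random in a disk (simplified units)."""
    r2 = radius**2 * rng.uniform(size=n_stars)          # uniform in area
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n_stars)
    x, y = np.sqrt(r2) * np.cos(phi), np.sqrt(r2) * np.sin(phi)
    d2 = x**2 + y**2
    return np.sum(-mass * (x**2 - y**2) / d2**2)

# Many realisations of a random star field with fixed surface density.
samples = np.array([shear_at_centre(n_stars=500, radius=30.0)
                    for _ in range(20000)])

# A heavy, Cauchy-like tail shows up as P(|gamma_1| > t) decaying roughly
# like 1/t, dominated by the single nearest star.
for t in (0.5, 1.0, 2.0, 4.0):
    print(f"P(|gamma_1| > {t}) ~ {(np.abs(samples) > t).mean():.4f}")
```

The roughly 1/t tail of the sampled component is the numerical counterpart of the shifted-Cauchy limit mentioned in the abstract.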
Abstract:
Hannah Arendt's theory of political judgment has been an ongoing perplexity among scholars who have written on her work. As a result, her theory of judgment is often treated as a suggestive but unfinished aspect of her thought. Drawing on a wider array of sources than is commonly utilized, I argue that her theory of political judgment was in fact the heart of her work. Arendt's project, in other words, centered on reestablishing the possibility of political judgment in a modern world that has progressively undermined it. In the dissertation, I systematically develop an account of Arendt's fundamentally political and non-sovereign notion of judgment. We discover that individual judgment is not arbitrary, and that even in the complex circumstances of the modern world there are valid structures of judgment that can be developed and dependably relied upon. The result is a highly compelling theory of practical reason: it provides orientation for human agency without robbing it of its free and spontaneous character; it shows how we can improve and cultivate our political judgment; and it points the way toward the profoundly intersubjective form of political philosophy Arendt ultimately hoped to develop.
Abstract:
We present a theory of hypoellipticity and unique ergodicity for semilinear parabolic stochastic PDEs with "polynomial" nonlinearities and additive noise, considered as abstract evolution equations in some Hilbert space. It is shown that if Hörmander's bracket condition holds at every point of this Hilbert space, then a lower bound on the Malliavin covariance operator M_t can be obtained. Informally, this bound can be read as "Fix any finite-dimensional projection Π on a subspace of sufficiently regular functions. Then the eigenfunctions of M_t with small eigenvalues have only a very small component in the image of Π." We also show how to use a priori bounds on the solutions to the equation to obtain good control on the dependency of the bounds on the Malliavin matrix on the initial condition. These bounds are sufficient in many cases to obtain the asymptotic strong Feller property introduced in [HM06]. One of the main novel technical tools is an almost sure bound from below on the size of "Wiener polynomials," where the coefficients are possibly non-adapted stochastic processes satisfying a Lipschitz condition. By exploiting the polynomial structure of the equations, this result can be used to replace Norris' lemma, which is unavailable in the present context. We conclude by showing that the two-dimensional stochastic Navier-Stokes equations and a large class of reaction-diffusion equations fit the framework of our theory.
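One way to render the informal bound quoted in this abstract as a formula (our paraphrase under simplifying assumptions, not the paper's precise statement): for the Malliavin covariance operator M_t, a fixed finite-dimensional projection Π on a subspace of sufficiently regular functions, and unit vectors φ,

\[
\langle \varphi, \mathcal{M}_t \,\varphi \rangle \le \varepsilon
\quad\Longrightarrow\quad
\|\Pi \varphi\| \le \delta(\varepsilon),
\qquad \delta(\varepsilon) \to 0 \ \text{as}\ \varepsilon \to 0,
\]

that is, eigenfunctions of M_t with small eigenvalues are nearly orthogonal to the image of Π.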
Abstract:
An event memory is a mental construction of a scene recalled as a single occurrence. It therefore requires the hippocampus and ventral visual stream needed for all scene construction. The construction need not come with a sense of reliving or be made by a participant in the event, and it can be a summary of occurrences from more than one encoding. The mental construction, or physical rendering, of any scene must be done from a specific location and time; this introduces a "self" located in space and time, which is a necessary, but need not be a sufficient, condition for a sense of reliving. We base our theory on scene construction rather than reliving because this allows the integration of many literatures and because there is more accumulated knowledge about scene construction's phenomenology, behavior, and neural basis. Event memory differs from episodic memory in that it does not conflate the independent dimensions of whether or not a memory is relived, is about the self, is recalled voluntarily, or is based on a single encoding with whether it is recalled as a single occurrence of a scene. Thus, we argue that event memory provides a clearer contrast to semantic memory, which also can be about the self, be recalled voluntarily, and be from a unique encoding; allows for a more comprehensive dimensional account of the structure of explicit memory; and better accounts for laboratory and real-world behavioral and neural results, including those from neuropsychology and neuroimaging, than does episodic memory.
Abstract:
Lennart Åqvist (1992) proposed a logical theory of legal evidence based on the Bolding-Ekelöf degrees of evidential strength. This paper reformulates Åqvist's model in terms of the probabilistic version of the kappa calculus. Proving its acceptability in the legal context is beyond the present scope, but the epistemological debate about Bayesian law is clearly relevant. While the present model is a possible link to that line of inquiry, we offer some considerations about the broader picture of the potential of AI & Law in the evidentiary context. Whereas probabilistic reasoning is well researched in AI, calculations about the threshold of persuasion in litigation, whatever their value, are just the tip of the iceberg. The bulk of the modeling desiderata is arguably elsewhere, if one is to ideally make the most of AI's distinctive contribution as envisaged for legal evidence research.
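For illustration only (this is not Åqvist's kappa-calculus model, and the numbers and the "preponderance" threshold below are hypothetical), a minimal Python sketch of the kind of threshold-of-persuasion calculation the abstract alludes to, using simple Bayesian odds updating:

```python
def posterior_probability(prior, likelihood_ratios):
    """Update prior odds with one likelihood ratio per piece of evidence
    and return the resulting posterior probability of the claim."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical case: weak prior, three items of evidence of varying strength.
p = posterior_probability(prior=0.10, likelihood_ratios=[4.0, 2.5, 1.5])
PREPONDERANCE = 0.5   # illustrative civil standard of proof
print(f"posterior = {p:.2f}, meets threshold: {p > PREPONDERANCE}")
```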
Abstract:
Tony Mann provides a review of the book: Theory of Games and Economic Behavior, John von Neumann and Oskar Morgenstern, Princeton University Press, 1944.
Abstract:
A cross-domain workflow application may be constructed using a standard reference model such as the one by the Workflow Management Coalition (WfMC) [7], but the requirements for this type of application are inherently different from one organization to another. The existing models, and the systems built around them, meet some but not all of the requirements of the organizations involved in a collaborative process. Furthermore, the requirements change over time. This makes the applications difficult to develop and distribute. Service Oriented Architecture (SOA) based approaches such as BPEL (Business Process Execution Language) intend to provide a solution but fail to address the problems sufficiently, especially in situations where the expectations and level of skills of the users (e.g. the participants of the processes) in different organisations are likely to differ. In this paper, we discuss a design pattern that provides a novel approach towards a solution. In the solution, business users can design the applications at a high level of abstraction (the use cases and user interactions); the designs are documented and used, together with the data and events captured later that represent the user interactions with the systems, to feed an intermediate component local to the users, the IFM (InterFace Mapper), which bridges the gaps between the users and the systems. We discuss the main issues faced in the design and prototyping. The approach alleviates the need for re-programming with the APIs to any back-end service, thus easing the development and distribution of the applications.
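A minimal, hypothetical sketch of the role such an InterFace Mapper plays (the paper does not publish code; the Python below and all names in it are invented for illustration): recorded user-interaction events are dispatched to back-end service calls through declarative bindings, so the user-facing layer never programs directly against service APIs.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class UIEvent:
    use_case: str          # e.g. "approve_invoice" (hypothetical use case)
    payload: dict

class InterFaceMapper:
    """Bridges user interactions and back-end services (illustrative only)."""
    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[dict], None]] = {}

    def bind(self, use_case: str, service_call: Callable[[dict], None]) -> None:
        # Declarative wiring: one business-level use case -> one service call.
        self._bindings[use_case] = service_call

    def dispatch(self, event: UIEvent) -> None:
        handler = self._bindings.get(event.use_case)
        if handler is None:
            raise ValueError(f"no service bound for use case {event.use_case!r}")
        handler(event.payload)

# Hypothetical wiring and a captured user interaction.
ifm = InterFaceMapper()
ifm.bind("approve_invoice", lambda data: print("calling approval service with", data))
ifm.dispatch(UIEvent("approve_invoice", {"invoice_id": 42}))
```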
Abstract:
Economic analysis of technology treats it as exogenously given, even though it is determined endogenously. This paper examines that conceptual conflict and outlines an alternative conceptual framework. The framework uses a 'General Vertical Division of Labour' into conceptual and executive parts to facilitate a coherent political-economic explanation of technological change. The paper suggests that we may acquire, rather than impose, an understanding of technological change. It also suggests that we may re-define and reassess the efficiency of technological change through the values inculcated into it.
Abstract:
This paper explores the changing role of contemporary grandparents, many of whom demonstrate a willingness and ability to take on parental responsibilities for their grandchildren, and who may face challenges and opportunities in difficult times. Three main forms of grandparenting are identified in the literature: grandparents who have primary responsibility and are raising their grandchildren as their main carers, perhaps in response to crisis situations; those who live in extended families and participate in care; and those who provide day care while the child's parents work. The latter has increased because of the increasing frequency of divorce, single parenting and the lack of available or subsidised child care in the United Kingdom. When grandparents step into a troubled situation and attempt to offer stability and security for their grandchildren, they may have to manage the combined responsibilities of family caregivers and parental figures. Grandparenthood is a tenuous role, lacking clear agreement on behaviour norms. In the culture of advice and parenting support, while care must be taken not to undermine parenting skills or make judgements about the ability to cope with the demands of childcare, an exploration of the impact on grandparents and children must be undertaken. Due to the complex web of interrelated factors, the process and outcomes of caregiving by grandparents are not well known in the literature. It is therefore proposed that it is timely for research to be undertaken to explore and develop a theory of grandparenthood.
Abstract:
Purpose: This paper seeks to investigate the factors influencing the business performance of estate agency in England and Wales. Design/methodology/approach: The paper investigates the effect of the housing market, company size and pricing policy on business performance in the estate agency sector in England and Wales. The analysis uses data from the Woolwich Cost of Moving Survey (a survey of transaction costs sponsored by the Woolwich/Barclays Bank) from 2003 to 2005 to test the hypothesis that the business performance of estate agency is affected by industry characteristics and firm factors. Findings: The empirical analysis indicates that the business performance of estate agency is subject to market environment volatility, such as market uncertainty, housing market liquidity and house price changes. Firm factors such as firm size and the level of agency fee have no explanatory power for business performance. The level of agency fee is positively associated with firm size, market environment and liquidity. Research limitations/implications: The research is limited to the data received and is based on a research project on transaction costs designed prior to this analysis. Originality/value: There is little other research that investigates the factors determining the business performance of estate agency using consecutive data over three years across England and Wales. The findings are useful for practitioners and/or managers to allocate resources and adjust their business strategy to enhance business performance in the estate agency sector.
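A minimal sketch of the kind of cross-sectional regression such an analysis typically relies on (not the paper's actual model or data: the variables, coefficients, and synthetic data below are constructed purely to show the mechanics):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 120   # hypothetical firm-year observations

# Hypothetical regressors: market liquidity, house-price change, firm size, agency fee.
liquidity   = rng.normal(size=n)
price_chg   = rng.normal(size=n)
firm_size   = rng.normal(size=n)
agency_fee  = rng.normal(size=n)
# Synthetic outcome, generated only to illustrate estimation, not real findings.
performance = 0.6 * liquidity + 0.4 * price_chg + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), liquidity, price_chg, firm_size, agency_fee])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
for name, b in zip(["const", "liquidity", "price_chg", "firm_size", "agency_fee"], beta):
    print(f"{name:>10}: {b:+.3f}")
```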
Abstract:
This report provides a comparative analysis of the existing and emergent business models currently employed in the entertainment digital game industry (referred to in this report as the Leisure industry) and the “serious”, or in the context of the RAGE project and this report, the Applied Games industry. In conjunction with the accompanying WP 7.2 report, which provides a value chain analysis, this report will inform the development of a business model or models for the proposed RAGE ecosystem.
Abstract:
Aim Recent studies have suggested that global diatom distributions are not limited by dispersal, in the case of both extant species and fossil species, but rather that environmental filtering explains their spatial patterns. Hubbell's neutral theory of biodiversity provides a framework in which to test these alternatives. Our aim is to test whether the structure of marine phytoplankton (diatoms, dinoflagellates and coccolithophores) assemblages across the Atlantic agrees with neutral theory predictions. We asked: (1) whether intersite variance in phytoplankton diversity is explained predominantly by dispersal limitation or by environmental conditions; and (2) whether species abundance distributions are consistent with those expected under the neutral model. Location Meridional transect of the Atlantic (50 degrees N to 50 degrees S). Methods We estimated the relative contributions of environmental factors and geographic distance to phytoplankton composition using similarity matrices, Mantel tests and variation partitioning of the species composition based upon canonical ordination methods. We compared the species abundance distribution of phytoplankton with the neutral model using Etienne's maximum-likelihood inference method. Results Phytoplankton communities are slightly more determined by niche segregation (24%) than by dispersal limitation and ecological drift (17%). In 60% of communities, the assumption of neutrality in species' abundance distributions could not be rejected. In tropical zones, where oceanic gyres enclose large stable water masses, most communities showed low species immigration rates; in contrast, we infer that communities in temperate areas, outside oligotrophic gyres, have higher rates of species immigration. Conclusions Phytoplankton community structure is consistent with partial niche assembly and partial dispersal and drift assembly (neutral processes). The role of dispersal limitation is almost as important as that of habitat filtering, a fact that has been largely overlooked in previous studies. Furthermore, the poleward increase in species immigration rates that we have found is probably caused by water mixing conditions and productivity.
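A minimal sketch of the Mantel-test logic referred to in the Methods (our illustration with synthetic matrices, not the paper's data or exact procedure): the correlation between a community-dissimilarity matrix and a geographic-distance matrix is assessed by permuting site labels.

```python
import numpy as np

def mantel(dist_a, dist_b, n_perm=999, seed=0):
    """Permutation-based Mantel test: Pearson correlation between two
    distance matrices, with significance from permuting one matrix's sites."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(dist_a, k=1)
    r_obs = np.corrcoef(dist_a[iu], dist_b[iu])[0, 1]
    count = 0
    n = dist_a.shape[0]
    for _ in range(n_perm):
        p = rng.permutation(n)
        r = np.corrcoef(dist_a[p][:, p][iu], dist_b[iu])[0, 1]
        if r >= r_obs:
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)

# Hypothetical inputs: pairwise community dissimilarity vs. geographic distance.
rng = np.random.default_rng(2)
coords = rng.uniform(size=(20, 2))
geo = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
comm = geo + rng.normal(scale=0.1, size=geo.shape)
comm = (comm + comm.T) / 2
np.fill_diagonal(comm, 0)
r, p = mantel(comm, geo)
print(f"Mantel r = {r:.2f}, p = {p:.3f}")
```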