955 results for "Set theory".
Abstract:
A brief introduction to the theory of differential inclusions, viability theory and selections of set-valued mappings is presented. As an application, the implicit scheme of the Leontief dynamic input-output model is considered.
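For orientation, the dynamic Leontief model and an implicit (backward-difference) scheme for it can be sketched as follows; the notation is assumed, not taken from the abstract ($A$ is the input coefficient matrix, $B$ the capital coefficient matrix, $c$ final demand, $h$ the time step):

```latex
\[
x(t) = A\,x(t) + B\,\dot{x}(t) + c(t),
\]
% backward difference \dot{x}(t_k) \approx (x_k - x_{k-1})/h gives the implicit scheme
\[
x_k = A\,x_k + \tfrac{1}{h}\,B\,(x_k - x_{k-1}) + c_k
\quad\Longleftrightarrow\quad
\left(I - A - \tfrac{1}{h}\,B\right) x_k = c_k - \tfrac{1}{h}\,B\,x_{k-1}.
\]
```

Each step thus requires solving a linear system in $x_k$, which is where set-valued analysis enters when the model data are given as inclusions rather than equations.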
Abstract:
János Kornai’s DRSE theory (Kornai, 2014) follows the ex post model philosophy, which radically rejects the ex ante set of conditions laid down by the dominant neoclassical school and the stringent limits of equilibrium, and defines its own premises for the functioning of the capitalist economy. In other words, the DRSE theory represents a markedly novel trend among the various schools of economics. The theory is still only a verbal model, with the following supporting pillars as the immanent features of the capitalist system: dynamism, rivalry and the surplus economy. (The English name of the theory uses the initial letters of the terms Dynamism, Rivalry, Surplus Economy.) The dominance of the surplus economy, that is, of oversupply, is explained by monopolistic competition, uncertainty over the volume of demand, Schumpeterian innovation, dynamism, technological progress, creative destruction and increasing returns to scale, with rivalry between producers and service providers for markets. This paper aims to examine whether the DRSE theory can be formulated as a formal mathematical model. We have chosen a special route to do this: first we explore the unreal ex ante assumptions of general equilibrium theory (Walras, 1874; Neumann, 1945), and then we establish some of the possible connections between the premises of DRSE, which include the crucial condition that, just as in biological evolution, there is no fixed steady state in the evolutionary processes of the market economy, not even as a point of reference. General equilibrium theory and DRSE theory are compared through the lens of Schumpeterian evolutionary economics.
Abstract:
An abundance of research in the social sciences has demonstrated a persistent bias against nonnative English speakers (Giles & Billings, 2004; Gluszek & Dovidio, 2010). Yet, organizational scholars have only begun to investigate the underlying mechanisms that drive the bias against nonnative speakers and subsequently design interventions to mitigate these biases. In this dissertation, I offer an integrative model to organize past explanations for accent-based bias into a coherent framework, and posit that nonnative accents elicit social perceptions that have implications at the personal, relational, and group level. I also seek to complement the existing emphasis on main effects of accents, which focuses on the general tendency to discriminate against those with accents, by examining moderators that shed light on the conditions under which accent-based bias is most likely to occur. Specifically, I explore the idea that people’s beliefs about the controllability of accents can moderate their evaluations toward nonnative speakers, such that those who believe that accents can be controlled are more likely to demonstrate a bias against nonnative speakers. I empirically test my theoretical model in three studies in the context of entrepreneurial funding decisions. Results generally supported the proposed model. By examining the micro foundations of accent-based bias, the ideas explored in this dissertation set the stage for future research in an increasingly multilingual world.
Abstract:
Major developments in the technological environment can become commonplace very quickly. They are now impacting upon a broad range of information-based service sectors, as high growth Internet-based firms, such as Google, Amazon, Facebook and Airbnb, and financial technology (Fintech) start-ups expand their product portfolios into new markets. Real estate is one of the information-based service sectors that is currently being impacted by this new type of competitor and the broad range of disruptive digital technologies that have emerged. Due to the vast troves of data that these Internet firms have at their disposal and their asset-light (cloud-based) structures, they are able to offer highly-targeted products at much lower costs than conventional brick-and-mortar companies.
Abstract:
Value and reasons for action are often cited by rationalists and moral realists as providing a desire-independent foundation for normativity. Those maintaining instead that normativity is dependent upon motivation often deny that anything called "value" or "reasons" exists. According to the interest-relational theory, something has value relative to some perspective of desire just in case it satisfies those desires, and a consideration is a reason for some action just in case it indicates that something of value will be accomplished by that action. Value judgements therefore describe real properties of objects and actions, but have no normative significance independent of desires. It is argued that only the interest-relational theory can account for the practical significance of value and reasons for action. Against the Kantian hypothesis of prescriptive rational norms, I attack the alleged instrumental norm or hypothetical imperative, showing that the normative force for taking the means to our ends is explicable in terms of our desire for the end, and not as a command of reason. This analysis also provides a solution to the puzzle concerning the connection between value judgement and motivation. While it is possible to hold value judgements without motivation, the connection is more than accidental. This is because value judgements are usually, but not always, made from the perspective of desires that actually motivate the speaker. In the normal case judgement entails motivation. But often we conversationally borrow external perspectives of desire, and the resulting judgements do not entail motivation. This analysis drives a critique of a common practice as a misuse of normative language. The "absolutist" attempts to use, and as philosopher to analyze, normative language in such a way as to justify the imposition of certain interests over others.
But these uses and analyses are incoherent: in denying relativity to particular desires, they conflict with the actual meaning of these utterances, which is always indexed to some particular set of desires.
Abstract:
The thesis presents experimental results, simulations, and theory on turbulence excited in magnetized plasmas near the ionosphere’s upper hybrid layer. The results include: the first experimental observations of super small striations (SSS) excited by the High Frequency Active Auroral Research Program (HAARP); the first detection of high-frequency (HF) waves from the HAARP transmitter over a distance of 16x10^3 km; the first simulations indicating that upper hybrid (UH) turbulence excites electron Bernstein waves associated with all nearby gyroharmonics; and simulation results indicating that the resulting bulk electron heating near the UH resonance is caused primarily by electron Bernstein waves parametrically excited near the first gyroharmonic. On the experimental side, we present two sets of experiments performed at the HAARP heating facility in Alaska. In the first set, we present the first detection of super-small (cm-scale) striations (SSS) at the HAARP facility: we detected density structures smaller than 30 cm for the first time through a combination of satellite and ground-based measurements. In the second set, we present the results of a novel diagnostic implemented by the Ukrainian Antarctic Station (UAS) at Vernadsky. The technique allowed the detection of the HAARP signal at a distance of nearly 16 Mm, and established that the HAARP signal was injected into the ionospheric waveguide by direct scattering off of dekameter-scale density structures induced by the heater. On the theoretical side, we present results of Vlasov simulations near the upper hybrid layer. These results are consistent with the bulk heating required by previous work on the theory of the formation of descending artificial ionospheric layers (DAILs), and with the new observations of DAILs at HAARP’s upgraded effective radiated power (ERP).
The simulations include frequency sweeps, and demonstrate that the heating changes from bulk heating between gyroharmonics to tail acceleration as the pump frequency is swept through the fourth gyroharmonic. These simulations are in good agreement with experiments. We also incorporate test particle simulations that isolate the effects of specific wave modes on heating, and we find important contributions from both electron Bernstein waves and upper hybrid waves; the former have not yet been detected by experiments, and have not previously been explored as a driver of heating. In presenting these results, we analyzed data from HAARP diagnostics and assisted in planning the second round of experiments. We integrated the data into a picture of experiments that demonstrated the detection of SSS, hysteresis effects in stimulated electromagnetic emission (SEE) features, and the direct scattering of the HF pump into the ionospheric waveguide. We performed simulations and analyzed simulation data to build the understanding of collisionless heating near the upper hybrid layer, and we used these simulations to show that bulk electron heating at the upper hybrid layer is possible, as required by current theories of DAIL formation. We wrote a test particle simulation to isolate the effects of electron Bernstein waves and upper hybrid waves on collisionless heating, and integrated this code to work with both the output of the Vlasov simulations and the input for simulations of DAIL formation.
Abstract:
In this paper we use concepts from graph theory and cellular biology, represented as ontologies, to carry out semantic mining tasks on signaling pathway networks. Specifically, the paper describes the semantic enrichment of signaling pathway networks. A cell signaling network describes the basic cellular activities and their interactions. The main contribution of this paper, in the signaling pathway research area, is a new technique to analyze and understand how changes in these networks may affect the transmission and flow of information, changes which produce diseases such as cancer and diabetes. Our approach is based on three concepts from graph theory (modularity, clustering and centrality) frequently used in social network analysis. The approach consists of two phases: the first uses the graph theory concepts to determine the cellular groups in the network, which we call communities; the second uses ontologies for the semantic enrichment of the cellular communities. The measures taken from graph theory allow us to determine the set of cells that are close (for example, in a disease), and the main cells in each community. We evaluate our approach in two cases: TGF-β and Alzheimer's disease.
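Two of the three graph-theoretic measures mentioned (centrality and clustering) are easy to sketch in pure Python. The toy adjacency dict below is illustrative only, not a real signaling pathway network:

```python
# Toy undirected graph as an adjacency dict (illustrative only; not a
# real signaling pathway network).
graph = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C"},
}

def degree_centrality(g):
    """Fraction of the other nodes each node is connected to."""
    n = len(g)
    return {v: len(nbrs) / (n - 1) for v, nbrs in g.items()}

def clustering_coefficient(g, v):
    """Fraction of pairs of neighbours of v that are themselves linked."""
    nbrs = g[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    # Each neighbour-neighbour edge is seen from both endpoints, so halve it.
    links = sum(1 for u in nbrs for w in g[u] if w in nbrs) / 2
    return 2 * links / (k * (k - 1))

print(degree_centrality(graph))            # "C" has the highest centrality
print(clustering_coefficient(graph, "A"))  # 1.0: A's two neighbours are linked
```

Modularity-based community detection (the paper's first phase) builds on exactly these primitives, grouping nodes so that edges fall within rather than between communities.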
Abstract:
We study the relations of shift equivalence and strong shift equivalence for matrices over a ring $\mathcal{R}$, and establish a connection between these relations and algebraic K-theory. We utilize this connection to obtain results in two areas where the shift and strong shift equivalence relations play an important role: the study of finite group extensions of shifts of finite type, and the Generalized Spectral Conjectures of Boyle and Handelman for nonnegative matrices over subrings of the real numbers. We show that the refinement of the shift equivalence class of a matrix $A$ over a ring $\mathcal{R}$ by strong shift equivalence classes over the ring is classified by a quotient $NK_{1}(\mathcal{R}) / E(A,\mathcal{R})$ of the algebraic K-group $NK_{1}(\mathcal{R})$. We use the K-theory of non-commutative localizations to show that in certain cases the subgroup $E(A,\mathcal{R})$ must vanish, including the case where $A$ is invertible over $\mathcal{R}$. We use the K-theory connection to clarify the structure of algebraic invariants for finite group extensions of shifts of finite type. In particular, we give a strong negative answer to a question of Parry, who asked whether the dynamical zeta function determines up to finitely many topological conjugacy classes the extensions by $G$ of a fixed mixing shift of finite type. We apply the K-theory connection to prove the equivalence of a strong and weak form of the Generalized Spectral Conjecture of Boyle and Handelman for primitive matrices over subrings of $\mathbb{R}$. We construct explicit matrices whose class in the algebraic K-group $NK_{1}(\mathcal{R})$ is non-zero for certain rings $\mathcal{R}$ motivated by applications. We study the possible dynamics of the restriction of a homeomorphism of a compact manifold to an isolated zero-dimensional set. We prove that for $n \ge 3$ every compact zero-dimensional system can arise as an isolated invariant set for a homeomorphism of a compact $n$-manifold.
In dimension two, we provide obstructions and examples.
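For reference, the two relations in question are standard in symbolic dynamics (the definitions below are not quoted from the abstract). Matrices $A$ ($n \times n$) and $B$ ($m \times m$) over $\mathcal{R}$ are elementary strong shift equivalent if $A = RS$ and $B = SR$ for some matrices $R$, $S$ over $\mathcal{R}$, and strong shift equivalence is the transitive closure of this relation. Shift equivalence of lag $\ell$ instead requires an $n \times m$ matrix $R$ and an $m \times n$ matrix $S$ over $\mathcal{R}$ with

```latex
\[
A R = R B, \qquad S A = B S, \qquad A^{\ell} = R S, \qquad B^{\ell} = S R.
\]
```

Strong shift equivalence implies shift equivalence; the quotient $NK_{1}(\mathcal{R}) / E(A,\mathcal{R})$ above measures the failure of the converse over $\mathcal{R}$.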
Abstract:
The analysis of fluid behavior in multiphase flow is very relevant to guaranteeing system safety. The use of equipment to describe such behavior is subject to factors such as the high level of investment and of specialized labor required. The application of image processing techniques to flow analysis can be a good alternative; however, very little research has been developed on this subject. This study therefore aims at developing a new approach to image segmentation, based on the Level Set method, that connects active contours and prior knowledge. To do so, a shape model of the targeted object is trained and defined through a point distribution model, and this model is later inserted as one of the extension velocity functions for the evolution of the curve at the zero level of the level set method. The proposed approach creates a framework consisting of three energy terms and an extension velocity function, λLg(φ) + νAg(φ) + μP(φ) + θf. The first three terms of the equation are the ones introduced in (LI; XU; FOX, 2005), and the last part of the equation, θf, is based on the representation of object shape proposed in this work. Two variations of the method are used: one restricted (Restrict Level Set, RLS) and the other unrestricted (Free Level Set, FLS). The first is used in the segmentation of images containing targets with little variation in shape and pose. The second is used to correctly identify the shape of the bubbles in liquid-gas two-phase flows. The efficiency and robustness of the RLS and FLS approaches are demonstrated on images of liquid-gas two-phase flows and on the HTZ image dataset (FERRARI et al., 2009). The results confirm the good performance of the proposed algorithms (RLS and FLS) and indicate that the approach may be used as an efficient method to validate and/or calibrate the various existing meters for two-phase flow properties, as well as in other image segmentation problems.
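Matching the functional against Li, Xu & Fox (2005), the framework's energy presumably takes the form below, where $\phi$ is the level set function, $g$ an edge indicator function, $H$ the Heaviside function and $\delta$ its derivative; the shape term $\theta f$ is the contribution of this work, and the exact notation is an assumption on our part:

```latex
\[
E(\phi) = \lambda L_g(\phi) + \nu A_g(\phi) + \mu P(\phi) + \theta f,
\]
\[
P(\phi) = \int_\Omega \tfrac{1}{2}\bigl(|\nabla\phi| - 1\bigr)^2\,dx,\qquad
L_g(\phi) = \int_\Omega g\,\delta(\phi)\,|\nabla\phi|\,dx,\qquad
A_g(\phi) = \int_\Omega g\,H(-\phi)\,dx.
\]
```

Here $P$ penalizes deviation of $\phi$ from a signed distance function, $L_g$ is the weighted contour length, and $A_g$ the weighted area inside the contour.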
Abstract:
In this paper we deal with the problem of obtaining the set of k-additive measures dominating a fuzzy measure. This problem extends that of deriving the set of probabilities dominating a fuzzy measure, an important problem appearing in Decision Making and Game Theory. The solution proposed in the paper follows the line developed by Chateauneuf and Jaffray for dominating probabilities and continued by Miranda et al. for dominating k-additive belief functions. Here, we address the general case, transforming the problem into a similar one in which the involved set functions have a non-negative Möbius transform; this simplifies the problem and allows a result similar to the one developed for belief functions. Although the set obtained is very large, we show that the conditions cannot be sharpened. On the other hand, we also show that it is possible to define a more restrictive subset, providing a more natural extension of the result for probabilities, from which any k-additive dominating measure can be derived.
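The notions involved can be stated briefly (these are the standard definitions, not quoted from the paper). A fuzzy measure $\mu$ on a finite set $X$ has Möbius transform $m$, and the two determine each other:

```latex
\[
m(A) = \sum_{B \subseteq A} (-1)^{|A \setminus B|}\, \mu(B),
\qquad
\mu(A) = \sum_{B \subseteq A} m(B),
\qquad A \subseteq X.
\]
```

Then $\mu$ is $k$-additive if $m(A) = 0$ whenever $|A| > k$ (so $k = 1$ recovers probabilities), and $\mu'$ dominates $\mu$ if $\mu'(A) \ge \mu(A)$ for every $A \subseteq X$.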
Abstract:
In the past few years, there has been a concern among economists and policy makers that increased openness to international trade affects some regions in a country more than others. Recent research has found that local labor markets more exposed to import competition through their initial employment composition experience worse outcomes in several dimensions, such as employment, wages, and poverty. Although there is evidence that regions within a country exhibit variation in the intensity with which they trade with each other and with other countries, trade linkages have been ignored in empirical analyses of the regional effects of trade, which focus on differences in employment composition. In this dissertation, I investigate how local labor markets' trade linkages shape the response of wages to international trade shocks. In the second chapter, I lay out a standard multi-sector general equilibrium model of trade in which domestic regions trade with each other and with the rest of the world. Using this benchmark, I decompose a region's wage change resulting from a national import cost shock into a direct effect on prices, holding other endogenous variables constant, and a series of general equilibrium effects. I argue that the direct effect provides a natural measure of exposure to import competition within the model, since it summarizes the effect of the shock on a region's wage as a function of initial conditions given by its trade linkages. I call my proposed measure linkage exposure, while I refer to the measures used in previous studies as employment exposure. My theoretical analysis also shows that the assumptions previous studies make on trade linkages are not consistent with the standard trade model. In the third chapter, I calibrate the model to the Brazilian economy in 1991, at the beginning of a period of trade liberalization, to perform a series of experiments.
In each of them, I reduce the Brazilian import cost by 1 percent in a single sector and calculate how much of the cross-regional variation in counterfactual wage changes is explained by each exposure measure. Over this set of experiments, employment exposure explains, for the median sector, 2 percent of the variation in counterfactual wage changes, while linkage exposure explains 44 percent. In addition, I propose an estimation strategy that incorporates trade linkages in the analysis of the effects of trade on observed wages. In the model, changes in wages are completely determined by changes in market access, an endogenous variable that summarizes the real demand faced by a region. I show that a linkage measure of exposure is a valid instrument for changes in market access within Brazil. Using observed wage changes in Brazil between 1991 and 2000, my estimates imply that a region at the 25th percentile of the change in domestic market access induced by trade liberalization experiences a 0.6 log point larger wage decline (or smaller wage increase) than a region at the 75th percentile. The estimates from a regression of wage changes on exposure imply that a region at the 25th percentile of exposure experiences a 3 log point larger wage decline (or smaller wage increase) than a region at the 75th percentile. I conclude that estimates based on exposure overstate the negative impact of trade liberalization on wages in Brazil. In the fourth chapter, I extend the standard model to allow for two types of workers according to their education levels: skilled and unskilled. I show that there is substantial variation across Brazilian regions in the skill premium. I use the exogenous variation provided by tariff changes to estimate the impact of market access on the skill premium. I find that the decrease in domestic market access resulting from trade liberalization led to a higher skill premium.
I propose a mechanism to explain this result: the manufacturing sector is relatively more intensive in unskilled labor, and I show empirical evidence that supports this hypothesis.
Abstract:
Schurz and Tholen (2016) argue that common approaches to studying the neural basis of “theory of mind” (ToM) obscure a potentially important role for the inferior frontal gyrus (IFG) in managing conflict between perspectives, and urge new work to address this question: “to gain a full understanding of the IFG's role in ToM, we encourage future imaging studies to use a wider range of control conditions” (p. 332). We wholeheartedly agree, but note that this observation has been made before, and has already led to a programme of work that provides evidence from fMRI, EEG, and TMS on the role of the IFG in managing conflict between self and other perspectives in ToM. We highlight this work, and in particular we demonstrate how careful manipulation within ToM tasks has been used to provide an internal control condition, wherein conflict is manipulated within-subject. We further add to the discussion by framing the key questions that remain regarding the IFG in this context. Drawing on limitations of the existing research, we outline how researchers can best proceed with the challenge set by Schurz and Tholen (2016).
Abstract:
Fifty years have passed since Cyert and March’s 1963 A Behavioral Theory of the Firm. During this time, the behavioral theory of the firm (BTOF) has been adopted across different research domains to investigate how organizations set goals, how they determine aspirations, and how they react to performance-aspiration discrepancies. Cyert and March’s framework has also recently emerged as one of the dominant paradigms for understanding the ways in which family business organizations make decisions. In this chapter, I review the theoretical development and empirical results of BTOF and its application in the family business field of study in order to identify theoretical and empirical gaps and propose suggestions for future research. The conclusions suggest that BTOF is both a theoretically and empirically valid perspective in family business research, particularly when combined with other theoretical frameworks. The principal recommendation is to apply behavioral theory to enhance scholarly understanding of how family organizations define their aspiration levels and respond to organizational problems.
Abstract:
Data sources are often dispersed geographically in real-life applications. Finding a knowledge model may require joining all the data sources and running a machine learning algorithm on the joint set. We present an alternative based on a Multi-Agent System (MAS): an agent mines one data source in order to extract a local theory (knowledge model) and then merges it with the previous MAS theory using a knowledge fusion technique. In this way, we obtain a global theory that summarizes the distributed knowledge without spending resources and time on joining the data sources. New experiments have been executed, including statistical significance analysis. The results show that, as a result of knowledge fusion, the accuracy of the initial theories is significantly improved, as is the accuracy relative to the monolithic solution.
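A minimal sketch of the fusion idea, assuming theories are rule sets with support counts; this representation is invented for illustration, and the paper's actual knowledge models and fusion technique may differ:

```python
# Each "theory" maps a rule (feature_value, label) to its support count.
# Fusing theories = merging rule sets and summing the evidence, so no
# agent ever needs to ship its raw data.

def learn_local_theory(rows):
    """Trivially 'mine' one data source: count (feature_value, label) pairs."""
    theory = {}
    for features, label in rows:
        rule = (features, label)
        theory[rule] = theory.get(rule, 0) + 1
    return theory

def fuse(global_theory, local_theory):
    """Merge a local theory into the global one by adding support counts."""
    merged = dict(global_theory)
    for rule, support in local_theory.items():
        merged[rule] = merged.get(rule, 0) + support
    return merged

def predict(theory, features):
    """Classify by the best-supported rule matching the features."""
    candidates = {lab: s for (f, lab), s in theory.items() if f == features}
    return max(candidates, key=candidates.get) if candidates else None

# Two dispersed data sources, mined independently and then fused.
site1 = [("high", "spam"), ("high", "spam"), ("low", "ok")]
site2 = [("high", "ok"), ("low", "ok")]
theory = fuse(learn_local_theory(site1), learn_local_theory(site2))
print(predict(theory, "high"))  # "spam": fused support 2 vs 1
```

The global theory here is strictly an aggregate of local ones, which is what lets the MAS avoid the cost of centralizing the data.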
Abstract:
This paper outlines a formal and systematic approach to the explication of the role of structure in information organization. It presents a preliminary set of constructs that are useful for understanding the similarities and differences that obtain across information organization systems. This work seeks to provide the necessary groundwork for the development of a theory of structure that can serve as a lens through which to observe patterns across systems of information organization.