846 results for information theory and computation
Abstract:
Using a transaction costs framework, we examine the impact of information and communication technology (mobile phone and radio) use on market participation in developing-country agricultural markets, drawing on a novel transaction-level data set of Ghanaian farmers. Our analysis of farmers' choice of markets suggests that market information from a broader range of markets may not always induce farmers to sell in more distant markets; instead, farmers may use broader market information to enhance their bargaining power in closer markets. Finally, we find only weak evidence that mobile phone use attracts farm-gate buyers.
Abstract:
During the last 30 years, significant debate has taken place regarding multilevel research. However, the extent to which multilevel research is actually practiced remains to be examined. This article analyzes 10 years of organizational research conducted within a multilevel framework (from 2001 to 2011). The goals of this article are (a) to understand what has been done, during this decade, in the field of organizational multilevel research and (b) to suggest new arenas of research for the next decade. A total of 132 articles were selected for analysis through ISI Web of Knowledge. A broad-based literature review suggests that there is a balance between the number of empirical and conceptual papers on multilevel research, with most studies addressing cross-level dynamics between teams and individuals. In addition, this study found that time still has little presence in organizational multilevel research. Implications, limitations, and future directions are addressed at the end.

Organizations are made of interacting layers. That is, between layers (such as divisions, departments, teams, and individuals) there is often some degree of interdependence that leads to bottom-up and top-down influence mechanisms. Teams and organizations are contexts for the development of individual cognitions, attitudes, and behaviors (top-down effects; Kozlowski & Klein, 2000). Conversely, individual cognitions, attitudes, and behaviors can also influence the functioning and outcomes of teams and organizations (bottom-up effects; Arrow, McGrath, & Berdahl, 2000). For example, an organization's reward system may influence employees' intention to quit and the presence or absence of extra-role behaviors. At the same time, many studies have shown the importance of bottom-up emergent processes that yield higher-level phenomena (Bashshur, Hernández, & González-Romá, 2011; Katz-Navon & Erez, 2005; Marques-Quinteiro, Curral, Passos, & Lewis, in press). For example, the affectivity of individual employees may influence their team's interactions and outcomes (Costa, Passos, & Bakker, 2012). Several authors agree that organizations must be understood as multilevel systems, meaning that adopting a multilevel perspective is fundamental to understanding real-world phenomena (Kozlowski & Klein, 2000). However, whether this agreement is reflected in the practice of multilevel research is less clear. In fact, how much is known about the quantity and quality of multilevel research done in the last decade? The aim of this study is to compare what has been proposed theoretically, concerning the importance of multilevel research, with what has actually been empirically studied and published. First, this article outlines a review of multilevel theory, followed by what has been theoretically “put forward” by researchers. Second, it presents what has actually been “practiced”, based on a review of multilevel studies published from 2001 to 2011 in business and management journals. Finally, some barriers and challenges to true multilevel research are suggested. This study contributes to multilevel research by describing the last 10 years of research: it quantitatively depicts the types of articles being written and where the majority of empirical and conceptual publications related to multilevel thinking can be found.
Abstract:
This paper explores the social theories implicit in system dynamics (SD) practice. Groupings of SD practice are observed in different parts of a framework for studying social theories. Most are seen to be located within 'functionalist sociology'. To account for the remainder, two new forms of practice are discussed, each related to a different paradigm. Three competing conclusions are then offered: 1. The implicit assumption that SD is grounded in functionalist sociology is correct and should be made explicit. 2. Forrester's ideas operate at the level of method, not social theory, so SD, though not wedded to a particular social theoretic paradigm, can be re-crafted for use within different paradigms. 3. SD is consistent with social theories which dissolve the individual/society divide by taking a dialectical, or feedback, stance. It can therefore bring a formal modelling approach to the 'agency/structure' debate within social theory and so bring SD into the heart of social science. The last conclusion is strongly recommended.
Abstract:
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
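To make the idea of representing unresolved scales by "suitable combinations of deterministic and stochastic components" concrete, here is a minimal sketch, not taken from the review: a single resolved slow mode whose unresolved fast dynamics are replaced by linear damping plus white-noise forcing (an Ornstein-Uhlenbeck process), integrated with Euler-Maruyama. All parameter values and names are arbitrary choices for illustration.

```python
# Illustrative sketch (assumed, not from the paper): one resolved slow mode
# with unresolved fast scales modelled as damping + white noise,
# dx = -lam * x dt + sigma dW, integrated with Euler-Maruyama.
import numpy as np

def simulate_stochastic_mode(lam=0.5, sigma=0.3, dt=0.01, n_steps=10_000, seed=0):
    """Integrate a stochastically parameterized slow mode and return its trajectory."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = 0.0
    for t in range(1, n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))              # Wiener increment
        x[t] = x[t - 1] - lam * x[t - 1] * dt + sigma * dW
    return x

x = simulate_stochastic_mode()
# The stationary variance of this Ornstein-Uhlenbeck process is sigma^2 / (2 * lam),
# so the sample variance of the late part of the trajectory should sit near it.
print(f"sample variance: {x[len(x)//2:].var():.4f}, theory: {0.3**2 / (2 * 0.5):.4f}")
```

In a reduced-order climate model the same pattern appears per retained mode, possibly with memory (non-Markovian) terms added; the sketch above omits those for brevity.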
Abstract:
We analyze ion populations observed by the NOAA-12 satellite within dayside auroral transients. The data are matched with an open magnetopause model which allows for the transmission of magnetosheath ions across one or both of the two Alfvén waves which emanate from the magnetopause reconnection site. It also allows for reflection and acceleration of ions of magnetospheric origin by these waves. From the good agreement found between the model and the observations, we propose that the events and the low-latitude boundary precipitation are both on open field lines.
Abstract:
We evaluate a number of real estate sentiment indices to ascertain current and forward-looking information content that may be useful for forecasting demand and supply activities. Analyzing the dynamic relationships within a Vector Auto-Regression (VAR) framework and using quarterly US data over 1988-2010, we test the efficacy of several sentiment measures by comparing them with other coincident economic indicators. Overall, our analysis suggests that real estate sentiment conveys valuable information that can help predict changes in real estate returns. These findings have important implications for investment decisions, from consumers' as well as institutional investors' perspectives.
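As a rough illustration of the VAR workflow described above, and not the authors' own code or data, the sketch below fits a two-variable VAR to simulated quarterly series, runs a Granger-causality test of whether sentiment helps predict returns, and produces a short forecast. The series names ("sentiment", "returns"), the fixed lag order, and the simulated data are all hypothetical.

```python
# Hedged sketch: a small VAR on simulated data, standing in for the
# sentiment/returns analysis described in the abstract.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(42)
n = 92                                            # roughly quarterly obs over 1988-2010
sentiment = np.cumsum(rng.normal(0, 1, n)) * 0.1  # persistent toy sentiment index
returns = 0.3 * np.roll(sentiment, 1) + rng.normal(0, 0.5, n)  # lagged dependence
df = pd.DataFrame({"sentiment": sentiment, "returns": returns}).iloc[1:]

model = VAR(df)
results = model.fit(2)                            # fixed lag of two quarters, for illustration
print(results.summary())

# Does sentiment Granger-cause returns in this toy system?
print(results.test_causality("returns", ["sentiment"], kind="f").summary())

# Forecast both series four quarters ahead from the last observed lags.
print(results.forecast(df.values[-results.k_ar:], steps=4))
```

In practice the lag order would be chosen with an information criterion and the sentiment measures compared against coincident indicators, as the abstract describes.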
Abstract:
We study the scaling properties and Kraichnan–Leith–Batchelor (KLB) theory of forced inverse cascades in generalized two-dimensional (2D) fluids (α-turbulence models) simulated at resolution 8192 × 8192. We consider α=1 (surface quasigeostrophic flow), α=2 (2D Euler flow) and α=3. The forcing scale is well resolved, a direct cascade is present and there is no large-scale dissipation. Coherent vortices spanning a range of sizes, most larger than the forcing scale, are present for both α=1 and α=2. The active scalar field for α=3 contains comparatively few and small vortices. The energy spectral slopes in the inverse cascade are steeper than the KLB prediction −(7−α)/3 in all three systems. Since we stop the simulations well before the cascades have reached the domain scale, vortex formation and spectral steepening are not due to condensation effects; nor are they caused by large-scale dissipation, which is absent. One- and two-point p.d.f.s, hyperflatness factors and structure functions indicate that the inverse cascades are intermittent and non-Gaussian over much of the inertial range for α=1 and α=2, while the α=3 inverse cascade is much closer to Gaussian and non-intermittent. For α=3 the steep spectrum is close to that associated with enstrophy equipartition. Continuous wavelet analysis shows approximate KLB scaling ℰ(k) ∝ k^(−2) (α=1) and ℰ(k) ∝ k^(−5/3) (α=2) in the interstitial regions between the coherent vortices. Our results demonstrate that coherent vortex formation (α=1 and α=2) and non-realizability (α=3) cause 2D inverse cascades to deviate from the KLB predictions, but that the flow between the vortices exhibits KLB scaling and non-intermittent statistics for α=1 and α=2.
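As a quick check of the numbers quoted in the abstract, the snippet below simply evaluates the KLB inverse-cascade prediction ℰ(k) ∝ k^(−(7−α)/3) for the three values of α studied; it is my own arithmetic illustration, not part of the paper.

```python
# Evaluate the KLB spectral-slope prediction -(7 - alpha)/3 for alpha = 1, 2, 3.
from fractions import Fraction

for alpha in (1, 2, 3):
    slope = -Fraction(7 - alpha, 3)
    print(f"alpha = {alpha}: predicted spectral slope = {slope} ({float(slope):.3f})")
# alpha = 1 gives -2, alpha = 2 gives -5/3, alpha = 3 gives -4/3, matching the
# wavelet-based interstitial scalings k^(-2) and k^(-5/3) reported for alpha = 1, 2.
```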
Abstract:
In this paper we employ a hypothetical discrete choice experiment (DCE) to examine how much consumers are willing to pay to use technology to customize their food shopping. We conjecture that customized information provision can aid in the composition of a healthier shop. Our results reveal that consumers are prepared to pay relatively more for individual-specific information as opposed to the generic nutritional information that is typically provided on food labels. In arriving at these results we have examined various model specifications, including those that make use of ex-post de-briefing questions on attribute non-attendance and attribute ranking information and those that consider the time taken to complete the survey. Our main results are robust to the various model specifications we examine.
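For readers unfamiliar with how willingness-to-pay figures are typically recovered in a DCE, the sketch below shows the standard ratio calculation: the marginal WTP for an attribute is its utility coefficient divided by the negative of the price coefficient. The coefficient values are invented for illustration and are not estimates from this paper.

```python
# Hypothetical example: marginal WTP from choice-model coefficients,
# WTP = -beta_attribute / beta_price. All values are made up.
beta_price = -0.8            # disutility per unit of price
beta_generic_info = 0.4      # utility of generic nutritional information
beta_individual_info = 0.9   # utility of individual-specific information

wtp_generic = -beta_generic_info / beta_price
wtp_individual = -beta_individual_info / beta_price
print(f"WTP, generic information:             {wtp_generic:.2f}")
print(f"WTP, individual-specific information: {wtp_individual:.2f}")
```

With coefficients like these, individual-specific information commands the higher WTP, which is the qualitative pattern the abstract reports.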
Abstract:
This edited volume argues that even recent Critical Disability Studies, which have sought to critique essentialist assumptions in relation to Disability, retain essentialisms that predetermine and predirect definitions and arguments in the field. The volume analyses such essentialisms in a wide range of areas such as childhood, gender, sexuality, reproduction, ADHD, autism, the animal, d/Deafness, hirsutism, the body, and vision. In particular, issues such as 'agency', 'voice', and 'body' are explored in terms of their political implications.
Abstract:
People are often exposed to more information than they can actually remember. Despite this frequent form of information overload, little is known about how much information people choose to remember. Using a novel “stop” paradigm, the current research examined whether and how people choose to stop receiving new—possibly overwhelming—information with the intent to maximize memory performance. Participants were presented with a long list of items and were rewarded for the number of correctly remembered words in a subsequent free recall test. Critically, participants in a stop condition were given the option to stop the presentation of the remaining words at any time during the list, whereas participants in a control condition were presented with all items. Across five experiments, we found that participants tended to stop the presentation of the items in an attempt to maximize the number of recalled items, but this decision ironically led to decreased memory performance relative to the control group. This pattern was consistent even after controlling for possible confounding factors (e.g., task demands). The results indicate a general but false belief that one can remember more items by restricting the quantity of learning material. These findings suggest that people have an incomplete understanding of how they remember excessive amounts of information.