136 results for Organizational Complexity
Abstract:
Electing a leader is a fundamental task in distributed computing. In its implicit version, only the elected node needs to know that it is the leader. This paper studies the message and time complexity of randomized implicit leader election in synchronous distributed networks. Surprisingly, the most "obvious" complexity bounds have not been proven for randomized algorithms. The "obvious" lower bounds of Ω(m) messages (where m is the number of edges in the network) and Ω(D) time (where D is the network diameter) are non-trivial to show for randomized (Monte Carlo) algorithms. (Recent results showing that even Ω(n), where n is the number of nodes in the network, is not a lower bound on the message complexity in complete networks make the above bounds somewhat less obvious.) To the best of our knowledge, these basic lower bounds have not been established even for deterministic algorithms (except for the limited case of comparison algorithms, where it was also required that some nodes may not wake up spontaneously and that D and n were not known).
We establish these fundamental lower bounds in this paper for the general case, even for randomized Monte Carlo algorithms. Our lower bounds are universal in the sense that they hold for all universal algorithms (i.e., algorithms that must work on every graph), apply for every D, m, and n, and hold even if D, m, and n are known, all the nodes wake up simultaneously, and the algorithms can make any use of the nodes' identities. To show that these bounds are tight, we present an O(m)-message algorithm; an O(D)-time algorithm is already known. A slight adaptation of our lower-bound technique also yields an Ω(m) message lower bound for randomized broadcast algorithms.
An interesting fundamental question is whether both upper bounds (messages and time) can be achieved simultaneously in the randomized setting for all graphs. (The answer is known to be negative in the deterministic setting.) We answer this question partially by presenting a randomized algorithm that matches both complexities in some cases, which already separates (for those cases) randomized algorithms from deterministic ones. As first steps towards the general case, we present several universal leader election algorithms whose bounds trade off messages against time. We view our results as a step towards understanding the complexity of universal leader election in distributed networks.
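As a point of reference for the O(D)-time bound discussed above, the sketch below simulates the classical max-ID flooding approach to implicit leader election; it is a generic textbook-style illustration, not the paper's algorithm, and the graph, the use of random values as candidates, and the round-by-round simulation are assumptions made for illustration.

```python
# Illustrative sketch (not the paper's algorithm): a synchronous-round simulation of
# "flood the maximum random ID" leader election. Every node draws a random value and,
# in each of D rounds, forwards the largest value seen so far to its neighbours; after
# D rounds, only the node holding the global maximum elects itself (implicit election).
# This uses O(D * m) messages; the paper's point is that Omega(m) messages and
# Omega(D) time are unavoidable even for randomized algorithms.
import random

def elect_leader(adjacency, diameter):
    """adjacency: dict node -> list of neighbour nodes (undirected graph)."""
    best = {v: (random.random(), v) for v in adjacency}   # each node's own candidate
    for _ in range(diameter):                              # D synchronous rounds
        inbox = {v: [] for v in adjacency}
        for v, nbrs in adjacency.items():                  # every node sends its best
            for u in nbrs:
                inbox[u].append(best[v])
        for v in adjacency:                                # keep the maximum seen so far
            best[v] = max([best[v]] + inbox[v])
    # implicit version: only the winner needs to know it is the leader
    return [v for v in adjacency if best[v][1] == v]

# Example: a 4-cycle with diameter 2
graph = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(elect_leader(graph, diameter=2))   # prints the single elected node
```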
Abstract:
High-dimensional gene expression data provide a rich source of information because they capture the expression levels of genes in dynamic states that reflect the biological functioning of a cell. For this reason, such data are well suited to revealing systems-level properties of a cell, e.g., to elucidate molecular mechanisms of complex diseases like breast or prostate cancer. However, doing so depends strongly not only on the sample size and the correlation structure of a data set, but also on the statistical hypotheses being tested. Many different approaches have been developed over the years to analyze gene expression data in order to (I) identify changes in single genes, (II) identify changes in gene sets or pathways, and (III) identify changes in the correlation structure within pathways. In this paper, we review statistical methods for all three types of approaches, including their subtypes, in the context of cancer data, provide links to software implementations and tools, and also address the general problem of multiple hypothesis testing. Further, we provide recommendations for selecting among these analysis methods.
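As a minimal illustration of approach (I) combined with multiple testing correction, the sketch below runs one two-sample t-test per gene and applies the Benjamini-Hochberg procedure; the toy data, group labels, and the 5% FDR threshold are assumptions for illustration and are not taken from any study reviewed here.

```python
# Minimal sketch of single-gene differential expression with Benjamini-Hochberg
# control of the false discovery rate (toy data, illustrative thresholds).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
expr = rng.normal(size=(1000, 20))          # 1000 genes x 20 samples (synthetic)
expr[:50, 10:] += 1.0                       # first 50 genes shifted in the second group
groups = np.array([0] * 10 + [1] * 10)      # e.g. tumour vs. normal labels

# One two-sample t-test per gene
pvals = np.array([ttest_ind(g[groups == 0], g[groups == 1]).pvalue for g in expr])

# Benjamini-Hochberg: reject the k smallest p-values, where k is the largest rank
# with p_(k) <= (k / m) * alpha
alpha, m = 0.05, len(pvals)
order = np.argsort(pvals)
thresholds = (np.arange(1, m + 1) / m) * alpha
passed = pvals[order] <= thresholds
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant_genes = order[:k]
print(f"{k} genes called significant at FDR {alpha}")
```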
Abstract:
The Marine Strategy Framework Directive (MSFD) requires that European Union Member States achieve "Good Environmental Status" (GES) in respect of 11 Descriptors of the marine environment by 2020. Of these, Descriptor 4, which focuses on marine food webs, is perhaps the most challenging to implement, since it is difficult to identify simple indicators able to assess the health of highly dynamic and complex interactions. Here, we present the proposed food web criteria/indicators and analyse their theoretical background and applicability in order to highlight both the current knowledge gaps and the difficulties associated with the assessment of GES. We conclude that the existing suite of indicators covers the three important food web properties, structure, functioning and dynamics, unevenly, and that more emphasis should be given to the latter two and to the general principles that relate these three properties. The development of food web indicators should be directed towards more integrative and process-based indicators, with an emphasis on their responsiveness to multiple anthropogenic pressures.
Abstract:
Many modern networks are reconfigurable, in the sense that the topology of the network can be changed by the nodes in the network. For example, peer-to-peer, wireless and ad-hoc networks are reconfigurable. More generally, many social networks, such as a company's organizational chart; infrastructure networks, such as an airline's transportation network; and biological networks, such as the human brain, are also reconfigurable. Modern reconfigurable networks have a complexity unprecedented in the history of engineering, resembling a dynamic, evolving living organism more than a steel structure designed from a blueprint. Unfortunately, our mathematical and algorithmic tools have not yet developed far enough to handle this complexity and fully exploit the flexibility of these networks. We believe that it is no longer possible to build networks that are scalable and never suffer node failures. Instead, these networks should be able to tolerate small, and perhaps periodic, failures and still recover, much as skin heals from a cut. This process, in which the network recovers itself by maintaining key invariants in response to attack by a powerful adversary, is what we call self-healing. Here, we present several fast and provably good distributed algorithms for self-healing in reconfigurable dynamic networks. Each of these algorithms has different properties and a different set of guarantees and limitations. We also discuss future directions and theoretical questions we would like to answer.
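The toy sketch below illustrates the self-healing idea in its simplest form; it is not one of the algorithms analysed in the paper, and the networkx graph and the ring-repair rule are assumptions made purely for illustration.

```python
# Toy illustration of self-healing: when the adversary deletes a node, its surviving
# neighbours locally patch the hole by linking themselves into a ring, so the network
# stays connected and each repair adds only a small number of edges per affected node.
import networkx as nx

def heal(graph, failed_node):
    nbrs = list(graph.neighbors(failed_node))
    graph.remove_node(failed_node)
    for a, b in zip(nbrs, nbrs[1:] + nbrs[:1]):   # ring over the orphaned neighbours
        if a != b:
            graph.add_edge(a, b)

g = nx.path_graph(6)            # 0-1-2-3-4-5
heal(g, 3)                      # adversary removes node 3
print(nx.is_connected(g))       # True: the local repair preserved connectivity
```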
Abstract:
While organizational ethnographers have embraced the concept of self-reflexivity, problems remain. In this article we argue that the prevalent assumption that self-reflexivity is the sole responsibility of the individual researcher limits its scope for understanding organizations. To address this, we propose an innovative method of collective reflection inspired by ideas from cultural and feminist anthropology. The value of this method is illustrated through an analysis of two ethnographic case studies involving a ‘pair interview’ method. This collective approach surfaced self-reflexive accounts in which aspects of the research encounter that still tend to be downplayed within organizational ethnographies, including emotion, intersubjectivity and the operation of power dynamics, were allowed to emerge. The approach also facilitated a second contribution through the conceptualization of organizational ethnography as a unique endeavour that represents a collision between one ‘world of work’, the university, and a second, the researched organization. We find that this ‘collision’ exacerbates the emotionality of ethnographic research, highlighting the refusal of ‘researched’ organizations to be domesticated by the specific norms of academia. Our article concludes by drawing out implications for the practice of self-reflexivity within organizational ethnography.
Abstract:
Concern for NGO accountability has intensified in recent years, following the growth in the size of NGOs and their power to influence global politics and curb the excesses of globalization. Questions have been raised about whether the sector embraces the same standards of accountability that it demands from government and business. The objective of this paper is to examine one aspect of NGO accountability: its discharge through annual reporting. Using Habermas’ (1984; 1987) theory of communicative action, and specifically its validity claims, the research investigates whether NGOs use their annual reporting process to account to the host societies in which they operate or to steer stakeholder actions toward their own self-interests. The results of the study indicate that organizations' efforts to account are characterized by communicative action through the provision of truthful disclosures, generally appropriate to the discharge of accountability and presented in a manner intended to improve their understandability. At the same time, however, some organizations exhibit strategically oriented behaviors in which the disclosure content is guided by the opportunity to present the organization in a particular light, and there appears to be a lack of authenticity on the part of the rhetor. The latter findings cast doubt on the ethical inspiration of NGOs and the values they demand from business communities, and raise questions as to why such practices exist and what lessons can be learnt from them.
Abstract:
This paper examines the applicability of an immersive virtual reality (VR) system to the process of organizational learning in a manufacturing context. The work focuses on the extent to which realism has to be represented in a simulated product-build scenario in order to give the user an effective learning experience for an assembly task. Current technologies allow the visualization and manipulation of objects in VR systems, but physical behaviors, such as contact between objects and the effects of gravity, are not commonly represented in off-the-shelf simulation solutions, and the computational power required to support these functions remains a challenge. This work demonstrates how physical behaviors can be coded and represented through the development of more effective mechanisms for the computer-aided design (CAD) and VR interface.
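For illustration only, the sketch below codes the kind of physical behaviour mentioned above (gravity and contact with a work surface) that an off-the-shelf visualization pipeline may leave out; the time step, restitution value, and geometry are assumptions, and a production CAD/VR integration would rely on a full collision and constraint solver rather than this toy integrator.

```python
# Minimal sketch of gravity plus ground contact for a dropped part in a simulated
# assembly scene (illustrative constants, not from the paper).
GRAVITY = -9.81      # m/s^2
DT = 1.0 / 90.0      # one VR frame at 90 Hz
RESTITUTION = 0.3    # fraction of velocity retained on impact

def step(height, velocity):
    """Advance the part's vertical state by one frame, with the work surface at height 0."""
    velocity += GRAVITY * DT
    height += velocity * DT
    if height < 0.0:                      # contact with the work surface
        height = 0.0
        velocity = -velocity * RESTITUTION
    return height, velocity

h, v = 1.2, 0.0                           # part released 1.2 m above the bench
for _ in range(300):                      # ~3.3 seconds of simulation
    h, v = step(h, v)
print(round(h, 3))                        # ~0.0: the part has settled on the surface
```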
Abstract:
Polar codes are one of the most recent advancements in coding theory and have attracted significant interest. While they are provably capacity-achieving over various channels, they have so far seen limited practical application. Unfortunately, the sequential nature of successive-cancellation-based decoders hinders fine-grained adaptation of the decoding complexity to design constraints and operating conditions. In this paper, we propose a systematic method for enabling complexity-performance trade-offs by constructing polar codes via an optimization problem that minimizes the complexity under a suitably defined mutual-information-based performance constraint. Moreover, we propose a low-complexity greedy algorithm to solve the optimization problem efficiently for very large code lengths.
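As a hedged illustration of this kind of trade-off (not the paper's optimization problem or its greedy algorithm), the sketch below constructs a polar code for a binary erasure channel and greedily adds the most reliable bit-channels until their accumulated mutual information reaches a target, so the information set stays as small as the performance constraint allows; the channel model, recursion, and target value are assumptions.

```python
# For a binary erasure channel, bit-channel quality can be tracked exactly through the
# polar transform: erasure probability e splits into 2e - e^2 and e^2 at each level.
def bec_bit_channels(n_levels, erasure):
    probs = [erasure]
    for _ in range(n_levels):                       # N = 2**n_levels bit-channels
        probs = [p for e in probs for p in (2 * e - e * e, e * e)]
    return probs

def greedy_information_set(n_levels, erasure, target_mi):
    """Greedily pick the most reliable bit-channels until a mutual-information target is met."""
    probs = bec_bit_channels(n_levels, erasure)
    order = sorted(range(len(probs)), key=lambda i: probs[i])   # most reliable first
    info_set, mi = [], 0.0
    for i in order:
        if mi >= target_mi:
            break
        info_set.append(i)
        mi += 1.0 - probs[i]                        # BEC: I(W_i) = 1 - erasure prob
    return sorted(info_set), mi

info, mi = greedy_information_set(n_levels=8, erasure=0.5, target_mi=100.0)
print(len(info), round(mi, 2))   # how many of the 256 bit-channels were needed
```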
Abstract:
In this paper, a low-complexity system for spectral analysis of heart rate variability (HRV) is presented. The main idea of the proposed approach is to implement the Fast-Lomb periodogram, a ubiquitous tool in spectral analysis, using a wavelet-based fast Fourier transform. Interestingly, we show that the proposed approach enables the processed data to be classified as more or less significant based on their contribution to output quality. Based on this classification, a percentage of the less significant data is pruned, leading to a substantial reduction in algorithmic complexity with minimal quality degradation. Indeed, our results indicate that the proposed system can achieve up to a 45% reduction in the number of computations with only 4.9% average error in output quality compared to a conventional FFT-based HRV system.
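For context, the sketch below computes a standard Lomb periodogram of an unevenly sampled RR-interval series with scipy, the kind of baseline such a system is measured against; the synthetic beat times and the LF/HF band edges are illustrative assumptions, and the paper's wavelet-based, pruned implementation is not reproduced here.

```python
# Lomb periodogram of an irregularly sampled RR-interval (tachogram) series, which
# avoids resampling before spectral analysis (toy signal, illustrative band edges).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
beat_times = np.cumsum(0.8 + 0.05 * rng.standard_normal(300))   # ~300 irregular beats (s)
rr = np.diff(beat_times)                                          # RR intervals (s)
t = beat_times[1:]                                                # time of each interval

freqs = np.linspace(0.01, 0.5, 500)                               # Hz, HRV band of interest
power = lombscargle(t, rr - rr.mean(), 2 * np.pi * freqs)         # angular frequencies

lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()                # low-frequency band
hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()                # high-frequency band
print(round(lf / hf, 2))                                          # LF/HF ratio
```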