862 results for Complexity of Relations
Abstract:
Innate immune responses play a central role in neuroprotection and neurotoxicity during inflammatory processes that are triggered by pathogen-associated molecular pattern-exhibiting agents such as bacterial lipopolysaccharide (LPS) and that are modulated by inflammatory cytokines such as interferon γ (IFNγ). Recent findings describing the unexpected complexity of mammalian genomes and transcriptomes have stimulated further identification of novel transcripts involved in specific physiological and pathological processes, such as the neural innate immune response that alters the expression of many genes. We developed a system for efficient subtractive cloning that employs both sense and antisense cRNA drivers, and coupled it with in-house cDNA microarray analysis. This system enabled effective direct cloning of differentially expressed transcripts from a small amount (0.5 µg) of total RNA. We applied this system to the isolation of genes activated by LPS and IFNγ in primary-cultured cortical cells derived from newborn mice, to investigate the mechanisms involved in neuroprotection and neurotoxicity in maternal/perinatal infections that cause various brain injuries, including periventricular leukomalacia. A number of genes involved in the immune and inflammatory response were identified, showing that neonatal neuronal/glial cells are highly responsive to LPS and IFNγ. Subsequent RNA blot analysis revealed that the identified genes were activated by LPS and IFNγ in a cooperative or distinct manner, thereby supporting the notion that these bacterial and cellular inflammatory mediators can affect the brain through direct but complicated pathways. We also identified several novel clones of apparently non-coding RNAs that potentially harbor various regulatory functions. Characterization of the presently identified genes will give insights into mechanisms and interventions not only for perinatal infection-induced brain damage, but also for many other innate immunity-related brain disorders.
Abstract:
Until recently, much of the discussion regarding the type of organization theory needed in management studies focused on the normative vs. descriptive roles of management science. Some authors, however, have noticed that even a descriptive theory can have a normative impact. Among other uses, management theories are employed by practitioners to make sense of their identity and roles in given contexts, and so guide their attitudes, decision processes, and behavior. The sensemaking potential of a theory might, in this view, represent an important element for predicting its adoption by practitioners. Accordingly, theories are needed that better grasp the increased complexity of today's business environment in order to be more relevant for practitioners. This article proposes a multi-faceted perspective on organizations. This implies leaving behind a simplistic view of organizations and building a 'cubist' conception. Picasso's cubist paintings are characterized by the use of multiple perspectives within a single drawing. Similarly, I argue here that managers must learn not only to take on multiple responsibilities in their work, but to develop an integrated conception of their managerial identity and of their organizations in which the multiple social and economic dimensions are enmeshed. Social entrepreneurship is discussed as an illustration of a typical multi-faceted business.
Abstract:
Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, the limited temporal and financial resources, as well as the high intraclass variance can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership, and the user is then asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active learning algorithms: committee, large margin, and posterior probability-based. For each of them, the most recent advances in the remote sensing community are discussed and some heuristics are detailed and tested. Several challenging remote sensing scenarios are considered, including very high spatial resolution and hyperspectral image classification. Finally, guidelines for choosing a suitable architecture are provided for new and/or inexperienced users.
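As a rough illustration of the sampling loop described above (not code from the paper), the sketch below implements one posterior probability-based heuristic, "breaking ties", on an already-extracted pool of pixel features; the random-forest classifier, batch size and all names are assumptions.

```python
# Illustrative sketch of pool-based active learning with a "breaking ties"
# posterior-probability heuristic; classifier choice and names are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def breaking_ties(proba):
    """Margin between the two most probable classes; a small margin means an uncertain pixel."""
    ordered = np.sort(proba, axis=1)
    return ordered[:, -1] - ordered[:, -2]

def active_learning_loop(X_lab, y_lab, X_pool, oracle, n_iter=10, batch=20):
    """oracle(idx) stands in for the human analyst labelling the queried pixels."""
    clf = RandomForestClassifier(n_estimators=200)
    for _ in range(n_iter):
        clf.fit(X_lab, y_lab)
        margins = breaking_ties(clf.predict_proba(X_pool))
        query = np.argsort(margins)[:batch]           # most ambiguous pixels
        X_lab = np.vstack([X_lab, X_pool[query]])
        y_lab = np.concatenate([y_lab, oracle(query)])
        X_pool = np.delete(X_pool, query, axis=0)     # remove queried pixels from the pool
    return clf.fit(X_lab, y_lab)
```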
Abstract:
The passive transfer of monoclonal antibodies, direct vaccination and in vitro assays have all shown that antigens associated with the tegumental membranes of Schistosoma mansoni are capable of mediating protective immune responses against the parasite in animal models. Furthermore, the principal antigens are highly antigenic during natural infection in man and stimulate strong humoral and cellular responses, although, at present, their role in mediating protective immune responses in man remains equivocal. This presentation will review the current state of knowledge of the structure and expression of the major antigenic tegumental proteins of the schistosome and will attempt to relate their structural features to possible functions, both in terms of protective immunity and of the parasite's ability to survive within the definitive host. A particular focus will be the recent advances made in identifying how the antigenic proteins are anchored to the tegumental membrane. In addition, the implications of the structural complexity of the tegumental proteins for their possible utility in vaccination and diagnosis will be considered.
Abstract:
BACKGROUND: Adequate pain assessment is critical for evaluating the efficacy of analgesic treatment in clinical practice and during the development of new therapies. Yet the currently used scores of global pain intensity fail to reflect the diversity of pain manifestations and the complexity of underlying biological mechanisms. We have developed a tool for a standardized assessment of pain-related symptoms and signs that differentiates pain phenotypes independent of etiology. METHODS AND FINDINGS: Using a structured interview (16 questions) and a standardized bedside examination (23 tests), we prospectively assessed symptoms and signs in 130 patients with peripheral neuropathic pain caused by diabetic polyneuropathy, postherpetic neuralgia, or radicular low back pain (LBP), and in 57 patients with non-neuropathic (axial) LBP. A hierarchical cluster analysis revealed distinct association patterns of symptoms and signs (pain subtypes) that characterized six subgroups of patients with neuropathic pain and two subgroups of patients with non-neuropathic pain. Using a classification tree analysis, we identified the most discriminatory assessment items for the identification of pain subtypes. We combined these six interview questions and ten physical tests in a pain assessment tool that we named Standardized Evaluation of Pain (StEP). We validated StEP for the distinction between radicular and axial LBP in an independent group of 137 patients. StEP identified patients with radicular pain with high sensitivity (92%; 95% confidence interval [CI] 83%-97%) and specificity (97%; 95% CI 89%-100%). The diagnostic accuracy of StEP exceeded that of a dedicated screening tool for neuropathic pain and spinal magnetic resonance imaging. In addition, we were able to reproduce subtypes of radicular and axial LBP, underscoring the utility of StEP for discerning distinct constellations of symptoms and signs. CONCLUSIONS: We present a novel method of identifying pain subtypes that we believe reflect underlying pain mechanisms. We demonstrate that this new approach to pain assessment helps separate radicular from axial back pain. Beyond diagnostic utility, a standardized differentiation of pain subtypes that is independent of disease etiology may offer a unique opportunity to improve targeted analgesic treatment.
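As a purely illustrative sketch of the kind of analysis described above (not the study's code or data), hierarchical clustering of standardized symptom and sign scores could be run as follows; the dummy data, Ward linkage and six-cluster cut are assumptions.

```python
# Illustrative only: hierarchical clustering of standardized symptom/sign
# scores into pain subtypes. Dummy data; Ward linkage and the six-cluster
# cut are assumptions, not details taken from the study.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(0)
scores = rng.integers(0, 4, size=(130, 39)).astype(float)  # patients x assessment items

Z = linkage(zscore(scores, axis=0), method="ward")   # agglomerative clustering tree
subtype = fcluster(Z, t=6, criterion="maxclust")     # cut the tree into 6 subgroups
print(np.bincount(subtype)[1:])                      # number of patients per subtype
```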
Abstract:
The ichthyoses are a heterogeneous group of monogenetically inherited disorders of cornification, characterized clinically by scaling or hyperkeratosis. Historically, they were classified by clinical features and inheritance patterns. As a result of the recent molecular biological revolution, the ichthyoses are now recognized as comprising many diverse entities. Importantly, identical phenotypes may be caused by mutations in multiple genes, while mutations in a single gene may result in multiple and sometimes widely divergent phenotypes. The considerable complexity of this clinically and genetically heterogeneous group of disorders has prompted the need for a new classification. A classification that uses terminology based on a combination of clinical and molecular genetic details, for instance loricrin keratoderma, is desirable. In this chapter we will, in principle, use the nosology recently adopted by an international group of experts at the First Ichthyosis Consensus Conference in Sorèze, France.
Abstract:
Introduction: In my thesis I argue that economic policy is all about economics and politics. Consequently, analysing and understanding economic policy ideally has at least two parts. The economics part is centered around the expected impact of a specific policy on the real economy, both in terms of efficiency and equity. The insights of this part indicate in which direction the fine-tuning of economic policies should go. However, the fine-tuning of economic policies will most likely be subject to political constraints. That is why, in the politics part, a much better understanding can be gained by taking into account how the incentives of politicians and special interest groups, as well as the role played by different institutional features, affect the formation of economic policies. The first part of my thesis, corresponding to the first chapter, concentrates on the efficiency-related impact of economic policies: how does corporate income taxation in general, and corporate income tax progressivity in particular, affect the creation of new firms? Reduced progressivity and flat-rate taxes are in vogue. By 2009, 22 countries were operating flat-rate income tax systems, as were 7 US states and 14 Swiss cantons (for corporate income only). Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. In joint work, Marius Brülhart and I explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births in a panel of Swiss municipalities. Our results show that lower average tax rates and reduced complexity of the tax code promote firm births. Controlling for these effects, reduced progressivity inhibits firm births. Our reading of these results is that tax progressivity has an insurance effect that facilitates entrepreneurial risk taking. The positive effects of lower tax levels and reduced complexity are estimated to be significantly stronger than the negative effect of reduced progressivity. To the extent that firm births reflect desirable entrepreneurial dynamism, it is not the flattening of tax schedules that is key to successful tax reforms, but the lowering of average tax burdens and the simplification of tax codes. Flatness per se is of secondary importance and even appears to be detrimental to firm births. The second part of my thesis, which corresponds to the second and third chapters, concentrates on how economic policies are formed. By the nature of the analysis, these two chapters draw on a broader literature than the first chapter. Both economists and political scientists have done extensive research on how economic policies are formed, and researchers in both disciplines have recognised the importance of special interest groups trying to influence policy-making through various channels. In general, economists base their analysis on a formal and microeconomically founded approach, while abstracting from institutional details. In contrast, political scientists' frameworks are generally richer in terms of institutional features but lack the theoretical rigour of economists' approaches. I start from the economist's point of view. However, I try to borrow as much as possible from the findings of political science to gain a better understanding of how economic policies are formed in reality.
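As a purely illustrative rendering of the first chapter's empirical exercise (firm births related to the three tax parameters), a hedged sketch might look as follows; the data file, variable names and the simple Poisson specification with year effects are hypothetical and not the authors' actual model.

```python
# Hedged sketch, not the authors' specification: a Poisson count regression of
# municipal firm births on the three tax parameters, with year fixed effects.
# File and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("municipality_panel.csv")  # hypothetical municipality-year panel

model = smf.poisson(
    "firm_births ~ avg_tax_rate + tax_progressivity + tax_code_complexity + C(year)",
    data=df,
).fit()
print(model.summary())
```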
In the second chapter, I take a theoretical approach and focus on the institutional policy framework to explore how interactions between different political institutions affect the outcome of trade policy in the presence of lobbying by special interest groups. Standard political economy theory treats the government as a single institutional actor which sets tariffs by trading off social welfare against contributions from special interest groups seeking industry-specific protection from imports. However, these models lack important (institutional) features of reality. That is why, in my model, I split up the government into a legislative and an executive branch, both of which can be lobbied by special interest groups. Furthermore, the legislature has the option to delegate its trade policy authority to the executive. I allow the executive to compensate the legislature in exchange for delegation. Despite ample anecdotal evidence, bargaining over the delegation of trade policy authority has not yet been formally modelled in the literature. I show that delegation has an impact on policy formation in that it leads to lower equilibrium tariffs compared to a standard model without delegation. I also show that delegation will only take place if the lobby is not strong enough to prevent it. Furthermore, the option to delegate increases the bargaining power of the legislature at the expense of the lobbies. Therefore, the findings of this model can shed light on why the U.S. Congress often delegates trade policy authority to the executive. In the final chapter of my thesis, my coauthor, Antonio Fidalgo, and I take a narrower approach and focus on policy-making at the level of the individual politician, exploring how connections to private firms and networks within parliament affect individual politicians' decision-making. Theories in the spirit of the model in the second chapter show how campaign contributions from lobbies to politicians can influence economic policies, and there exists an abundant empirical literature that analyses ties between firms and politicians based on campaign contributions. However, the evidence on the impact of campaign contributions is mixed, at best. In our paper, we analyse an alternative channel of influence in the shape of personal connections between politicians and firms through board membership. We identify a direct effect of board membership on individual politicians' voting behaviour and an indirect leverage effect when politicians with board connections influence non-connected peers. We assess the importance of these two effects using a vote in the Swiss parliament on a government bailout of the national airline, Swissair, in 2001, which serves as a natural experiment. We find that both the direct effect of connections to firms and the indirect leverage effect had a strong and positive impact on the probability that a politician supported the government bailout.
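A hedged sketch of how the direct and indirect (leverage) effects in the final chapter could in principle be estimated is given below; the logit specification, the peer-exposure measure and all names are illustrative assumptions rather than the authors' estimation strategy.

```python
# Hedged sketch, not the authors' estimation: a logit of the bailout vote on a
# direct board-connection dummy and a simple peer-exposure measure (share of a
# politician's parliamentary network that is board-connected). All names are
# hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

votes = pd.read_csv("parliament_votes.csv")  # hypothetical politician-level data

logit = smf.logit(
    "voted_yes ~ board_connected + share_connected_in_network + C(party)",
    data=votes,
).fit()
print(logit.params)
```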
Abstract:
Is there a link between decentralized governance and conflict prevention? This article tries to answer the question by presenting the state of the art at the intersection of the two concepts. Given that social conflict is inevitable, and given the appearance of new threats and types of violence as well as new demands for people-centred security (human security), our societies should focus on promoting peaceful change. Through an extensive analysis of the existing literature and the study of several cases, this paper suggests that decentralized governance can contribute to these efforts by transforming conflicts and by creating incentives for power-sharing and the inclusion of minority groups. Despite the complexity of assessing its impact on conflict prevention, it can be contended that decentralized governance might have very positive effects on the reduction of the causes that bring about conflicts, owing to its ability to foster the creation of war/violence preventors. More specifically, this paper argues that decentralization can have a positive impact on the so-called triggers and accelerators (short- and medium-term causes).
Abstract:
The study tested three analytic tools applied in SLA research (T-unit, AS-unit and Idea-unit) against FL learner monologic oral data. The objective was to analyse their effectiveness for the assessment of the complexity of learners' academic production in English. The data were learners' individual productions gathered during the implementation of a CLIL teaching sequence on Natural Sciences in a Catalan state secondary school. The analysis showed that only the AS-unit was easily applicable and highly effective in segmenting the data and obtaining complexity measures.
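Once the data are segmented into AS-units (and their clauses), common complexity measures reduce to simple ratios; the sketch below illustrates this with a made-up segmentation, not the study's coding scheme.

```python
# Illustrative only: complexity ratios computed from a hypothetical AS-unit
# segmentation; each unit is represented as (word count, clause count).
from statistics import mean

as_units = [(9, 1), (14, 2), (6, 1), (21, 3)]  # made-up segmentation of one monologue

mean_length_of_as_unit = mean(words for words, _ in as_units)
clauses_per_as_unit = mean(clauses for _, clauses in as_units)
print(mean_length_of_as_unit, clauses_per_as_unit)
```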
Abstract:
Applied topically to larvae of Rhodnius prolixus Stal, Triatoma infestans (Klug) and Panstrongylus herreri Wygodzinsky, vectors of Trypanosoma cruzi, the causative agent of Chagas' disease, a synthetic, furan-containing anti-juvenile hormonal compound, 2-(2-ethoxyethoxy)ethyl furfuryl ether, induced a variety of biomorphological alterations, including precocious metamorphosis into small adultoids with adult abdominal cuticle and ocelli, as well as rudimentary adultoid wings. Some adultoids died during ecdysis and remained confined within the old cuticle. The extent of these biomorphological responses is discussed in terms of the complexity of the action of anti-juvenile hormonal compounds during the development of triatomines.
Abstract:
Strategies to construct the physical map of the Trypanosoma cruzi nuclear genome have to capitalize on three main advantages of the parasite genome, namely (a) its small size, (b) the fact that all chromosomes can be defined, and many of them can be isolated, by pulsed-field gel electrophoresis, and (c) the fact that simple Southern blots of electrophoretic karyotypes can be used to map sequence-tagged sites and expressed sequence tags to chromosomal bands. A major drawback to cope with is the complexity of T. cruzi genetics, which hinders the construction of a comprehensive genetic map. As a first step towards physical mapping, we report the construction and partial characterization of a T. cruzi CL-Brener genomic library in yeast artificial chromosomes (YACs) that consists of 2,770 individual YACs with a mean insert size of 365 kb, encompassing around 10 genome equivalents. Two libraries in bacterial artificial chromosomes (BACs) have been constructed, BACI and BACII. Both libraries represent about three genome equivalents. A third BAC library (BAC III) is being constructed. YACs and BACs are invaluable tools for physical mapping. More generally, they should be considered a common resource for research on Chagas disease.
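The stated library depth can be checked with the usual coverage arithmetic (number of clones × mean insert size ÷ genome size); the ~100 Mb genome-size figure below is an assumption used only to illustrate the calculation, not a value from the abstract.

```python
# Worked coverage check; the genome size used here is an assumption for
# illustration, not a figure from the abstract.
n_yacs = 2770
mean_insert_kb = 365
genome_size_kb = 100_000               # assumed ~100 Mb for illustration

total_cloned_kb = n_yacs * mean_insert_kb      # = 1,011,050 kb
coverage = total_cloned_kb / genome_size_kb    # ≈ 10 genome equivalents
print(round(coverage, 1))
```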
Abstract:
Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence-environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence-environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building 'under fit' models, having insufficient flexibility to describe observed occurrence-environment relationships, we risk misunderstanding the factors shaping species distributions. By building 'over fit' models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species ranges.
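As a hedged, self-contained illustration of the under-/over-fitting trade-off discussed above (not the authors' analysis), the sketch below compares a linear-terms-only logistic SDM with a more flexible polynomial one by cross-validated AUC on simulated presence/absence data; the predictors, sample size and degree-3 choice are assumptions.

```python
# Hedged illustration of SDM complexity: simple vs. flexible logistic models
# compared by cross-validated AUC on simulated presence/absence data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))                          # e.g. temperature, rainfall, elevation
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 0] ** 2)))  # humped occurrence-environment response
y = rng.binomial(1, p)                                 # simulated presence/absence

simple = LogisticRegression(max_iter=1000)
flexible = make_pipeline(PolynomialFeatures(degree=3), LogisticRegression(max_iter=1000))

for name, mdl in [("simple", simple), ("flexible", flexible)]:
    auc = cross_val_score(mdl, X, y, cv=5, scoring="roc_auc").mean()
    print(name, round(auc, 3))
```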
Abstract:
Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization where countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw the connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.
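For orientation, the comparison notion referred to above can be stated as follows for the generalized Baire space with κ uncountable (using the κ-Borel structure); this is the standard textbook definition, not a claim about the paper's exact formulation.

```latex
% Standard definition of Borel reducibility in the generalized setting.
Let $E$ and $F$ be equivalence relations on the generalized Baire space
$\kappa^{\kappa}$. Then $E$ is \emph{Borel reducible} to $F$, written
$E \le_{B} F$, if there is a ($\kappa$-)Borel function
$f \colon \kappa^{\kappa} \to \kappa^{\kappa}$ such that for all
$x, y \in \kappa^{\kappa}$,
\[
  x \mathrel{E} y \iff f(x) \mathrel{F} f(y).
\]
```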
Abstract:
We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial-time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based purely on comparing, for every n, the number of nonisomorphic structures of cardinality at most n in the two classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
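One common formulation of the notion is given below for orientation; the paper's exact definition may differ in details such as the encoding of structures.

```latex
% One common formulation of strong isomorphism reducibility.
A class $C$ of structures is \emph{strongly isomorphism reducible} to a class
$D$, written $C \le_{\mathrm{iso}} D$, if there is a polynomial-time computable
function $f$ mapping (encodings of) structures in $C$ to structures in $D$ such
that for all $A, B \in C$,
\[
  A \cong B \iff f(A) \cong f(B).
\]
```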
Abstract:
In this paper we address the complexity of the analysis of water use in relation to the issue of sustainability. In fact, the flows of water on our planet represent a complex reality which can be studied using many different perceptions and narratives referring to different scales and dimensions of analysis. For this reason, a quantitative analysis of water use has to be based on analytical methods that are semantically open: they must be able to define what we mean by the term "water" when crossing different scales of analysis. We propose here a definition of water as a resource that deals with the many services it provides to humans and ecosystems. We argue that water can fulfil so many of them because the element has many characteristics that allow the resource to be labelled with different attributes depending on the end use, such as "drinkable". Since the services for humans and the functions for ecosystems associated with water flows are defined on different scales but are still interconnected, it is necessary to organize our assessment of water use across different hierarchical levels. In order to do so, we define how to approach the study of water use in the Societal Metabolism by proposing the Water Metabolism, organized in three levels: the societal level, the ecosystem level and the global level. The possible end uses we distinguish for the society are: personal/physiological use, household use and economic use. Organizing the study of "water use" across all these levels increases the usefulness of the quantitative analysis and the possibility of finding relevant and comparable results. To achieve this result, we adapted a method developed to deal with multi-level, multi-scale analysis - the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) approach - to the analysis of water metabolism. In this paper, we discuss the peculiar analytical identity that "water" shows within multi-scale metabolic studies: water represents a flow-element when considering the metabolism of social systems (at a small scale, when describing the water metabolism inside the society) and a fund-element when considering the metabolism of ecosystems (at a larger scale, when describing the water metabolism outside the society). The theoretical analysis is illustrated using two case studies which characterize the metabolic patterns regarding water use of a productive system in Catalonia and a water management policy in the Andarax River Basin in Andalusia.
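As a purely illustrative sketch of the kind of multi-level bookkeeping implied by this scheme (the levels and end-use categories follow the text; the figures and units are invented), flows could be recorded as follows.

```python
# Purely illustrative: water flows recorded per hierarchical level and end-use
# category; figures and units (hm3/year) are invented for illustration only.
flows_hm3 = {
    "societal": {
        "personal/physiological": 40,
        "household": 310,
        "economic": 2650,
    },
    "ecosystem": {
        "fund maintenance (rivers, aquifers, soil moisture)": 5200,
    },
}

societal_total = sum(flows_hm3["societal"].values())
print(societal_total)  # total societal water use at this level of analysis
```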