926 results for Analytical Ultracentrifugation
Abstract:
Experts working on behalf of international development organisations need better tools to help land managers in developing countries maintain their livelihoods as climate change puts pressure on the ecosystem services they depend upon. However, current understanding of livelihood vulnerability to climate change is based on a fractured and disparate set of theories and methods. This review therefore combines theoretical insights from sustainable livelihoods analysis with other analytical frameworks (including the ecosystem services framework, diffusion theory, social learning, adaptive management and transitions management) to assess the vulnerability of rural livelihoods to climate change. This integrated analytical framework helps diagnose vulnerability to climate change, whilst identifying and comparing adaptation options that could reduce vulnerability, following four broad steps: i) determine the likely level of exposure to climate change, and how climate change might interact with existing stresses and other future drivers of change; ii) determine the sensitivity of stocks of capital assets and flows of ecosystem services to climate change; iii) identify factors influencing decisions to develop and/or adopt different adaptation strategies, based on innovation or the use/substitution of existing assets; and iv) identify and evaluate potential trade-offs between adaptation options. The paper concludes by identifying interdisciplinary research needs for assessing the vulnerability of livelihoods to climate change.
Abstract:
It is widely known that people utter untruths from time to time, yet our ability to detect deception and, in a next step, respond appropriately is limited. We were interested in whether distrust triggers non-routine, analytical thought processes and thereby improves the detection of lies without leading us to dismiss the truth. We conducted two experiments to investigate the influence of an unconscious form of distrust on thought processes. In the first experiment, participants had to determine whether a report was truthful or false. The second experiment investigated whether this enhanced ability to correctly identify a falsified report rests on analytical thinking; to examine this assumption, we applied the belief-bias paradigm. The results of the second experiment strongly indicate that an unconscious form of distrust triggers and fosters analytical thinking.
Abstract:
This paper introduces a mobile application (app) as the first part of an interactive framework. The framework enhances the interaction between cities and their citizens, introducing the Fuzzy Analytical Hierarchy Process (FAHP) as a potential information acquisition method to improve existing citizen management endeavors for cognitive cities. Citizen management is enhanced by advanced visualization using Fuzzy Cognitive Maps (FCM). The presented app takes fuzziness into account in the constant interaction and continuous development of communication between cities, or certain of their entities (e.g., the tax authority), and their citizens. A transportation use case is implemented for didactic reasons.
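The abstract does not give the FAHP computation itself; a minimal sketch, assuming triangular fuzzy numbers, centroid defuzzification, and geometric-mean weighting (one common FAHP variant), with illustrative criteria not taken from the paper's transportation use case:

```python
from math import prod

# Pairwise comparisons of three hypothetical criteria (cost, time, comfort);
# each judgment is a triangular fuzzy number (low, mode, high) capturing
# linguistic uncertainty in the citizen's preferences.
tfn_matrix = [
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    low, mode, high = tfn
    return (low + mode + high) / 3.0

def fuzzy_ahp_weights(matrix):
    """Normalized criteria weights via row geometric means of the
    defuzzified pairwise-comparison matrix."""
    n = len(matrix)
    crisp = [[defuzzify(cell) for cell in row] for row in matrix]
    geo = [prod(row) ** (1.0 / n) for row in crisp]
    total = sum(geo)
    return [g / total for g in geo]
```

Calling `fuzzy_ahp_weights(tfn_matrix)` returns weights that sum to one, with the dominant criterion ranked first.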
Abstract:
Within the context of exoplanetary atmospheres, we present a comprehensive linear analysis of forced, damped, magnetized shallow water systems, exploring the effects of dimensionality, geometry (Cartesian, pseudo-spherical, and spherical), rotation, magnetic tension, and hydrodynamic and magnetic sources of friction. Across a broad range of conditions, we find that the key governing equations for atmospheres and for the quantum harmonic oscillator are identical, even when forcing (stellar irradiation), sources of friction (molecular viscosity, Rayleigh drag, and magnetic drag), and magnetic tension are included. The global atmospheric structure is largely controlled by a single key parameter that involves the Rossby and Prandtl numbers. This near-universality breaks down when either molecular viscosity or magnetic drag acts non-uniformly across latitude or a poloidal magnetic field is present, suggesting that these effects will introduce qualitative changes to the familiar chevron-shaped feature witnessed in simulations of atmospheric circulation. We also find that hydrodynamic and magnetic sources of friction have dissimilar phase signatures and affect the flow in fundamentally different ways, implying that using Rayleigh drag to mimic magnetic drag is inaccurate. We exhaustively lay down the theoretical formalism (dispersion relations, governing equations, and time-dependent wave solutions) for a broad suite of models. In all situations, we derive the steady state of an atmosphere, which is relevant to interpreting infrared phase and eclipse maps of exoplanetary atmospheres. We elucidate a pinching effect that confines the atmospheric structure to be near the equator. Our suite of analytical models may be used to develop physical intuition and to serve as a reference point for three-dimensional magnetohydrodynamic simulations of atmospheric circulation.
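The correspondence with the quantum harmonic oscillator can be made concrete with the classic unforced, frictionless equatorial beta-plane shallow-water system (Matsuno 1966), which the analysis above generalizes: the meridional velocity amplitude obeys

```latex
\frac{d^2 \tilde{v}}{dy^2} + \left( \lambda - y^2 \right) \tilde{v} = 0,
```

which is formally identical to the time-independent Schrödinger equation of a harmonic oscillator. The solutions are Hermite functions, $\tilde{v}_n \propto H_n(y)\, e^{-y^2/2}$, with the quantization condition $\lambda = 2n + 1$ selecting discrete meridional modes; here $\lambda$ collects the frequency and zonal-wavenumber dependence, and in the generalized treatment it absorbs the friction and magnetic terms as well.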
Abstract:
We present a comprehensive analytical study of radiative transfer using the method of moments and include the effects of non-isotropic scattering in the coherent limit. Within this unified formalism, we derive the governing equations and solutions describing two-stream radiative transfer (which approximates the passage of radiation as a pair of outgoing and incoming fluxes), flux-limited diffusion (which describes radiative transfer in the deep interior) and solutions for the temperature-pressure profiles. Generally, the problem is mathematically under-determined unless a set of closures (Eddington coefficients) is specified. We demonstrate that the hemispheric (or hemi-isotropic) closure naturally derives from the radiative transfer equation if energy conservation is obeyed, while the Eddington closure produces spurious enhancements of both reflected light and thermal emission. We concoct recipes for implementing two-stream radiative transfer in stand-alone numerical calculations and general circulation models. We use our two-stream solutions to construct toy models of the runaway greenhouse effect. We present a new solution for temperature-pressure profiles with a non-constant optical opacity and elucidate the effects of non-isotropic scattering in the optical and infrared. We derive generalized expressions for the spherical and Bond albedos and the photon deposition depth. We demonstrate that the value of the optical depth corresponding to the photosphere is not always 2/3 (Milne's solution) and depends on a combination of stellar irradiation, internal heat and the properties of scattering both in optical and infrared. Finally, we derive generalized expressions for the total, net, outgoing and incoming fluxes in the convective regime.
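For reference, the classic gray result that the $\tau = 2/3$ statement generalizes is Milne's solution under the Eddington approximation, in which the temperature profile of a self-luminous atmosphere is

```latex
T^4(\tau) = \frac{3}{4}\, T_{\mathrm{eff}}^4 \left( \tau + \frac{2}{3} \right),
```

so that $T(\tau = 2/3) = T_{\mathrm{eff}}$ and the photosphere sits at an optical depth of $2/3$. The generalized solutions described above show that stellar irradiation, internal heat, and non-isotropic scattering shift this photospheric optical depth away from $2/3$.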
Abstract:
The core issues comparative territorial politics addresses are how and why territory is used to delimit, maintain, or create political power; and with what kind of consequences for efficiency (output) and legitimacy (input). The aim of this article is to integrate various research strands into the comparative study of territorial politics, with federal studies at its core. As an example of a conceptual payoff, ‘political territoriality’ directs the observer to three dimensions of the strategic use of areal boundaries for political power. By focusing on territory as a key variable of political systems, the actors, processes and institutions are first analytically separated and continuously measured, enhancing internal validity, and then theoretically integrated, which allows more valid external inferences than classic, legal-institutionalist federal studies. After discussing the boundaries and substance of comparative territorial politics as a federal discipline, political territoriality is developed into an analytical framework applicable to politics at any governmental level. The claims are modest: political territoriality serves not so much as an explanatory concept but rather as an ‘attention-directing device’ for federal studies.
Abstract:
The fuzzy analytical network process (FANP) is introduced as a potential multi-criteria-decision-making (MCDM) method to improve digital marketing management endeavors. Today’s information overload makes digital marketing optimization, which is needed to continuously improve one’s business, increasingly difficult. The proposed FANP framework is a method for enhancing the interaction between customers and marketers (i.e., the involved stakeholders) and thus for reducing the challenges of big data. The presented implementation takes the fuzziness of reality into account to manage the constant interaction and continuous development of communication between marketers and customers on the Web. Using this FANP framework, marketers are able to better meet the varying requirements of their customers. To improve the understanding of the implementation, advanced visualization methods (e.g., wireframes) are used.
Abstract:
Oligonucleotides comprising unnatural building blocks, which interfere with the translation machinery, have gained increased attention for the treatment of gene-related diseases (e.g. antisense, RNAi). Due to structural modifications, synthetic oligonucleotides exhibit increased biostability and bioavailability upon administration. Consequently, classical enzyme-based sequencing methods are not applicable to their sequence elucidation and verification. Tandem mass spectrometry is the method of choice for performing such tasks, since gas-phase dissociation is not restricted to natural nucleic acids. However, tandem mass spectrometric analysis can generate product ion spectra of tremendous complexity, as the number of possible fragments grows rapidly with increasing sequence length. The fact that structural modifications affect the dissociation pathways greatly increases the variety of analytically valuable fragment ions. The gas-phase dissociation of oligonucleotides is characterized by the cleavage of one of the four bonds along the phosphodiester chain, by the accompanying loss of nucleobases, and by the generation of internal fragments due to secondary backbone cleavage. For example, an 18-mer oligonucleotide yields a total of 272,920 theoretical fragment ions. In contrast to the processing of peptide product ion spectra, which nowadays is highly automated, there is a lack of tools assisting the interpretation of oligonucleotide data. The existing web-based and stand-alone software applications are primarily designed for the sequence analysis of natural nucleic acids, but do not account for chemical modifications and adducts. Consequently, we developed a software tool to support the interpretation of mass spectrometric data of natural and modified nucleic acids and their adducts with chemotherapeutic agents.
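The combinatorial growth described above can be sketched in code. The counting rules below (eight terminal ion series, sixteen bond-type combinations for internal fragments) are a simplified assumption and deliberately omit the base losses, adducts, and charge states that drive the paper's 18-mer figure of 272,920:

```python
def terminal_fragment_count(n):
    """Terminal fragments: 8 ion series (a/b/c/d from the 5' end,
    w/x/y/z from the 3' end), one per backbone cleavage site."""
    return 8 * (n - 1)

def internal_fragment_count(n):
    """Internal fragments from two backbone cleavages: each pair of
    distinct sites, times 4 x 4 bond-type combinations."""
    sites = n - 1
    pairs = sites * (sites - 1) // 2
    return 16 * pairs

# Growth with sequence length n under this simplified model.
counts = {n: terminal_fragment_count(n) + internal_fragment_count(n)
          for n in (6, 12, 18)}
```

Even this simplified count grows quadratically with sequence length; layering nucleobase losses, adducts, and multiple charge states onto each fragment multiplies it toward the paper's 272,920.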
Abstract:
Societies develop ways of making decisions regarding collective problems, thereby creating norms, rules, and institutions; this is what governance is about. In policy research, governance has become an important focus of attention; but debates show a lack of clarity at the conceptual level and a confusion between the use of the concept for prescriptive and analytical purposes. The present article is based on the hypothesis that using a clarified, non-normative governance perspective in policy research can contribute to an improved understanding of political processes, including formal and unrecognised ones, those embedded in larger and smaller social systems, as well as both vertical and horizontal political arrangements. The paper is the result of a collaborative engagement with the concept of governance within several networks, leading to the development of the Governance Analytical Framework (GAF). The GAF is a practical methodology for investigating governance processes, based on five analytical tools: problems, actors, social norms, processes, and nodal points. Besides describing the conceptual sources and analytical purpose of these five tools, the paper presents examples of how the GAF can be operationalised.
Abstract:
OBJECTIVE To validate a radioimmunoassay for measurement of procollagen type III amino terminal propeptide (PIIINP) concentrations in canine serum and bronchoalveolar lavage fluid (BALF) and investigate the effects of physiologic and pathologic conditions on PIIINP concentrations. SAMPLE POPULATION Sera from healthy adult (n = 70) and growing dogs (20) and dogs with chronic renal failure (CRF; 10), cardiomyopathy (CMP; 12), or degenerative valve disease (DVD; 26); and sera and BALF from dogs with chronic bronchopneumopathy (CBP; 15) and healthy control dogs (10 growing and 9 adult dogs). PROCEDURE A radioimmunoassay was validated, and a reference range for serum PIIINP (S-PIIINP) concentration was established. Effects of growth, age, sex, weight, CRF, and heart failure on S-PIIINP concentration were analyzed. In CBP-affected dogs, S-PIIINP and BALF-PIIINP concentrations were evaluated. RESULTS The radioimmunoassay had good sensitivity, linearity, precision, and reproducibility and reasonable accuracy for measurement of S-PIIINP and BALF-PIIINP concentrations. The S-PIIINP concentration reference range in adult dogs was 8.86 to 11.48 µg/L. Serum PIIINP concentration correlated with weight and age. Growing dogs had significantly higher S-PIIINP concentrations than adults, but concentrations in CRF-, CMP-, DVD-, or CBP-affected dogs were not significantly different from control values. Mean BALF-PIIINP concentration was significantly higher in CBP-affected dogs than in healthy adults. CONCLUSIONS AND CLINICAL RELEVANCE In dogs, renal or cardiac disease or CBP did not significantly affect S-PIIINP concentration; dogs with CBP had high BALF-PIIINP concentrations. Data suggest that the use of PIIINP as a marker of pathologic fibrosis might be limited in growing dogs.
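The abstract does not state how the S-PIIINP reference range was derived; a common parametric choice (mean ± 1.96 SD for a 95% interval) can be sketched with invented values:

```python
from statistics import mean, stdev

def reference_interval(values, z=1.96):
    """Parametric 95% reference interval: mean ± z * sample SD.
    One common approach; the study's actual method is not given
    in the abstract."""
    m, s = mean(values), stdev(values)
    return m - z * s, m + z * s

# Illustrative serum PIIINP values (µg/L) for healthy adult dogs;
# invented numbers, not the study's data.
sample = [9.1, 10.2, 9.8, 10.9, 10.4, 9.5, 11.0, 10.1, 9.9, 10.6]
low, high = reference_interval(sample)
```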
Abstract:
Background: The individual risk of developing psychosis after being tested for clinical high-risk (CHR) criteria (posttest risk of psychosis) depends on the underlying risk of the disease in the population from which the person is selected (pretest risk of psychosis), and thus on recruitment strategies. Yet, the impact of recruitment strategies on pretest risk of psychosis is unknown. Methods: Meta-analysis of the pretest risk of psychosis in help-seeking patients selected to undergo CHR assessment: total transitions to psychosis over the pool of patients assessed for potential risk and deemed at risk (CHR+) or not at risk (CHR−). Recruitment strategies (number of outreach activities per study, main target of outreach campaign, and proportion of self-referrals) were the moderators examined in meta-regressions. Results: 11 independent studies met the inclusion criteria, for a total of 2519 (CHR+: n = 1359; CHR−: n = 1160) help-seeking patients undergoing CHR assessment (mean follow-up: 38 months). The overall meta-analytical pretest risk for psychosis in help-seeking patients was 15%, with high heterogeneity (95% CI: 9%–24%, I² = 96%, P < .001). Recruitment strategies were heterogeneous and opportunistic. Heterogeneity was largely explained by intensive (n = 11, β = −.166, Q = 9.441, P = .002) outreach campaigns primarily targeting the general public (n = 11, β = −1.15, Q = 21.35, P < .001) along with higher proportions of self-referrals (n = 10, β = −.029, Q = 4.262, P = .039), which diluted pretest risk for psychosis in patients undergoing CHR assessment. Conclusions: There is meta-analytical evidence for overall risk enrichment (pretest risk for psychosis at 38 months = 15%) in help-seeking samples selected for CHR assessment as compared to the general population (pretest risk of psychosis at 38 months = 0.1%). Intensive outreach campaigns predominantly targeting the general population and a higher proportion of self-referrals diluted the pretest risk for psychosis.
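The pretest-risk definition used by the meta-analysis (transitions to psychosis over the full assessed pool, CHR+ plus CHR−) is simple to state in code; the transition count below is illustrative, back-computed to match the reported 15% estimate given the abstract's sample sizes:

```python
def pretest_risk(transitions, n_chr_pos, n_chr_neg):
    """Pretest risk of psychosis: total transitions to psychosis over
    the full pool assessed (both CHR+ and CHR- patients), per the
    meta-analysis definition."""
    return transitions / (n_chr_pos + n_chr_neg)

# Sample sizes from the abstract (CHR+: 1359, CHR-: 1160); the
# transition count 378 is illustrative, chosen to reproduce the
# reported 15% overall estimate.
risk = pretest_risk(378, 1359, 1160)
```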
Abstract:
This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the central bank should adopt a loss function that differs from the social loss function. Carefully designing the central bank's loss function with consistent targets can harmonize optimal and consistent policy. This desirable result emerges from two observations. First, the social loss function reflects a normative process that does not necessarily prove consistent with the structure of the microeconomy. Thus, the social loss function cannot serve as a direct loss function for the central bank. Second, an optimal loss function for the central bank must depend on the structure of that microeconomy. In addition, this paper shows that control theory provides a benchmark for institution design in a game-theoretical framework.
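The gap between optimal and consistent policy can be illustrated with the textbook Barro-Gordon model (a stand-in, not this paper's model): with social loss L = π² + λ(y − y*)², Phillips curve y = ȳ + (π − πᵉ), and output target y* = ȳ + k above the natural rate, the consistent (discretionary) outcome carries an inflation bias that commitment avoids; this is the kind of gap a redesigned central-bank loss function can close:

```python
def discretionary_inflation(lam, k):
    """Consistent (time-consistent) policy in the textbook Barro-Gordon
    model: minimizing pi**2 + lam * (y - y_target)**2 subject to
    y = y_nat + (pi - pi_expected), with y_target = y_nat + k, gives
    pi = lam * k once expectations catch up (pi_expected = pi):
    an inflation bias with no output gain."""
    return lam * k

def optimal_inflation():
    """Optimal (committed) policy: zero inflation, since fully
    anticipated inflation cannot raise output."""
    return 0.0

# The inflation bias of consistent relative to optimal policy.
bias = discretionary_inflation(0.5, 2.0) - optimal_inflation()
```

Delegating to a central bank whose loss function uses a consistent output target (e.g., k = 0 in its own loss) removes the bias, mirroring the paper's argument that the bank's loss function should differ from the social one.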