960 results for Syntactic Projection
Abstract:
Addressing the many challenges in marine policy requires an integrated approach that considers the multitude of drivers, pressures, and interests from several disciplinary angles. Scenarios are needed to harmonise the analyses of different components of the marine system and to deal with the uncertainty and complexity of the societal and biogeophysical dynamics in the system. This study considers a set of socio-economic scenarios to (1) explore possible futures in relation to marine invasive species, outbreak-forming species, and gradual changes in species distribution and productivity; and (2) harmonise the projection modelling performed within associated studies. The exercise demonstrates that developing interdisciplinary scenarios of this kind is particularly complicated because of (1) the wide variation in the endogeneity or exogeneity of variables across the different analyses involved; (2) the dual role of policy decisions as either variables within a scenario or decisions to be evaluated and compared with other decisions; and (3) the substantial difference in time scale between societal and physical drivers.
Abstract:
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates as well as consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuels and industry (EFF) are based on energy statistics and cement production data, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover-change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2, and land-cover change (some including nitrogen–carbon interactions). We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands. All uncertainties are reported as ±1σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2005–2014), EFF was 9.0 ± 0.5 GtC yr−1, ELUC was 0.9 ± 0.5 GtC yr−1, GATM was 4.4 ± 0.1 GtC yr−1, SOCEAN was 2.6 ± 0.5 GtC yr−1, and SLAND was 3.0 ± 0.8 GtC yr−1. For the year 2014 alone, EFF grew to 9.8 ± 0.5 GtC yr−1, 0.6 % above 2013, continuing the growth trend in these emissions, albeit at a slower rate compared to the average growth of 2.2 % yr−1 that took place during 2005–2014. Also, for 2014, ELUC was 1.1 ± 0.5 GtC yr−1, GATM was 3.9 ± 0.2 GtC yr−1, SOCEAN was 2.9 ± 0.5 GtC yr−1, and SLAND was 4.1 ± 0.9 GtC yr−1. GATM was lower in 2014 compared to the past decade (2005–2014), reflecting a larger SLAND for that year. The global atmospheric CO2 concentration reached 397.15 ± 0.10 ppm averaged over 2014. For 2015, preliminary data indicate that the growth in EFF will be near or slightly below zero, with a projection of −0.6 [range of −1.6 to +0.5] %, based on national emissions projections for China and the USA, and projections of gross domestic product corrected for recent changes in the carbon intensity of the global economy for the rest of the world. From this projection of EFF and assumed constant ELUC for 2015, cumulative emissions of CO2 will reach about 555 ± 55 GtC (2035 ± 205 GtCO2) for 1870–2015, about 75 % from EFF and 25 % from ELUC. This living data update documents changes in the methods and data sets used in this new carbon budget compared with previous publications of this data set (Le Quéré et al., 2015, 2014, 2013). 
All observations presented here can be downloaded from the Carbon Dioxide Information Analysis Center (doi:10.3334/CDIAC/GCP_2015).
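As a worked check (our gloss, not part of the original abstract), the budget identity implied above balances for the 2005–2014 decadal means within the stated uncertainties:
\[ E_{\mathrm{FF}} + E_{\mathrm{LUC}} = G_{\mathrm{ATM}} + S_{\mathrm{OCEAN}} + S_{\mathrm{LAND}}, \qquad 9.0 + 0.9 \approx 4.4 + 2.6 + 3.0 \ \ \mathrm{GtC\,yr^{-1}}. \]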
Abstract:
Pyramidal neurons (PyNs) in ‘higher’ brain regions are highly susceptible to acute stroke injury, yet ‘lower’ brain regions survive global ischemia better, presumably because of better residual blood flow. Here we show that projection neurons in the ‘lower’ brain regions of hypothalamus and brainstem intrinsically resist acute stroke-like injury independent of blood flow in the brain slice. In contrast, ‘higher’ projection neurons in neocortex, hippocampus, striatum and thalamus are highly susceptible. In live rat brain slices deprived of oxygen and glucose (OGD), we imaged anoxic depolarization (AD) as it propagated through these regions. AD, the initial electrophysiological event of stroke, is a depolarizing front that drains residual energy in compromised gray matter. The extent of AD reliably determines ensuing damage in higher brain regions, but using whole-cell recordings we found that not all CNS neurons generate a robust AD. Higher neurons generate strong AD and show no functional recovery, in contrast to neurons in hypothalamus and brainstem, which generate a weak and gradual AD. Most dramatically, lower neurons recover their membrane potential, input resistance and spike amplitude when oxygen and glucose are restored, while higher neurons do not. Following OGD, new recordings could be acquired in all lower (but not higher) brain regions, with some neurons even withstanding multiple OGD exposures. Two-photon laser scanning microscopy confirmed neuroprotection in lower, but not higher, gray matter. Specifically, pyramidal neurons swell and lose their dendritic spines post-OGD, whereas neurons in hypothalamus and brainstem display no such injury. Exposure to the Na+/K+ ATPase inhibitor ouabain (100 μM) induces depolarization similar to OGD in all cell types tested. Moreover, elevated [K+]o evokes spreading depression (SD), a milder version of AD, in higher brain regions but not in hypothalamus or brainstem, so weak AD correlates with the inability to generate SD. In summary, overriding the Na+/K+ pump using OGD, ouabain or elevated [K+]o evokes steep and robust depolarization of higher gray matter. We show that this important regional difference can be largely accounted for by the intrinsic properties of the resident neurons, and that Na+/K+ ATPase pump efficiency is a major factor determining whether spreading depolarizations are strong or weak.
Abstract:
We study properties of subspace lattices related to the continuity of the map Lat and the notion of reflexivity. We characterize various “closedness” properties in different ways and give the hierarchy between them. We investigate several properties related to tensor products of subspace lattices and show that the tensor product of the projection lattices of two von Neumann algebras, one of which is injective, is reflexive.
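For readers outside operator theory, the standard definitions assumed here (our gloss, not the authors' wording) are: for a subspace lattice \(\mathcal{L}\) on a Hilbert space \(H\),
\[ \operatorname{Alg}\mathcal{L} = \{ T \in B(H) : T L \subseteq L \ \text{for all } L \in \mathcal{L} \}, \qquad \operatorname{Lat}\mathcal{A} = \{ L : T L \subseteq L \ \text{for all } T \in \mathcal{A} \}, \]
and \(\mathcal{L}\) is called reflexive when \(\operatorname{Lat}\operatorname{Alg}\mathcal{L} = \mathcal{L}\).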
Self-consistent non-Markovian theory of a quantum-state evolution for quantum-information processing
Abstract:
We study non-Markovian decoherence phenomena by employing the projection-operator formalism when a quantum system (a quantum bit or a register of quantum bits) is coupled to a reservoir. By projecting out the degrees of freedom of the reservoir, we derive a non-Markovian master equation for the system, which reduces to a Lindblad master equation in the Markovian limit, and obtain the operator-sum representation for the time evolution. It is found that the system decoheres more slowly in the non-Markovian reservoir than in the Markovian one because the quantum information of the system is memorized in the non-Markovian reservoir. We discuss the potential importance of non-Markovian reservoirs for quantum-information processing.
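For context on the Markovian limit mentioned above, the Lindblad form to which such a master equation reduces is the standard one (a reminder, not the paper's derivation), with \(L_k\) denoting the jump operators:
\[ \frac{d\rho}{dt} = -\frac{i}{\hbar}[H, \rho] + \sum_k \left( L_k \rho L_k^{\dagger} - \tfrac{1}{2}\{ L_k^{\dagger} L_k, \rho \} \right). \]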
Abstract:
A comparative study of high harmonic generation (HHG) by atoms and ions with active p-electrons is carried out in the theoretical framework of the rescattering mechanism. The substate with m_l = 0, i.e. zero orbital angular momentum projection along the electric vector of a linearly polarized laser wave, is found to give the major contribution to the HHG rate. Our calculations for HHG by an H atom in an excited 2p-state demonstrate that the rate for recombination into a final state with a different value of m_l (= ±1) is higher for lower harmonic orders N, while for higher N (beyond the plateau domain) the difference vanishes. For species with closed electron shells, the m_l-changing transitions are forbidden by the Pauli exclusion principle. We report absolute HHG rates for halogen ions and noble gas atoms at various intensities. These results demonstrate that the Coulomb binding potential of the atoms considerably enhances both the ionization and recombination steps in the rescattering process. However, the weak binding energy of the anions allows lower orders of HHG to be efficiently produced at relatively low intensities, from which we conclude that observation of HHG by an anion is experimentally feasible.
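For orientation, the plateau referred to above is conventionally bounded in the rescattering (three-step) picture by the standard cutoff law (background, not a result of this paper):
\[ E_{\mathrm{max}} \approx I_p + 3.17\, U_p, \qquad U_p = \frac{e^2 E_0^2}{4 m \omega^2}, \]
where \(I_p\) is the binding (ionization) energy of the active electron and \(U_p\) the ponderomotive energy in a laser field of amplitude \(E_0\) and frequency \(\omega\).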
Abstract:
Protons accelerated by a picosecond laser pulse have been used to radiograph a 500 μm diameter capsule, imploded with 300 J of laser light in 6 symmetrically incident beams of wavelength 1.054 μm and pulse length 1 ns. Point-projection proton backlighting was used to characterize the density gradients at discrete times through the implosion. Asymmetries were diagnosed during both the early and stagnation stages of the implosion. Comparisons with analytic scattering theory and simple Monte Carlo simulations were consistent with a 3 ± 1 g/cm³ core with a diameter of 85 ± 10 μm. Scaling simulations show that protons > 50 MeV are required to diagnose asymmetry in ignition-scale conditions.
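As background on the analytic scattering estimate mentioned above (a standard relation of the kind such comparisons rely on, not necessarily the authors' exact expression): the rms multiple Coulomb scattering angle for a proton of momentum \(p\) and velocity \(\beta c\) traversing a thickness \(x\) of material with radiation length \(X_0\) is commonly taken as
\[ \theta_0 \approx \frac{13.6\ \mathrm{MeV}}{\beta c p}\, \sqrt{\frac{x}{X_0}}\left[ 1 + 0.038 \ln\!\frac{x}{X_0} \right], \]
so the broadening of the probe beam by the dense core constrains the areal density of the imploded capsule.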
Abstract:
Selection power is taken as the fundamental value for information retrieval systems. Selection power is regarded as produced by selection labor, which itself separates historically into description and search labor. As forms of mental labor, description and search labor participate in the conditions for labor and for mental labor. Concepts and distinctions applicable to physical and mental labor are indicated, introducing the necessity of labor for survival, the idea of technology as a human construction, and the possibility of the transfer of human labor to technology. Distinctions specific to mental labor, particularly between semantic and syntactic labor, are introduced. Description labor is exemplified by cataloging, classification, and database description; it can be more formally understood as the labor involved in transforming objects for description into searchable descriptions, and is also understood to include interpretation. The costs of description labor are discussed. Search labor is conceived as the labor expended in searching systems. For both description and search labor, there has been a progressive reduction in direct human labor, with its syntactic aspects transferred to technology, effectively compelled by the high relative costs of direct human labor compared to machine processes.
Abstract:
The International Brigades (IB) are typically viewed as a fighting force whose impetus came from the Comintern, and thus from within the walls of the Kremlin. While that assumption is essentially correct, the broader relationship between Stalin’s USSR and the IB has received little attention. This chapter constitutes an empirically based study of the Soviet role not only in the formation of the IB, but also in the Red Army’s collaboration with IB units and in Moscow’s part in the climax and denouement of the brigadistas’ Spanish experience. The study’s principal conclusion is twofold: first, that the creation and sustenance of the IB was part of Stalin’s goal of linking the Loyalist cause with that of the Soviet Union and international communism, a component of a larger geo-strategic gamble that sought to create united opposition to the fascist menace, one which might eventually bring Moscow and the West into a closer alliance; and second, that the IB, like the broader projection of Soviet power and influence into the Spanish theater, was an overly ambitious operational failure whose abortive retreat is indicative of the basic weakness of the Stalinist regime in the years before the Second World War.
Abstract:
Application of a parallel-projection inversion technique to z-scan spectra of multiply charged xenon and krypton ions, obtained by non-resonant field ionization of neutral targets, has for the first time permitted the direct observation of intensity-dependent ionization probabilities. These ionization efficiency curves have highlighted structure in the tunnelling regime that had previously gone unobserved with full-volume techniques.
Abstract:
Information retrieval in the age of Internet search engines has become part of ordinary discourse and everyday practice: "Google" is a verb in common usage. Thus far, more attention has been given to practical understanding of information retrieval than to a full theoretical account. In Human Information Retrieval, Julian Warner offers a comprehensive overview of information retrieval, synthesizing theories from different disciplines (information and computer science, librarianship and indexing, and information society discourse) and incorporating such disparate systems as WorldCat and Google into a single, robust theoretical framework. There is a need for such a theoretical treatment, he argues, one that reveals the structure and underlying patterns of this complex field while remaining congruent with everyday practice. Warner presents a labor theoretic approach to information retrieval, building on his previously formulated distinction between semantic and syntactic mental labor, arguing that the description and search labor of information retrieval can be understood as both semantic and syntactic in character. Warner's information science approach is rooted in the humanities and the social sciences but informed by an understanding of information technology and information theory. The chapters offer a progressive exposition of the topic, with illustrative examples to explain the concepts presented. Neither narrowly practical nor largely speculative, Human Information Retrieval meets the contemporary need for a broader treatment of information and information systems.
Abstract:
Let M be the Banach space of sigma-additive complex-valued measures on an abstract measurable space. We prove that any linear subspace L of M that is norm-closed and closed with respect to absolute continuity is complemented, and we describe the unique complement along which the projection onto L has norm 1. Using this fact we prove a decomposition theorem, which includes the Jordan decomposition theorem, the generalized Radon-Nikodym theorem and the decomposition of measures into decaying and non-decaying components as particular cases. We also prove an analog of the Jessen-Wintner purity theorem for our decompositions.
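A familiar particular case of such a decomposition (our illustration, not the authors' statement): fixing a reference measure \(\nu\), the Lebesgue decomposition writes each \(\mu \in M\) uniquely as
\[ \mu = \mu_{ac} + \mu_{s}, \qquad \mu_{ac} \ll \nu, \quad \mu_{s} \perp \nu, \qquad \|\mu\| = \|\mu_{ac}\| + \|\mu_{s}\|, \]
and \(\mu \mapsto \mu_{ac}\) is then a norm-one projection onto the subspace of \(\nu\)-absolutely continuous measures.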
Abstract:
In this paper, by investigating the influence of source/drain extension region engineering (also known as gate-source/drain underlap) in nanoscale planar double gate (DG) SOI MOSFETs, we offer new insights into the design of future nanoscale gate-underlap DG devices to achieve ITRS projections for high performance (HP), low standby power (LSTP) and low operating power (LOP) logic technologies. The impact of high-κ gate dielectric and silicon film thickness, together with parameters associated with the lateral source/drain doping profile, is investigated in detail. The results show that spacer width, along with lateral straggle, can not only effectively control short-channel effects, thus giving low off-current in a gate-underlap device, but can also be optimized to achieve lower intrinsic delay and a higher on-off current ratio (I_on/I_off). Based on the investigation of on-current (I_on), off-current (I_off), I_on/I_off, intrinsic delay (τ), energy-delay product and static power dissipation, we present design guidelines for selecting key device parameters to achieve ITRS projections. Using nominal gate lengths for different technologies, as recommended by the ITRS specification, optimally designed gate-underlap DG MOSFETs with a spacer-to-straggle (s/σ) ratio of 2.3 for HP/LOP and 3.2 for LSTP logic technologies will meet ITRS projections. However, a relatively narrow range of lateral straggle, between 7 and 8 nm, is recommended. A sensitivity analysis of intrinsic delay, on-current and off-current to important parameters allows a comparative analysis of the various design options and shows that gate workfunction appears to be the most crucial parameter in the design of DG devices for all three technologies. The impact of back-gate misalignment on I_on, I_off and τ is also investigated for optimized underlap devices.
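For reference, the switching metrics cited above are conventionally defined as follows (generic textbook definitions, with \(C_g\) the gate capacitance and \(V_{DD}\) the supply voltage; not the authors' exact expressions):
\[ \tau = \frac{C_g V_{DD}}{I_{on}}, \qquad \mathrm{EDP} = \left( C_g V_{DD}^{2} \right) \tau . \]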
Abstract:
Multi-MeV proton beams generated by target normal sheath acceleration (TNSA) during the interaction of an ultra-intense laser beam (I ≥ 10^19 W/cm²) with a thin metallic foil (thickness of the order of a few tens of microns) are particularly suited as a particle probe for laser plasma experiments. The proton imaging technique employs a laser-driven proton beam in a point-projection imaging scheme as a diagnostic tool for the detection of electric fields in such experiments. The proton probing technique has been applied in experiments of relevance to inertial confinement fusion (ICF), such as laser-heated gasbags and laser-hohlraum experiments. The data provide direct information on the onset of laser beam filamentation and on the plasma expansion in the hohlraum's interior, and confirm the suitability and usefulness of this technique as an ICF diagnostic.
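As background on the point-projection scheme (our sketch of the generic geometry, not this experiment's specific parameters): with a source-to-object distance \(l\) and object-to-detector distance \(L\), the geometric magnification is \(M = (l + L)/l\), and, non-relativistically, a transverse electric field \(E\) acting over a path length \(a\) deflects a proton of charge \(e\) and kinetic energy \(E_p\) by an angle \(\theta \approx e E a / (2 E_p)\).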
Abstract:
The objective of this paper is to describe and evaluate the application of the Text Encoding Initiative (TEI) Guidelines to a corpus of oral French, this being the first corpus of oral French for which the TEI has been used. The paper explains the purpose of the corpus, both in creating a specialist corpus of néo-contage that will broaden the range of oral corpora available and, more importantly, in creating a dataset for exploring a variety of oral French that has a particularly interesting status in terms of factors such as conception orale/écrite (spoken versus written conception), réalisation médiale (medial realization) and comportement communicatif (communicative behaviour) (Koch and Oesterreicher 2001). The linguistic phenomena to be encoded are both stylistic (speech and thought presentation) and syntactic (negation, detachment, inversion), and all represent areas where previous research has highlighted the significance of factors such as medium, register and discourse type, as well as a host of linguistic factors (syntactic, phonetic, lexical). After a discussion of how a tagset can be designed and applied within the TEI to encode speech and thought presentation, negation, detachment and inversion, the final section of the paper evaluates the benefits and possible drawbacks of the methodology offered by the TEI when applied to the syntactic and stylistic markup of an oral corpus.