992 results for PAIR ANNIHILATION PROCESS
Abstract:
In this brief, we present a new circuit technique to generate the sigmoid neuron activation function (NAF) and its derivative (DNAF). The circuit makes use of transistor asymmetry in a cross-coupled differential pair to obtain the derivative. The asymmetry is introduced through an external control signal, as and when required. This results in efficient utilization of the hardware by realizing NAF and DNAF with the same building blocks. The operation of the circuit is presented in the subthreshold region for ultra-low-power applications. The proposed circuit has been experimentally prototyped and characterized as a proof of concept in the 1.5-μm AMI technology.
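For orientation only, the two target functions can be summarized numerically; the derivative of a logistic sigmoid is expressible in terms of the sigmoid itself, σ′(x) = σ(x)(1 − σ(x)), which is consistent with generating NAF and DNAF from shared building blocks. The minimal Python sketch below shows the target transfer functions, not the circuit (the actual subthreshold differential-pair characteristic is tanh-like; the exact functional form used here is an assumption for illustration):

    import numpy as np

    def naf(x):
        # Logistic sigmoid: one standard form of the neuron activation function (NAF).
        return 1.0 / (1.0 + np.exp(-x))

    def dnaf(x):
        # Derivative of the sigmoid (DNAF); a simple function of the NAF itself.
        s = naf(x)
        return s * (1.0 - s)

    for x in (-2.0, 0.0, 2.0):
        print(f"x={x:+.1f}  NAF={naf(x):.4f}  DNAF={dnaf(x):.4f}")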
Abstract:
This thesis proposes that national or ethnic identity is an important and overlooked resource in conflict resolution. Usually, ethnic identity is seen in both international relations and social psychology as something that fuels conflict. Using grounded theory to analyze data from interactive problem-solving workshops between Palestinians and Israelis, a theory is developed about the role of national identity in turning conflict into protracted conflict. Drawing upon research from, among others, social identity theory, just-world theory and research on prejudice, it is argued that national identity is a prime candidate for providing the justification of a conflict party's goals and the dehumanization of the other that are necessary to make a conflict protracted. It is not the nature of national identity itself that lets it perform this role, but rather its ability to mobilize a constituency for social action (see Stürmer, Simon, Loewy, & Jörger, 2003). Reicher & Hopkins (1996) have demonstrated that national identity is constructed by political entrepreneurs to further their cause, even if this construction is not a conscious one. Data from interactive problem-solving workshops suggest that the possibility of conflict resolution is actually seen by participants as a direct threat of annihilation. Given the investment necessary to make a conflict protracted, this reaction seems plausible. The justification for one's actions provided by national identity makes the conflict an integral part of a conflict party's identity. Conflict resolution, it is argued, is therefore a threat to the very core of the current national identity. This may explain why so many peace agreements have failed to provide the hoped-for resolution of conflict. But if national identity is being used in a constructionist way to attain political goals, then a political project of conflict resolution, if it is conscious of this constructionist process, needs to develop a national identity that is independent of conflict and therefore able to accommodate conflict resolution. From this understanding it becomes clear why national identity needs to change, i.e. be disarmed, if conflict resolution is to be successful. This process of disarmament is theorized to be similar to the process of creating and sustaining protracted conflict. What shape and function this change should take is explored in light of the role of national identity in supporting conflict. Ideas are developed for how track-two diplomacy efforts, such as the interactive problem-solving workshop, could integrate a process by which both conflict parties disarm their respective identities.
Abstract:
The fluctuation of the distance between a fluorescein-tyrosine pair within a single protein complex was directly monitored in real time by photoinduced electron transfer and found to be a stationary, time-reversible, and non-Markovian Gaussian process. Within the generalized Langevin equation formalism, we experimentally determine the memory kernel $K(t)$, which is proportional to the autocorrelation function of the random fluctuating force. $K(t)$ decays as a power law, $t^{-0.51 \pm 0.07}$, over a broad range of time scales ($10^{-3}$–$10$ s). Such a long-time memory effect could have implications for protein functions.
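For context, a schematic statement of the generalized Langevin equation formalism referred to above (the notation here is mine, not taken from the paper): in the overdamped limit the monitored distance coordinate $x(t)$ obeys

$$\zeta \int_0^t K(t-t')\,\dot{x}(t')\,dt' = -\frac{\partial U(x)}{\partial x} + F(t), \qquad \langle F(t)\,F(t')\rangle \propto k_{B}T\,K(|t-t'|),$$

so the memory kernel is fixed, up to a constant, by the autocorrelation of the random force $F(t)$; the experimental result above corresponds to $K(t) \sim t^{-0.51 \pm 0.07}$ over $10^{-3}$–$10$ s.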
Abstract:
In this thesis I examine one commonly used class of methods for the analytic approximation of cellular automata, the so-called local cluster approximations. This class subsumes the well-known mean-field and pair approximations, as well as higher-order generalizations of these. While a straightforward method known as Bayesian extension exists for constructing cluster approximations of arbitrary order on one-dimensional lattices (and in certain other cases), for higher-dimensional systems the construction of approximations beyond the pair level becomes more complicated due to the presence of loops. In this thesis I describe the one-dimensional construction as well as a number of approximations suggested for higher-dimensional lattices, comparing them against a number of consistency criteria that such approximations could be expected to satisfy. I also outline a general variational principle for constructing consistent cluster approximations of arbitrary order with minimal bias, and show that the one-dimensional construction indeed satisfies this principle. Finally, I apply this variational principle to derive a novel consistent expression for symmetric three-cell cluster frequencies as estimated from pair frequencies, and use this expression to construct a quantitatively improved pair approximation of the well-known lattice contact process on a hexagonal lattice.
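As a concrete point of reference for what a pair approximation is, the following is a minimal Python sketch of the ordinary (unimproved) pair approximation of the contact process, written as two coupled ODEs for the site density and the ordered-pair density. The rate convention (death at rate 1, creation at rate λ spread uniformly over z neighbours), the coordination number, and the simple Euler integration are illustrative assumptions; the improved three-cell closure derived in the thesis is not reproduced here.

    def pair_approx_cp(lam, z=6, rho0=0.5, dt=1e-3, t_max=200.0):
        # State: rho = P(site occupied), p11 = P(ordered nearest-neighbour pair = (1,1)).
        # Occupied sites die at rate 1; each creates offspring at rate lam onto a
        # uniformly chosen neighbour (lam/z per neighbour).
        rho, p11 = rho0, rho0 ** 2               # uncorrelated initial condition
        for _ in range(int(t_max / dt)):
            p10 = rho - p11                      # ordered (1,0) pair probability
            q = p10 / (1.0 - rho)                # pair-approx estimate of P(neighbour occupied | site empty)
            drho = lam * p10 - rho
            dp11 = 2.0 * (lam / z) * p10 * (1.0 + (z - 1) * q) - 2.0 * p11
            rho, p11 = rho + dt * drho, p11 + dt * dp11
        return rho, p11

    # This closure predicts extinction below lam_c = z / (z - 1).
    for lam in (1.0, 1.3, 2.0):
        rho, _ = pair_approx_cp(lam, z=6)
        print(f"lambda = {lam:.1f}  ->  stationary density ~ {rho:.3f}")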
Abstract:
A measurement of the $t\bar{t}$ production cross section in $p\bar{p}$ collisions at $\sqrt{s}$ = 1.96 TeV using events with two leptons, missing transverse energy, and jets is reported. The data were collected with the CDF II Detector. The result in a data sample corresponding to an integrated luminosity of 2.8 fb$^{-1}$ is $\sigma_{t\bar{t}}$ = 6.27 $\pm$ 0.73 (stat) $\pm$ 0.63 (syst) $\pm$ 0.39 (lum) pb, for an assumed top mass of 175 GeV/$c^{2}$.
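As a quick arithmetic aside (not a number quoted in the abstract), if the three quoted uncertainties are treated as independent and combined in quadrature, the total uncertainty is

$$\Delta\sigma_{t\bar{t}} = \sqrt{0.73^{2} + 0.63^{2} + 0.39^{2}}\ \text{pb} \approx 1.04\ \text{pb},$$

i.e. roughly a 17% relative uncertainty on the 6.27 pb central value.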
Abstract:
We present the results of a search for pair production of the supersymmetric partner of the top quark (the stop quark $\tilde{t}_{1}$) decaying to a $b$-quark and a chargino $\tilde{\chi}_{1}^{\pm}$, with a subsequent $\tilde{\chi}_{1}^{\pm}$ decay into a neutralino $\tilde{\chi}_{1}^{0}$, lepton $\ell$, and neutrino $\nu$. Using a data sample corresponding to 2.7 fb$^{-1}$ of integrated luminosity of $p\bar{p}$ collisions at $\sqrt{s} = 1.96$ TeV collected by the CDF II detector, we reconstruct the mass of candidate stop events and fit the observed mass spectrum to a combination of standard model processes and stop quark signal. We find no evidence for $\tilde{t}_{1}\bar{\tilde{t}}_{1}$ production and set 95% C.L. limits on the masses of the stop quark and the neutralino for several values of the chargino mass and the branching ratio ${\cal B}(\tilde{\chi}_{1}^{\pm}\to\tilde{\chi}_{1}^{0}\ell^{\pm}\nu)$.
Abstract:
We report the most restrictive direct limits on masses of fourth-generation down-type quarks $b^{\prime}$, and quark-like composite fermions ($B$ or $T_{5/3}$), decaying promptly to $t W^{\mp}$. We search for a significant excess of events with two same-charge leptons ($e$, $\mu$), several hadronic jets, and missing transverse energy. An analysis of data from $p\overline{p}$ collisions with an integrated luminosity of 2.7 fb$^{-1}$ collected with the CDF II detector at Fermilab yields no evidence for such a signal, setting mass limits $m_{b^{\prime}}, m_{B} >$ 338 $\mathrm{GeV}/c^2$ and $m_{T_{5/3}} >$ 365 $\mathrm{GeV}/c^2$ at 95% confidence level.
Abstract:
The (overall trans) addition of hydrogen chloride to cyclohex-1-enecarbonitrile in anhydrous alcoholic media proceeds to give cis-2-chlorocyclohexanecarboxylate (together with some cis-2-chlorocyclohexanecarboxamide): no corresponding products with the trans-configuration are detectable. In anhydrous ether the addition proceeds to give a single isomer, presumably cis-, of 2-chlorocyclohexanecarbonitrile, indicating that the configuration of the products may not be equilibrium-controlled in alcoholic media. An examination of the steric factors indicates that the transition state for protonation of the presumed intermediate, 2-chlorocyclohexylidenemethylideneimine, leading to the cis-product is favoured if the interaction between the lateral π-orbital of the C–N double bond and the lone pairs on the chlorine atom at the 2-position is large. Consideration of interactions in the transition states meets Zimmerman's criticism that invoking the A1,3 interaction existing in ground states to explain product configuration takes insufficient account of the Curtin-Hammett principle.
Abstract:
Crystalline Bi₅NbO₁₀ nanoparticles have been achieved through a modified sol–gel process using a mixture of ethylenediamine and ethanolamine as a solvent. The Bi₅NbO₁₀ nanoparticles were characterized by X-ray diffraction (XRD), differential scanning calorimetry/thermogravimetry (DSC/TG), Fourier transform infrared spectroscopy (FT-IR), transmission electron microscopy (TEM) and Raman spectroscopy. The results showed that well-dispersed 5–60 nm Bi₅NbO₁₀ nanoparticles were prepared by heat-treating the precursor at 650 °C, and high-density pellets were obtained at temperatures lower than those commonly employed. The frequency and temperature dependence of the dielectric constant and the electrical conductivity of the Bi₅NbO₁₀ solid solutions were investigated in the 0.1 Hz to 1 MHz frequency range. Two distinct relaxation mechanisms were observed in the plots of dielectric loss and the imaginary part of impedance (Z″) versus frequency in the temperature range of 200–350 °C. The dielectric constant and the loss in the low-frequency regime were electrode dependent. The ionic conductivity of the Bi₅NbO₁₀ solid solutions at 700 °C is 2.86 Ω⁻¹ m⁻¹, which is of the same order of magnitude as that of Y₂O₃-stabilized ZrO₂ ceramics at the same temperature. These results suggest that Bi₅NbO₁₀ is a promising material as an oxygen-ion conductor.
Abstract:
The ProFacil model is a generic process model, defined as a framework model that shows the links between the facilities management process and the building end user's business process. The purpose of using the model is to support more detailed process modelling. The model has been developed using the IDEF0 modelling method. The ProFacil model describes business activities from a generalized point of view, as management, support, and core processes and their relations. The model defines basic activities in the provision of a facility. Examples of these activities are "operate facilities", "provide new facilities", "provide re-build facilities", "provide maintained facilities" and "perform dispose of facilities". These are all generic activities providing a basis for further specialisation of company-specific FM activities and their tasks. A facilitator can establish a specialized process model by using the ProFacil model and interacting with company experts to describe their company's specific processes. These modelling seminars or interviews can be conducted in an informal way, supported by the high-level process model as a common reference.
Abstract:
A model of the information and material activities that comprise the overall construction process is presented, using the SADT activity modelling methodology. The basic model is further refined into a number of generic information-handling activities, such as creation of new information, information search and retrieval, information distribution and person-to-person communication. The viewpoint could be described as information logistics. This model is then combined with a more traditional building process model, consisting of phases such as design and construction. The resulting two-dimensional matrix can be used for positioning different types of generic IT tools or construction-specific applications. The model can thus provide a starting point for a discussion of the application of information and communication technology in construction and for measurements of the impacts of IT on the overall process and its related costs.
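To make the positioning idea concrete, here is a small illustrative sketch (Python) of such a matrix as a mapping from (information-handling activity, process phase) pairs to tools. The activity and phase names follow the abstract; the tool entries are hypothetical placeholders, not taken from the paper.

    # Rows: generic information-handling activities; columns: building process phases.
    activities = ["creation of new information", "information search and retrieval",
                  "information distribution", "person-to-person communication"]
    phases = ["design", "construction"]

    # Hypothetical example entries only.
    matrix = {
        ("creation of new information", "design"): ["CAD system"],
        ("information search and retrieval", "design"): ["product-data library"],
        ("information distribution", "construction"): ["project web server"],
        ("person-to-person communication", "construction"): ["e-mail", "mobile telephony"],
    }

    for (activity, phase), tools in matrix.items():
        print(f"{activity:>35} x {phase:<12} -> {', '.join(tools)}")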
Abstract:
The industry foundation classes (IFC) file format is one of the most complex and ambitious IT standardization projects currently being undertaken in any industry, focusing on the development of an open and neutral standard for exchanging building model data. Scientific literature related to the IFC standard has so far been predominantly technical; research looking at the IFC standard from an industry standardization perspective could offer valuable new knowledge for both theory and practice. This paper proposes the use of IT standardization and IT adoption theories, supported by studies done within construction IT, to lay a theoretical foundation for further empirical analysis of the standardization process of the IFC file format.