873 results for Theory of nuclear architecture


Relevance:

100.00%

Publisher:

Abstract:

This article seeks to help illuminate the so-called 'paradox of voting', using the German Bundestag election of 1998 as an empirical case. Downs' model of voter participation will be extended to include elements of the theory of subjective expected utility (SEU). This will allow a theoretical and empirical exploration of the crucial mechanisms behind individual voters' decisions to participate in, or abstain from, the German general election of 1998. It will be argued that the vanishingly small probability that an individual citizen's vote will decide the election outcome does not necessarily reduce the probability of electoral participation. The empirical analysis is largely based on data from the ALLBUS 1998. It confirms the predictions derived from SEU theory. Voters' expected benefits and their subjective expectation of being able to influence government policy by voting are the crucial mechanisms explaining participation. By contrast, the explanatory contribution of perceived information and opportunity costs is low.
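
The decision calculus at issue can be written schematically as follows; this is the standard expected-utility rendering of Downs' model with a subjective influence term, given here as an illustrative assumption rather than the article's exact specification.

```latex
% Schematic SEU calculus of voting (an assumed form, not necessarily the article's exact model):
% a citizen participates if the subjectively expected utility of voting exceeds that of abstaining.
\[
  \mathrm{SEU}(\mathrm{vote}) \;=\; p \cdot B \;+\; D \;-\; C ,
  \qquad
  \text{vote} \iff \mathrm{SEU}(\mathrm{vote}) > \mathrm{SEU}(\mathrm{abstain}) = 0 ,
\]
where $p$ is the subjective probability of influencing the election outcome (or government
policy), $B$ the expected benefit of the preferred outcome, $D$ non-instrumental benefits
such as civic duty, and $C$ the perceived information and opportunity costs of voting.
```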

Relevance:

100.00%

Publisher:

Abstract:

Researchers suggest that personalization of the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the largest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web and its underlying fuzzy grassroots ontology can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. If augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3] on the Web. By accounting for the fuzziness of the real world, RRC differs from traditional approaches in that it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural-language information, RRC may be combined with Z-number calculation to achieve personalized Web reasoning and computation. Finally, because Web agents understand natural language, they can react to humans more intuitively and thus generate and process information.
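
As a concrete illustration, a Z-number Z = (A, B) pairs a fuzzy restriction A on a variable's value with a fuzzy restriction B on the reliability of A [2]. The minimal sketch below uses triangular membership functions; the function names and example values are assumptions chosen for illustration, not part of the cited approach.

```python
# Minimal sketch of a Z-number Z = (A, B) in the sense of Zadeh [2, 3]: A is a fuzzy
# restriction on the value of a variable ("about one hour"), and B is a fuzzy restriction
# on the reliability of A ("quite sure"). The triangular membership functions and the
# example values below are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

def triangular(a: float, m: float, b: float) -> Callable[[float], float]:
    """Triangular membership function with support (a, b) and peak at m."""
    def mu(x: float) -> float:
        if x <= a or x >= b:
            return 0.0
        return (x - a) / (m - a) if x <= m else (b - x) / (b - m)
    return mu

@dataclass
class ZNumber:
    restriction: Callable[[float], float]   # A: restricts the variable's value
    reliability: Callable[[float], float]   # B: grades how sure we are about A

# "The appointment lasts about one hour (in minutes), and this is quite sure."
duration = ZNumber(
    restriction=triangular(45.0, 60.0, 75.0),   # "about one hour"
    reliability=triangular(0.7, 0.85, 1.0),     # "quite sure"
)

print(duration.restriction(55.0))   # membership of 55 minutes in "about one hour" (~0.67)
print(duration.reliability(0.9))    # membership of 0.9 in "quite sure" (~0.67)
```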

Relevance:

100.00%

Publisher:

Abstract:

The thermal release rate of nuclear reaction products was investigated in offline annealing experiments. This work was motivated by the search for a high-melting catcher material for recoiling products from heavy-ion-induced nuclear fusion reactions. Polycrystalline refractory metal foils of Ni, Y, Zr, Nb, Mo, Hf, W, and Re were investigated as catcher metals. Diffusion data for various tracer/host combinations were deduced from the measured release rates. This work focuses on the diffusion and release rate of volatile p-elements from rows 5 and 6 of the periodic table, as lighter homologues of the superheavy elements with Z ≥ 113 to be studied in future experiments. A massive enhancement of the diffusion velocity due to radiation damage was observed. Diffusion trends were established along the groups and rows of the periodic table, based on the dependence of the diffusion velocity on atomic size.
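
Deducing diffusion data from release curves typically relies on an Arrhenius parameterization of the diffusion coefficient together with a Fickian release model. The sketch below shows these standard textbook relations with made-up numbers, as an illustration only; it is not necessarily the analysis procedure used in this work.

```python
# Illustrative sketch (assumed textbook relations, not necessarily the analysis used here):
# Arrhenius parameterization of the diffusion coefficient and the short-time Fickian
# approximation for the fractional release from a foil of thickness L annealed for time t.
import math

K_B = 8.617333e-5  # Boltzmann constant in eV/K

def diffusion_coefficient(d0_cm2_s: float, ea_ev: float, temperature_k: float) -> float:
    """Arrhenius law: D(T) = D0 * exp(-Ea / (kB * T))."""
    return d0_cm2_s * math.exp(-ea_ev / (K_B * temperature_k))

def fractional_release(d_cm2_s: float, t_s: float, thickness_cm: float) -> float:
    """Short-time approximation F ~ (4/L) * sqrt(D t / pi) for release from both foil faces."""
    return min(1.0, (4.0 / thickness_cm) * math.sqrt(d_cm2_s * t_s / math.pi))

# Hypothetical numbers for illustration only: 12.5 um foil, 10 min anneal at 1800 K.
D = diffusion_coefficient(d0_cm2_s=1e-2, ea_ev=3.0, temperature_k=1800.0)
print(D, fractional_release(D, t_s=600.0, thickness_cm=12.5e-4))
```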

Relevance:

100.00%

Publisher:

Abstract:

Objective. The purpose of the study is to provide a holistic depiction of the behavioral and environmental factors contributing to risky sexual behaviors among predominantly high-school-educated, low-income African Americans residing in urban areas of Houston, TX, using the Theory of Gender and Power, Situational/Environmental Variables Theory, and Sexual Script Theory. Methods. A cross-sectional study was conducted via questionnaires among 215 Houston-area residents, of whom 149 were women and 66 were men. Measures used to assess behaviors of the population included a history of homelessness, use of crack/cocaine and several other illicit drugs, the type of sexual partner, age of the participant, age of the most recent sex partner, whether participants had sought health care in the last 12 months, knowledge of the partner's other sexual activities, symptoms of depression, and the places where partners were met. To determine the risk of sexual encounters, a risk index based on the variables used to assess condom use was created, categorizing sexual encounters as unsafe or safe. Results. Variables meeting the significance level of p < .15 in the bivariate analysis for each theory were entered into a binary logistic regression analysis. The block for each theory was significant, suggesting that the grouping of variables by theory was significantly associated with unsafe sexual behaviors. Within the regression analysis, variables such as sex for drugs/money, low income, and crack use demonstrated an effect size of at least ±1, indicating that these variables had a significant effect on unsafe sexual behavioral practices. Conclusions. Variables assessing behavior and environment demonstrated a significant effect when grouped by their designated theories.
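
A minimal sketch of the screen-then-model strategy described above (bivariate screening at p < .15, then a theory block entered into a binary logistic regression on the unsafe/safe index). The column names and the use of statsmodels are illustrative assumptions, not the study's actual code.

```python
# Sketch of the analysis strategy described above (assumed implementation, hypothetical names):
# 1) screen candidate predictors with bivariate tests at p < .15,
# 2) enter the surviving variables of one theory block into a binary logistic regression
#    predicting unsafe (1) vs. safe (0) sexual encounters.
import pandas as pd
import statsmodels.api as sm

def screen_and_fit(df: pd.DataFrame, theory_block: list[str], outcome: str = "unsafe_sex"):
    # Bivariate screening: keep predictors whose single-variable logistic model has p < .15.
    kept = []
    for var in theory_block:
        single = sm.Logit(df[outcome], sm.add_constant(df[[var]])).fit(disp=False)
        if single.pvalues[var] < 0.15:
            kept.append(var)
    # Multivariable model for the whole theory block.
    block_model = sm.Logit(df[outcome], sm.add_constant(df[kept])).fit(disp=False)
    return kept, block_model

# Example call with hypothetical column names:
# kept, model = screen_and_fit(data, ["sex_for_drugs_money", "low_income", "crack_use"])
# print(model.summary())
```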

Relevance:

100.00%

Publisher:

Abstract:

Previous results indicated that translation of four mitochondrion-encoded genes and one nucleus-encoded gene (COX4) is repressed in mutants (pgs1Delta) of Saccharomyces cerevisiae lacking phosphatidylglycerol and cardiolipin. COX4 translation was studied here using a mitochondrially targeted green fluorescent protein (mtGFP) fused to the COX4 promoter and its 5' and 3' untranslated regions (UTRs). The lack of mtGFP expression, independent of carbon source and strain background, was established to occur at the translational level. The translational defect was not due to a deficiency in mitochondrial respiratory function but was rather caused directly by the lack of phosphatidylglycerol and cardiolipin in mitochondrial membranes. Reintroduction of a functional PGS1 gene under control of the ADH1 promoter restored phosphatidylglycerol synthesis and expression of mtGFP. Deletion analysis of the 5' UTR(COX4) revealed the presence of a 50-nucleotide fragment containing two stem-loops that acts as a cis-element inhibiting COX4 translation. Binding of a protein factor(s) specifically to this sequence was observed with cytoplasm from pgs1Delta but not from PGS1 cells. Using HIS3 and lacZ as reporters, extragenic spontaneous recessive mutations that allowed expression of His3p and beta-galactosidase were isolated. These appeared to be loss-of-function mutations, suggesting that the mutated genes may encode the trans factors that bind to the cis-element in pgs1Delta cells.

Relevance:

100.00%

Publisher:

Abstract:

The main goal of the AEgIS experiment at CERN is to test the weak equivalence principle for antimatter. We will measure the Earth's gravitational acceleration g with antihydrogen atoms launched in a horizontal vacuum tube and traversing a moiré deflectometer. We intend to use a position-sensitive device made of nuclear emulsions (combined with a time-of-flight detector such as silicon microstrips) to measure precisely their annihilation points at the end of the tube. The goal is to determine g with a 1% relative accuracy. In 2012 we tested emulsion films in vacuum and at room temperature with low-energy antiprotons from the CERN Antiproton Decelerator. First results on the expected performance for AEgIS are presented.
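
The principle behind the measurement is that, in a moiré deflectometer with two gratings and a detector plane separated by equal distances, gravity shifts the fringe pattern by g·τ², where τ is the time of flight between planes. The sketch below works through illustrative numbers; the distance and velocity are assumptions, not AEgIS design values.

```python
# Illustrative sketch of the moire-deflectometer principle behind the g measurement
# (standard relation with made-up numbers, not AEgIS design values): for two gratings and
# a detector plane separated by equal distances L, a particle with horizontal velocity v
# falls by delta_y = g * tau**2 between planes, with tau = L / v.

def gravitational_shift(g: float, grating_distance_m: float, velocity_m_s: float) -> float:
    tau = grating_distance_m / velocity_m_s   # time of flight between planes
    return g * tau**2                         # vertical shift of the fringe pattern

# Hypothetical numbers: L = 0.5 m, v = 500 m/s  ->  tau = 1 ms, shift ~ 10 micrometers,
# so a ~1% determination of g requires locating annihilation points at the micrometer
# level, which motivates the use of nuclear emulsions.
shift = gravitational_shift(9.81, grating_distance_m=0.5, velocity_m_s=500.0)
print(f"expected shift: {shift * 1e6:.1f} micrometers")
```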

Relevance:

100.00%

Publisher:

Abstract:

In order to fully describe the construct of empowerment and to determine possible measures for this construct in racially and ethnically diverse neighborhoods, a qualitative study based on Grounded Theory was conducted at both the individual and collective levels. Participants in the study included 49 grassroots experts on community empowerment, who were interviewed through semi-structured interviews and focus groups. The researcher also conducted field observations as part of the research protocol. The results of the study identified benchmarks of individual and collective empowerment and hundreds of possible markers of collective empowerment applicable in diverse communities. Results also indicated that community involvement is essential in the selection and implementation of proper measures. Additional findings were that the construct of empowerment involves specific principles of empowering relationships and particular motivational factors. All of these findings lead to a two-dimensional model of empowerment based on the concepts of relationships among members of a collective body and the collective body's desire for socio-political change. These results suggest that the design, implementation, and evaluation of programs that foster empowerment must be based on collaborative ventures between the population being served and program staff because of the interactive, synergistic nature of the construct. In addition, empowering programs should embrace specific principles and processes of individual and collective empowerment in order to maximize their effectiveness and efficiency. Finally, the results suggest that collaboratively choosing markers to measure the processes and outcomes of empowerment in the main systems and populations living in today's multifaceted communities is a useful mechanism to determine change.

Relevance:

100.00%

Publisher:

Abstract:

Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, which are classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibit non-normal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed. Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells. Data transformations such as log, square root, and 1/x did not yield normality as measured by the Shapiro-Wilk test. A modulus transformation, used for distributions with abnormal kurtosis values, also did not produce normality. Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and 3 variables (DNA concentration, shape, and elongation) showed the strongest evidence of bimodality and were studied further. Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture-model analysis using a two-component Gaussian model with equal variances. The mixture component parameters were used to bootstrap the log-likelihood ratio to determine the significant number of components (1 or 2). These summary measures were used as predictors of disease severity in several proportional-odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+), and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture-model results of changes in the mean levels and proportions of the components at the lower severity levels.
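
The bimodality coefficient mentioned above is commonly computed from sample skewness and excess kurtosis (the SAS formulation); the sketch below uses that formulation as an assumption, since the text does not spell out which version was used.

```python
# Sketch of a bimodality coefficient (SAS-style formulation; assumed here, as the text
# does not specify the exact formula). Values above ~5/9 (~0.555) suggest bimodality.
import numpy as np
from scipy.stats import skew, kurtosis

def bimodality_coefficient(x: np.ndarray) -> float:
    n = len(x)
    g = skew(x)        # sample skewness
    k = kurtosis(x)    # excess kurtosis (Fisher definition)
    correction = 3.0 * (n - 1) ** 2 / ((n - 2) * (n - 3))
    return (g ** 2 + 1.0) / (k + correction)

# Example: a clearly bimodal sample (two well-separated Gaussian modes).
rng = np.random.default_rng(0)
sample = np.concatenate([rng.normal(0, 1, 200), rng.normal(6, 1, 200)])
print(bimodality_coefficient(sample))   # expected to exceed the 5/9 threshold
```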

Relevance:

100.00%

Publisher:

Abstract:

Barry Saltzman was a giant in the fields of meteorology and climate science. A leading figure in the study of weather and climate for over 40 yr, he has frequently been referred to as the "father of modern climate theory." Ahead of his time in many ways, Saltzman made significant contributions to our understanding of the general circulation and spectral energetics budget of the atmosphere, as well as climate change across a wide spectrum of time scales. In his endeavor to develop a unified theory of how the climate system works, he played a role in the development of energy balance models, statistical dynamical models, and paleoclimate dynamical models. He was a pioneer in developing meteorologically motivated dynamical systems, including the progenitor of Lorenz's famous chaos model. In applying his own dynamical-systems approach to long-term climate change, he recognized the potential for using atmospheric general circulation models in a complementary way. In 1998, he was awarded the Carl-Gustaf Rossby Medal, the highest honor of the American Meteorological Society, "for his life-long contributions to the study of the global circulation and the evolution of the earth's climate." In this paper, the authors summarize and place into perspective some of the most significant contributions that Barry Saltzman made during his long and distinguished career. This short review also serves as an introduction to the papers in this special issue of the Journal of Climate dedicated to Barry's memory.

Relevance:

100.00%

Publisher:

Abstract:

Introduction So far, social psychology in sport has primarily focused on team cohesion, and many studies and meta-analyses have tried to demonstrate a relation between the cohesiveness of a team and its performance. How a team really co-operates, and how individual actions are integrated into a team action, is a question that has received relatively little attention in research. This may, at least in part, be due to the lack of a theoretical framework for collective actions, a dearth that has only recently begun to challenge sport psychologists. Objectives In this presentation a framework for a comprehensive theory of teams in sport is outlined, and its potential to integrate research in the domain of team performance and, more specifically, the following presentations, is put up for discussion. Method Based on a model developed by von Cranach, Ochsenbein and Valach (1986), teams are considered to be information-processing organisms, and team actions need to be investigated on two levels: the individual team member and the group as an entity. Elements to be considered are the task, the social structure, the information-processing structure, and the execution structure. Obviously, different tasks require different social structures, communication processes, and co-ordination of individual movements. Especially in rapid interactive sports, planning and execution of movements based on feedback loops are not possible. Deliberate planning may be a solution mainly for offensive actions, whereas defensive actions have to adjust to the opponent team's actions. Consequently, mental representations must be developed to allow feed-forward regulation of team members' actions. Results and Conclusions Some preliminary findings based on this conceptual framework, as well as further consequences for empirical investigations, will be presented. References Cranach, M. v., Ochsenbein, G. & Valach, L. (1986). The group as a self-active system: Outline of a theory of group action. European Journal of Social Psychology, 16, 193-229.

Relevance:

100.00%

Publisher:

Abstract:

Recent experiments revealed that the fruit fly Drosophila melanogaster has a dedicated mechanism for forgetting: blocking the G-protein Rac leads to slower forgetting, and activating Rac to faster forgetting. This active form of forgetting lacks a satisfactory functional explanation. We investigated optimal decision making for an agent adapting to a stochastic environment in which a stimulus may switch between being indicative of reward and being indicative of punishment. Like Drosophila, an optimal agent shows forgetting, with a rate that is linked to the time scale of changes in the environment. Moreover, to reduce the odds of missing future reward, an optimal agent may trade the risk of immediate pain for information gain and thus forget faster after aversive conditioning. A simple neuronal network reproduces these features. Our theory shows that forgetting in Drosophila appears as an optimal adaptive behavior in a changing environment. This is in line with the view that forgetting is adaptive rather than a consequence of limitations of the memory system.
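
One way to picture the link between forgetting rate and environmental volatility is a discrete-time belief update in which the prediction step relaxes the belief toward an uninformative prior at a rate set by the environment's switching probability. This is an illustrative sketch under that assumption, not the model developed in the paper.

```python
# Illustrative sketch (not the paper's model): an agent tracks the probability that a
# stimulus currently predicts reward in an environment whose contingency flips with
# hazard rate h per trial. The prediction step pulls the belief toward the uninformative
# value 0.5 at a rate set by h, so a faster-changing environment implies faster forgetting.

def forgetting_step(belief: float, hazard: float) -> float:
    """Prediction step of a two-state hidden Markov model: with probability `hazard`
    the stimulus-outcome contingency has flipped since the last trial."""
    return (1.0 - hazard) * belief + hazard * (1.0 - belief)

def evidence_step(belief: float, observed_reward: bool, hit_rate: float = 0.8) -> float:
    """Bayesian update after observing a reward (True) or a punishment (False)."""
    like_rewarding = hit_rate if observed_reward else 1.0 - hit_rate
    like_punishing = 1.0 - hit_rate if observed_reward else hit_rate
    return like_rewarding * belief / (
        like_rewarding * belief + like_punishing * (1.0 - belief)
    )

belief = 0.9                  # after conditioning: stimulus predicts reward
for _ in range(10):           # ten trials without further evidence
    belief = forgetting_step(belief, hazard=0.1)
print(belief)                 # the memory has relaxed most of the way toward 0.5
```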

Relevance:

100.00%

Publisher:

Abstract:

We further develop the effective fluid theory of stationary branes. This formalism applies to stationary blackfolds as well as to other equilibrium brane systems at finite temperature. The effective theory is described by a Lagrangian containing the information about the elastic dynamics of the brane embedding as well as the hydrodynamics of the effective fluid living on the brane. The Lagrangian is corrected order by order in a derivative expansion, where we take into account the dipole moment of the brane, which encompasses finite-thickness corrections, including transverse spin. We describe how to extract the thermodynamics from the Lagrangian, and we obtain constraints on the higher-derivative terms with one and two derivatives. These constraints follow from comparing the brane thermodynamics with the conserved currents associated with background Killing vector fields. In particular, we fix uniquely the one- and two-derivative terms describing the coupling of the transverse spin to the background space-time. Finally, we apply our formalism to two blackfold examples, black tori and charged black rings, and compare the latter to a numerically generated solution.
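
For context, the leading-order Lagrangian of this kind of effective theory, as it appears in the blackfold literature, is simply the local pressure of the fluid with the temperature redshifted by the norm of the worldvolume Killing vector. The schematic form below is quoted as background under that assumption; it is not a substitute for the corrected Lagrangian constructed in the paper.

```latex
% Leading-order stationary-brane Lagrangian as it appears in the blackfold approach
% (schematic; the one- and two-derivative corrections discussed above are not shown):
\[
  S[X] \;=\; \int_{\mathcal{W}_{p+1}} d^{\,p+1}\sigma \, \sqrt{-\gamma}\; P\big(T(\sigma)\big),
  \qquad
  T(\sigma) \;=\; \frac{T_0}{|k(\sigma)|},
\]
where $\gamma$ is the determinant of the induced metric on the worldvolume
$\mathcal{W}_{p+1}$, $P$ is the pressure (equation of state) of the effective fluid,
$T_0$ is the global temperature, and $|k|$ is the norm of the background Killing vector
field evaluated on the worldvolume.
```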