987 results for complexity theory


Relevance:

20.00%

Publisher:

Abstract:

The recently developed semiclassical variational Wigner-Kirkwood (VWK) approach is applied to finite nuclei using external potentials and self-consistent mean fields derived from Skyrme interactions and from relativistic mean field theory. VWK consists of the Thomas-Fermi part plus a pure, perturbative ℏ² correction. In external potentials, VWK passes through the average of the quantal values of the accumulated level density and total energy as a function of the Fermi energy. However, there is a problem of overbinding when the energy per particle is displayed as a function of the particle number. The situation is analyzed by comparing spherical and deformed harmonic oscillator potentials. In the self-consistent case, we show for Skyrme forces that VWK binding energies are very close to those obtained from extended Thomas-Fermi functionals of order ℏ⁴, pointing to the rapid convergence of the VWK theory. This satisfying result, however, does not cure the overbinding problem, i.e., the semiclassical energies show more binding than they should. This feature is more pronounced in the case of Skyrme forces than with the relativistic mean field approach. However, even in the latter case the shell correction energy for, e.g., 208Pb turns out to be only ∼ −6 MeV, which is about a factor of two or three off the generally accepted value. As an ad hoc remedy, increasing the kinetic energy by 2.5% leads to shell correction energies that are well acceptable throughout the periodic table. The general importance of the present studies for other finite Fermi systems, self-bound or in external potentials, is pointed out.
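In standard semiclassical notation, the decomposition described above can be sketched as follows (a schematic rendering, not the authors' exact expressions; λ_F denotes the Fermi energy and g(ε) the single-particle level density):

```latex
E_{\mathrm{VWK}} = E_{\mathrm{TF}} + E_{\hbar^{2}},
\qquad
N(\lambda_F) = \int^{\lambda_F} g(\varepsilon)\,\mathrm{d}\varepsilon ,
```

where E_TF is the Thomas-Fermi energy, E_{ℏ²} the perturbative ℏ² correction, and N(λ_F) the accumulated level density through whose quantal average the VWK curve passes.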

Relevance:

20.00%

Publisher:

Abstract:

We perform Hartree calculations of symmetric and asymmetric semi-infinite nuclear matter in the framework of relativistic models based on effective hadronic field theories as recently proposed in the literature. In addition to the conventional cubic and quartic scalar self-interactions, the extended models incorporate a quartic vector self-interaction, scalar-vector non-linearities and tensor couplings of the vector mesons. We investigate the implications of these terms for nuclear surface properties such as the surface energy coefficient, surface thickness, surface stiffness coefficient, neutron skin thickness and the spin-orbit force.
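For orientation, the surface energy coefficient is conventionally extracted from a semi-infinite density profile via the surface tension σ (standard leptodermous definitions, not this paper's specific formulas):

```latex
\sigma = \int_{-\infty}^{+\infty} \left[ \mathcal{H}(z) - \left(\tfrac{E}{A}\right)_{\!\infty} \rho(z) \right] \mathrm{d}z ,
\qquad
a_s = 4\pi r_0^{2}\,\sigma ,
```

with 𝓗(z) the energy density of the profile, (E/A)_∞ the bulk energy per nucleon, and r_0 fixed by (4π/3) r_0³ ρ_∞ = 1.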

Relevance:

20.00%

Publisher:

Abstract:

By means of computer simulations and solutions of the equations of the mode coupling theory (MCT), we investigate the role of the intramolecular barriers on several dynamic aspects of nonentangled polymers. The investigated dynamic range extends from the caging regime characteristic of glass-formers to the relaxation of the chain Rouse modes. We review our recent work on this question, provide new results, and critically discuss the limitations of the theory. Solutions of the MCT for the structural relaxation reproduce qualitative trends of simulations for weak and moderate barriers. However, a progressive discrepancy is revealed as the limit of stiff chains is approached. This disagreement does not seem related to dynamic heterogeneities, which indeed are not enhanced by increasing the barrier strength. Nor is it connected with the breakdown of the convolution approximation for three-point static correlations, which retains its validity for stiff chains. These findings suggest the need for an improvement of the MCT equations for polymer melts. Concerning the relaxation of the chain degrees of freedom, MCT provides a microscopic basis for time scales from chain reorientation down to the caging regime. It rationalizes, from first principles, the observed deviations from the Rouse model on increasing the barrier strength. These include anomalous scaling of relaxation times, long-time plateaux, and nonmonotonic wavelength dependence of the mode correlators.
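For reference, the chain Rouse modes mentioned above are conventionally defined for a discrete chain of N beads with positions r_n as (standard definition, not specific to this work):

```latex
\mathbf{X}_p(t) = \frac{1}{N} \sum_{n=1}^{N} \mathbf{r}_n(t)\,
\cos\!\left[ \frac{p\pi}{N}\left(n - \tfrac{1}{2}\right) \right],
\qquad p = 0, \dots, N-1,
```

with the normalized mode correlators Φ_p(t) = ⟨X_p(t)·X_p(0)⟩ / ⟨X_p(0)²⟩ whose wavelength dependence is discussed in the abstract.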

Relevance:

20.00%

Publisher:

Abstract:

The goal of the work was to evaluate and compare commercial application frameworks. The starting point was the application development requirements and needs of TietoEnator's Telecom unit. First, the basic theory and concepts of application frameworks were surveyed with the help of the literature. The evaluation criteria were chosen on the basis of business and technical factors. The product comparison also took into account the needs and requirements of the Telecom unit's customer base. The comparison was carried out with a method based on an extensible decision tree. Finally, the comparison method used in the study was evaluated. Numerous different commercial application frameworks are currently available on the market. Application developers build customized applications on top of the frameworks' skeletons and reusable elements. The domain of an application framework can be restricted to a particular industry, or alternatively it can be general, covering a horizontally wider area. The selection is also considerable in terms of technical features. What makes comparing application frameworks problematic is precisely their diversity and the breadth of their application areas. Nevertheless, application frameworks can be used effectively as an alternative to more traditional software development, provided that the ready-made program skeleton and reusable parts are exploited correctly. As the result of the comparison, two application frameworks were shortlisted whose use can be recommended to the Telecom unit.
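A decision-tree-based comparison of this kind ultimately reduces each branch to weighted criterion scores. The following is a hypothetical sketch of that scoring step; all framework names, criteria, and weights are illustrative, not taken from the study.

```python
# Hypothetical weighted-criteria scoring, in the spirit of an extensible
# decision-tree comparison. Names, criteria and weights are invented.

def score(framework, weights):
    """Weighted sum of per-criterion scores (0-10 scale)."""
    return sum(weights[c] * framework[c] for c in weights)

weights = {"business_fit": 0.40, "technical_fit": 0.35, "customer_needs": 0.25}

candidates = {
    "FrameworkA": {"business_fit": 8, "technical_fit": 6, "customer_needs": 7},
    "FrameworkB": {"business_fit": 5, "technical_fit": 9, "customer_needs": 6},
}

# Rank candidates by total weighted score, best first.
ranked = sorted(candidates, key=lambda f: score(candidates[f], weights),
                reverse=True)
print(ranked)  # FrameworkA scores 7.05, FrameworkB 6.65
```

Extending the tree means adding criteria (and weights) without touching the scoring logic, which is what makes the method extensible.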

Relevance:

20.00%

Publisher:

Abstract:

The increase of publicly available sequencing data has allowed for rapid progress in our understanding of genome composition. As new information becomes available we should constantly be updating and reanalyzing existing and newly acquired data. In this report we focus on transposable elements (TEs), which make up a significant portion of nearly all sequenced genomes. Our ability to accurately identify and classify these sequences is critical to understanding their impact on host genomes. At the same time, as we demonstrate in this report, problems with existing classification schemes have led to significant misunderstandings of the evolution of both TE sequences and their host genomes. In a pioneering publication, Finnegan (1989) proposed classifying all TE sequences into two classes based on transposition mechanisms and structural features: the retrotransposons (class I) and the DNA transposons (class II). We have retraced how ideas regarding TE classification and annotation in both prokaryotic and eukaryotic scientific communities have changed over time. This has led us to observe that: (1) a number of TEs have convergent structural features and/or transposition mechanisms that have led to misleading conclusions regarding their classification, (2) the evolution of TEs is similar to that of viruses in having several unrelated origins, (3) there might be at least 8 classes and 12 orders of TEs, including 10 novel orders. In an effort to address these classification issues we propose: (1) the outline of a universal TE classification, (2) a set of methods and classification rules that could be used by all scientific communities involved in the study of TEs, and (3) a 5-year schedule for the establishment of an International Committee for Taxonomy of Transposable Elements (ICTTE).

Relevance:

20.00%

Publisher:

Abstract:

This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity.

Relevance:

20.00%

Publisher:

Abstract:

The article presents a discussion of foundational issues in the field of management science, focusing on advances in management theory and research. The metaphor of explanatory lenses is used as a rubric to illustrate the theoretical challenges involved in elucidating the interrelationships of various factors in organizational behavior. The importance of clarifying such interrelationships is emphasized, from the standpoint of editing scholarly papers on such topics for publication. Topics discussed include communication and psychology in management, economics, and behavioral finance.

Relevance:

20.00%

Publisher:

Abstract:

Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (the number of model parameters) remains a major concern in relation to overfitting and, hence, the transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a study case. We fit 110 models with different levels of complexity under present-time conditions and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change. Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data are not available, models of intermediate complexity should be selected.
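The complexity penalty at work in the AICc criterion used above can be sketched in a few lines. The log-likelihood values, parameter counts and sample size below are illustrative, not from the study.

```python
# Minimal sketch of model selection by the corrected Akaike Information
# Criterion (AICc), which penalizes parameter count k more strongly at
# small sample size n. All numbers are illustrative.
import math  # not strictly needed here, kept for clarity of intent

def aicc(log_likelihood, k, n):
    """AICc = 2k - 2 ln L + 2k(k+1)/(n - k - 1)."""
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)

n = 200  # hypothetical number of occurrence records
models = {  # model name -> (log-likelihood, number of parameters)
    "simple": (-120.0, 4),
    "intermediate": (-105.0, 10),
    "complex": (-103.0, 40),
}

# The overfit "complex" model barely improves the likelihood but pays a
# large penalty, so the intermediate model wins.
best = min(models, key=lambda m: aicc(*models[m], n))
print(best)  # → intermediate
```

This is the same logic by which the paper's intermediate-complexity models come out ahead once overfitting is penalized.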

Relevance:

20.00%

Publisher:

Abstract:

We present computer simulations of a simple bead-spring model for polymer melts with intramolecular barriers. By systematically tuning the strength of the barriers, we investigate their role in the glass transition. Dynamic observables are analyzed within the framework of the mode coupling theory (MCT). Critical nonergodicity parameters, critical temperatures, and dynamic exponents are obtained from consistent fits of simulation data to MCT asymptotic laws. The so-obtained MCT λ-exponent increases from standard values for fully flexible chains to values close to the upper limit for stiff chains. In analogy with systems exhibiting higher-order MCT transitions, we suggest that the observed large λ-values arise from the interplay between two distinct mechanisms for dynamic arrest: general packing effects and polymer-specific intramolecular barriers. We compare simulation results with numerical solutions of the MCT equations for polymer systems, within the polymer reference interaction site model (PRISM) for static correlations. We verify that the approximations introduced by the PRISM are fulfilled by the simulations, with the same quality over the whole range of investigated barrier strengths. The numerical solutions reproduce the qualitative trends of the simulations for the dependence of the nonergodicity parameters and critical temperatures on the barrier strength. In particular, increasing the barrier strength at fixed density increases the localization length and the critical temperature. However, the qualitative agreement between theory and simulation breaks down in the limit of stiff chains. We discuss the possible origin of this feature.
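For context, the λ-exponent mentioned above fixes the MCT β-relaxation exponents a and b through the standard transcendental relation (textbook MCT, not a result of this work):

```latex
\lambda = \frac{\Gamma^{2}(1-a)}{\Gamma(1-2a)} = \frac{\Gamma^{2}(1+b)}{\Gamma(1+2b)},
\qquad 0 < a < \tfrac{1}{2}, \quad 0 < b \le 1,
```

so that λ → 1 signals the approach to a higher-order MCT transition, consistent with the large λ-values reported for stiff chains.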

Relevance:

20.00%

Publisher:

Abstract:

This paper analyses the effects of manipulating the cognitive complexity of L2 oral tasks on language production. It specifically focuses on self-repairs, which are taken as a measure of accuracy since they denote both attention to form and an attempt at being accurate. By means of a repeated measures design, 42 lower-intermediate students were asked to perform three different task types (a narrative task, an instruction-giving task, and a decision-making task) for which two degrees of cognitive complexity were established. The narrative task was manipulated along +/− Here-and-Now, the instruction-giving task along +/− elements, and the decision-making task along +/− reasoning demands. Repeated measures ANOVAs are used to calculate differences between degrees of complexity and among task types. One-way ANOVAs are used to detect potential differences between low-proficiency and high-proficiency participants. Results show an overall effect of Task Complexity on self-repair behavior across task types, with different behaviors existing among the three task types. No differences are found in self-repair behavior between the low- and high-proficiency groups. Results are discussed in the light of theories of cognition and L2 performance (Robinson 2001a, 2001b, 2003, 2005, 2007), L1 and L2 language production models (Levelt 1989, 1993; Kormos 2000, 2006), and attention during L2 performance (Skehan 1998; Robinson 2002).
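The ANOVA machinery behind such comparisons boils down to a ratio of between-group to within-group variance. The following hand-computed one-way F statistic uses invented self-repair counts for three hypothetical task-type groups (the study itself used repeated-measures designs on real learner data):

```python
# One-way ANOVA F statistic computed from scratch for illustrative
# self-repair counts per task type. Data are invented, not from the study.

def f_statistic(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

narrative = [4, 5, 3, 6, 5]      # hypothetical self-repair counts
instruction = [7, 8, 6, 9, 7]
decision = [5, 6, 5, 7, 6]

print(round(f_statistic([narrative, instruction, decision]), 2))  # → 8.97
```

A large F relative to the F(k−1, n−k) reference distribution indicates that task type affects self-repair counts beyond within-group noise.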

Relevance:

20.00%

Publisher:

Abstract:

This thesis deals with combinatorics, order theory and descriptive set theory. The first contribution is to the theory of well-quasi-orders (wqo) and better-quasi-orders (bqo). The main result is the proof of a conjecture made by Maurice Pouzet in his 1978 thèse d'état, which states that any wqo whose ideal completion remainder is bqo is actually bqo. Our proof relies on new results, with both a combinatorial and a topological flavour, concerning maps from a front into a compact metric space. The second contribution is of a more applied nature and deals with topological spaces. We define a quasi-order on the subsets of every second countable T0 topological space in a way that generalises the Wadge quasi-order on the Baire space, while extending its nice properties to virtually all these topological spaces. The Wadge quasi-order of reducibility by continuous functions is wqo on Borel subsets of the Baire space; this quasi-order is however far less satisfactory for other important topological spaces such as the real line, as Hertling, Ikegami and Schlicht notably observed. Some authors have therefore studied reducibility with respect to certain classes of discontinuous functions to remedy this situation. We propose instead to keep continuity but to weaken the notion of function to that of relation. Using the notion of admissible representation studied in the Type-2 theory of effectivity initiated by Weihrauch, we define the quasi-order of reducibility by relatively continuous relations. We show that this quasi-order both refines the classical hierarchies of complexity and is wqo on the Borel subsets of virtually every second countable T0 space, including every (quasi-)Polish space.
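The two central order-theoretic notions can be stated compactly (standard definitions, given here only for orientation):

```latex
(Q,\le) \text{ is wqo}
\iff
\forall (q_n)_{n\in\mathbb{N}} \in Q^{\mathbb{N}}
\ \exists\, i<j :\ q_i \le q_j ,
\qquad
A \le_W B
\iff
\exists f : X \to Y \text{ continuous with } A = f^{-1}(B),
```

for A ⊆ X and B ⊆ Y; the thesis replaces the continuous function f by a relatively continuous relation while keeping this reducibility shape.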

Relevance:

20.00%

Publisher:

Abstract:

The information flood and organizational complexity have created a need for knowledge management. The goal of this study is to identify the change needs that the introduction of a portal as a knowledge management tool creates. The study also compares the new tools with existing ones and assesses the organization's ability to transfer knowledge virtually. Literature on comparable projects has not been available, since the technology being introduced is fairly new. The same technology is in use in a slightly different area than is targeted in this project. The study is a case study whose main sources are documents produced in various meetings. The researcher has participated actively in the project work, so part of the background information is based on the researcher's observations and discussions. The theoretical part deals with knowledge sharing from the perspectives of knowledge management and virtuality. Change management is treated briefly in connection with the introduction of the knowledge management tool. The study is related to Stora Enso Consumer Boards' knowledge management project.

Relevance:

20.00%

Publisher:

Abstract:

Experimental animal models are essential to obtain basic knowledge of the underlying biological mechanisms in human diseases. Here, we review major contributions to biomedical research and discoveries that were obtained in the mouse model by using forward genetics approaches and that provided key insights into the biology of human diseases and paved the way for the development of novel therapeutic approaches.

Relevance:

20.00%

Publisher:

Abstract:

We analyze the results for infinite nuclear and neutron matter using the standard relativistic mean field model and its recent effective field theory motivated generalization. For the first time, we show quantitatively that the inclusion in the effective theory of vector meson self-interactions and scalar-vector cross-interactions explains naturally the recent experimental observations of the softness of the nuclear equation of state, without losing the advantages of the standard relativistic model for finite nuclei.
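Schematically, the extra couplings discussed above enter the effective Lagrangian as a quartic ω self-interaction and scalar-vector cross terms, in the usual effective-field-theory parametrization (generic couplings ζ, η₁, η₂; a sketch of the conventional form, not the paper's exact Lagrangian):

```latex
\mathcal{L} \supset
\frac{\zeta}{4!}\, g_\omega^{4} \left( \omega_\mu \omega^\mu \right)^{2}
\;+\;
\left( \eta_1 \frac{\sigma}{M} + \frac{\eta_2}{2} \frac{\sigma^{2}}{M^{2}} \right)
\frac{m_\omega^{2}}{2}\, \omega_\mu \omega^\mu ,
```

where σ and ω are the scalar and vector meson fields and M the nucleon mass; a nonzero ζ is what softens the high-density equation of state relative to the standard model.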