984 results for Geriatric Core Set


Relevance: 30.00%

Publisher:

Abstract:

The first theoretical results of core-valence correlation effects are presented for the infrared wavenumbers and intensities of the BF3 and BCl3 molecules, using (double- and triple-zeta) Dunning core-valence basis sets at the CCSD(T) level. The results are compared with those calculated in the frozen-core approximation with standard Dunning basis sets at the same correlation level, and with the experimental values. The general conclusion is that, for infrared wavenumbers and intensities, the effect of core-valence correlation is smaller than the effect of adding augmented diffuse functions to the basis set, e.g., going from cc-pVTZ to aug-cc-pVTZ. Moreover, the trends observed in the data are mainly related to the augmented functions rather than to the core-valence functions added to the basis set. The results obtained here confirm previous studies pointing out the large discrepancy between the theoretical and experimental intensities of the stretching mode for BCl3.
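
As an illustration of the frozen-core versus core-valence comparison described above, the following sketch sets up both CCSD(T) calculations for BF3 in PySCF. The geometry, basis choices and the use of PySCF are assumptions for illustration; the abstract does not name the software, and the actual study computes infrared wavenumbers and intensities rather than the single-point energies shown here.

```python
# Hedged sketch (the paper's software and geometry are not specified):
# frozen-core CCSD(T)/cc-pVTZ versus all-electron CCSD(T)/cc-pCVTZ for BF3.
from pyscf import gto, scf, cc

# Approximate planar D3h geometry for BF3 (bond length is an assumption).
atom = """
B   0.0000   0.0000  0.0000
F   1.3130   0.0000  0.0000
F  -0.6565   1.1371  0.0000
F  -0.6565  -1.1371  0.0000
"""

# Frozen-core calculation with a standard valence basis set.
mol_fc = gto.M(atom=atom, basis="cc-pvtz")
mf_fc = scf.RHF(mol_fc).run()
cc_fc = cc.CCSD(mf_fc, frozen=4)          # freeze B 1s and the three F 1s
cc_fc.kernel()
e_fc = cc_fc.e_tot + cc_fc.ccsd_t()       # CCSD(T) total energy

# All-electron calculation with the corresponding core-valence basis set.
mol_cv = gto.M(atom=atom, basis="cc-pcvtz")
mf_cv = scf.RHF(mol_cv).run()
cc_cv = cc.CCSD(mf_cv)                    # no frozen orbitals: correlate the core
cc_cv.kernel()
e_cv = cc_cv.e_tot + cc_cv.ccsd_t()

print(f"frozen-core CCSD(T)/cc-pVTZ  : {e_fc:.6f} Eh")
print(f"all-electron CCSD(T)/cc-pCVTZ: {e_cv:.6f} Eh")
```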

Relevance: 30.00%

Publisher:

Abstract:

Microsatellites and gene-derived markers are still underrepresented in the core molecular linkage map of common bean compared to other types of markers. In order to increase the density of the core map, a set of new markers was developed and mapped onto the RIL population derived from the 'BAT93' x 'Jalo EEP558' cross. The EST-SSR markers were first characterized using a set of 24 bean inbred lines. On average, the polymorphism information content was 0.40 and the mean number of alleles per locus was 2.7. In addition, AFLP and RGA markers based on the NBS-profiling method were developed, and a subset of the mapped RGAs was sequenced. With the integration of 282 new markers into the common bean core map, we were able to place markers with putative known function in some existing gaps, including regions with QTL for resistance to anthracnose and rust. The distribution of the markers over the 11 linkage groups is discussed and an updated version of the common bean core linkage map is proposed.
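
The abstract quotes an average polymorphism information content (PIC) of 0.40 for the EST-SSR markers. As a reminder of how that statistic is computed, here is a small sketch of the standard Botstein et al. PIC formula; the allele frequencies in the example are invented, not taken from the 24 inbred lines.

```python
# Hedged sketch: the standard PIC formula (Botstein et al. 1980) applied to
# hypothetical allele frequencies at one EST-SSR locus. The frequencies below
# are illustrative, not data from the paper.
from itertools import combinations

def pic(freqs):
    """Polymorphism information content of one locus.

    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2
    """
    homo = sum(p ** 2 for p in freqs)
    cross = sum(2 * (pi ** 2) * (pj ** 2) for pi, pj in combinations(freqs, 2))
    return 1.0 - homo - cross

# Example: a locus with three alleles observed across a panel of inbred lines.
print(round(pic([0.5, 0.3, 0.2]), 3))
```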

Relevance: 30.00%

Publisher:

Abstract:

Since collaborative networked organisations are usually formed by independent and heterogeneous entities, it is natural that each member holds its own set of values, and that conflicts among partners might emerge because of some misalignment of values. At the same time, it is often stated in the literature that the alignment between the value systems of the members involved in collaborative processes is a prerequisite for successful co-working. As a result, the issue of core-value alignment in collaborative networks has started to attract attention. However, methods to analyse such alignment are lacking, mainly because the concept of 'alignment' in this context is still ill defined and shows a multifaceted nature. As a contribution to the area, this article introduces an approach based on causal models and graph theory for the analysis of core-value alignment in collaborative networks. The potential application of the approach is then discussed in the context of a virtual organisations' breeding environment.
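
As a rough illustration of the causal-model idea, the sketch below encodes two members' core-value systems as weighted directed graphs with networkx and compares them with a simple edge-overlap score. The value names, influence weights and the similarity measure are placeholders, not the analysis method proposed in the article.

```python
# Hedged sketch: each member's core-value system as a weighted directed graph
# (a causal map), compared via a simple Jaccard overlap of influence links.
# Values, weights and the metric are illustrative assumptions.
import networkx as nx

def causal_map(influences):
    """Build a causal map from (source_value, target_value, weight) triples."""
    g = nx.DiGraph()
    g.add_weighted_edges_from(influences)
    return g

member_a = causal_map([
    ("innovation", "profit", 0.7),
    ("trust", "knowledge sharing", 0.9),
    ("knowledge sharing", "innovation", 0.6),
])
member_b = causal_map([
    ("trust", "knowledge sharing", 0.8),
    ("knowledge sharing", "innovation", 0.5),
    ("cost reduction", "profit", 0.9),
])

def edge_similarity(g1, g2):
    """Fraction of influence links shared by the two causal maps."""
    e1, e2 = set(g1.edges()), set(g2.edges())
    return len(e1 & e2) / len(e1 | e2) if (e1 | e2) else 1.0

print(f"structural overlap: {edge_similarity(member_a, member_b):.2f}")
```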

Relevance: 30.00%

Publisher:

Abstract:

Collaborative networks are typically formed by heterogeneous and autonomous entities, so it is natural that each member has its own set of core values. Since these values drive, to some extent, the behaviour of the involved entities, the ability to quickly identify partners with compatible or common core values is an important element for the success of collaborative networks. However, tools to assess or measure the level of alignment of core values are lacking. Since the concept of 'alignment' in this context is still ill defined and shows a multifaceted nature, three perspectives are discussed. The first uses a causal-maps approach to capture, structure, and represent the influence relationships among core values. This representation provides the basis to measure alignment in terms of the structural similarity and influence among value systems. The second perspective considers the compatibility and incompatibility among core values in order to define the alignment level. Under this perspective we propose a fuzzy inference system to estimate the alignment level, since this approach can deal with variables that are vaguely defined and whose inter-relationships are difficult to specify. Another advantage of this method is the possibility of incorporating expert human judgement in the definition of the alignment level. The last perspective uses a Bayesian belief network method and was selected in order to assess the alignment level based on members' past behaviour. An example of application is presented in which the details of each method are discussed.
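
To make the fuzzy-inference perspective concrete, here is a minimal hand-rolled, zero-order Sugeno-style sketch that maps a core-value compatibility score onto an alignment level. The membership functions, rule base and defuzzification choice are illustrative assumptions, not the system proposed in the paper.

```python
# Hedged sketch of the fuzzy-inference perspective: a tiny rule base mapping a
# core-value compatibility score in [0, 1] to an alignment level. Membership
# functions, rule outputs and defuzzification are illustrative assumptions.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def alignment_level(compatibility):
    # Degree to which the input is "low", "medium" or "high" compatibility.
    low = tri(compatibility, -0.01, 0.0, 0.5)
    med = tri(compatibility, 0.0, 0.5, 1.0)
    high = tri(compatibility, 0.5, 1.0, 1.01)
    # Rules: low -> 0.1, medium -> 0.5, high -> 0.9 (assumed singleton
    # outputs), combined by a weighted average (defuzzification).
    weights = [low, med, high]
    outputs = [0.1, 0.5, 0.9]
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0

print(alignment_level(0.7))   # e.g. a moderately to well aligned pair
```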

Relevance: 30.00%

Publisher:

Abstract:

Recent integrated circuit technologies have opened the possibility of designing parallel architectures with hundreds of cores on a single chip. The design space of these parallel architectures is huge, with many architectural options. Exploring the design space becomes even more difficult if, beyond performance and area, we also consider metrics such as performance efficiency and area efficiency, where the designer tries to obtain the best performance per chip area and the best sustainable performance. In this paper we present an algorithm-oriented approach to designing a many-core architecture. Instead of exploring the design space of the many-core architecture based on the experimental execution results of a particular benchmark of algorithms, our approach is to perform a formal analysis of the algorithms considering the main architectural aspects and to determine how each architectural aspect relates to the performance of the architecture when running an algorithm or set of algorithms. The architectural aspects considered include the number of cores, the local memory available in each core, the communication bandwidth between the many-core architecture and the external memory, and the memory hierarchy. To exemplify the approach, we carried out a theoretical analysis of a dense matrix multiplication algorithm and determined an equation that relates the number of execution cycles to the architectural parameters. Based on this equation, a many-core architecture has been designed. The results obtained indicate that a 100 mm² integrated circuit implementation of the proposed architecture, using a 65 nm technology, is able to achieve 464 GFLOPs (double-precision floating point) for a memory bandwidth of 16 GB/s, which corresponds to a performance efficiency of 71%. Considering a 45 nm technology, a 100 mm² chip attains 833 GFLOPs, which corresponds to 84% of peak performance. These figures are better than those obtained by previous many-core architectures, except for the area efficiency, which is limited by the lower memory bandwidth considered. The results achieved are also better than those of previous state-of-the-art many-core architectures designed specifically to achieve high performance for matrix multiplication.
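
The sketch below illustrates the flavour of such an algorithm-oriented model: an estimate of dense matrix multiplication execution time on a many-core chip from the core count, per-core local memory and external memory bandwidth. The blocking scheme and all numbers are assumptions for illustration; this is not the equation derived in the paper.

```python
# Hedged sketch of an algorithm-oriented model (not the paper's exact
# equation): estimated execution time of a blocked N x N x N double-precision
# matrix multiplication on a many-core chip, bounded either by compute or by
# external-memory traffic. All parameters below are illustrative assumptions.
def matmul_time_seconds(n, cores, flops_per_core_per_s, local_mem_bytes,
                        ext_bandwidth_bytes_per_s, word_bytes=8):
    # Largest square block of C (plus A and B panels) that fits in local
    # memory: roughly 3 * b^2 words per core for a simple blocking scheme.
    b = max(1, int((local_mem_bytes / (3 * word_bytes)) ** 0.5))
    flops = 2.0 * n ** 3                      # multiply-adds of dense matmul
    compute_time = flops / (cores * flops_per_core_per_s)
    # Each b x b block of C streams A and B panels once (~2 * n * b words),
    # and there are (n / b)^2 such blocks shared among the cores.
    traffic_bytes = (n / b) ** 2 * 2.0 * n * b * word_bytes
    memory_time = traffic_bytes / ext_bandwidth_bytes_per_s
    return max(compute_time, memory_time)     # the slower resource dominates

# Illustrative numbers only: 256 cores, 2 GFLOP/s per core, 32 KiB of local
# memory per core, 16 GB/s external bandwidth, 4096 x 4096 matrices.
t = matmul_time_seconds(4096, 256, 2e9, 32 * 1024, 16e9)
print(f"estimated time: {t:.3f} s, "
      f"sustained {2.0 * 4096**3 / t / 1e9:.1f} GFLOP/s")
```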

Relevance: 30.00%

Publisher:

Abstract:

Sandwich structures with soft cores are widely used in applications where a high bending stiffness is required without compromising the global weight of the structure, as well as in situations where good thermal and damping properties are important parameters to observe. As equivalent single-layer approaches are not the most adequate for realistically describing the kinematics, the stress distributions and the dynamic behaviour of this type of sandwich, where shear deformations and the extensibility of the core can be very significant, layerwise models may provide better solutions. Additionally, and in connection with this multilayer approach, selecting different shear deformation theories according to the nature of the material that constitutes the core and the outer skins can predict the sandwich behaviour more accurately. In the present work the authors consider the use of different shear deformation theories to formulate different layerwise models, implemented through kriging-based finite elements. The viscoelastic behaviour of the sandwich core material is modelled using the complex-modulus approach, and the dynamic problem is solved in the frequency domain. The outer elastic layers considered in this work may also be made from different nanocomposites. The performance of the models developed is illustrated through a set of test cases.
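
As a pointer to what the complex-modulus, frequency-domain treatment of a viscoelastic core amounts to, the sketch below applies a complex stiffness to a single degree of freedom and sweeps the excitation frequency. It is a deliberately reduced illustration with assumed parameter values, not the layerwise kriging-based finite element formulation of the paper.

```python
# Hedged sketch of the complex-modulus approach in the frequency domain,
# reduced to a single degree of freedom: the viscoelastic core contributes a
# complex stiffness k * (1 + i*eta). Parameter values are illustrative only.
import numpy as np

m = 1.0            # modal mass (kg)
k = 1.0e4          # modal stiffness (N/m)
eta = 0.3          # core loss factor (taken frequency-independent for brevity)

f = np.linspace(1.0, 100.0, 1000)                 # excitation frequencies (Hz)
omega = 2.0 * np.pi * f
k_complex = k * (1.0 + 1j * eta)                  # complex (hysteretic) stiffness
receptance = 1.0 / (k_complex - m * omega ** 2)   # displacement per unit force

peak = f[np.argmax(np.abs(receptance))]
print(f"resonance near {peak:.1f} Hz, "
      f"peak |H| = {np.abs(receptance).max():.2e} m/N")
```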

Relevance: 30.00%

Publisher:

Abstract:

Distributed real-time systems such as automotive applications are becoming larger and more complex, thus requiring more powerful hardware and software architectures. Furthermore, these distributed applications commonly have stringent real-time constraints. This implies that such applications would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join Parallel/Distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time-Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach, which allows us to express the relations between variables in the form of constraints. Our approach is guaranteed to find a feasible solution, if one exists, in contrast to approaches based on heuristics. Furthermore, approaches based on constraint programming have been shown to obtain solutions for this type of formulation in reasonable time.
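
A toy version of such a constraint programming formulation is sketched below with Google OR-Tools CP-SAT: subtasks are assigned to cores subject to a per-core execution-time budget. The solver choice, task set and budget are assumptions for illustration, and the sketch omits the fixed-priority schedulability and network constraints handled in the paper.

```python
# Hedged toy sketch (not the paper's formulation): allocate fork-join subtasks
# to cores with a constraint solver, here Google OR-Tools CP-SAT. Subtask
# WCETs, the per-core budget and the platform are invented examples.
from ortools.sat.python import cp_model

wcet = [12, 7, 9, 4, 15, 6]      # worst-case execution times of subtasks (ms)
cores = 3
capacity = 25                    # per-core budget within the period (ms)

model = cp_model.CpModel()
# x[i][c] == 1 iff subtask i runs on core c.
x = [[model.NewBoolVar(f"x_{i}_{c}") for c in range(cores)]
     for i in range(len(wcet))]

for i in range(len(wcet)):                       # each subtask on exactly one core
    model.Add(sum(x[i]) == 1)
for c in range(cores):                           # respect each core's budget
    model.Add(sum(wcet[i] * x[i][c] for i in range(len(wcet))) <= capacity)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for c in range(cores):
        tasks = [i for i in range(len(wcet)) if solver.Value(x[i][c])]
        print(f"core {c}: subtasks {tasks}")
else:
    print("no feasible allocation")
```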

Relevance: 30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance: 30.00%

Publisher:

Abstract:

Lipid nanoballoons integrating multiple emulsions of the water-in-oil-in-water type enclose, at least in theory, a biomimetic aqueous core suitable for housing hydrophilic biomolecules such as proteins, peptides and bacteriophage particles. The research effort reported in this paper describes a full statistical 2³×3¹ factorial design study (three variables at two levels and one variable at three levels) to optimize biomimetic aqueous-core lipid nanoballoons for housing hydrophilic protein entities. The concentrations of protein, lipophilic and hydrophilic emulsifiers, and the homogenization speed were set as the four independent variables, whereas the mean particle hydrodynamic size (HS), zeta potential (ZP) and polydispersity index (PI) were set as the dependent variables. The 2³×3¹ factorial design constructed led to optimization of the higher (+1) and lower (-1) levels, with triplicate testing for the central (0) level, thus producing thirty-three experiments and leading to selection of the optimized processing parameters as 0.015% (w/w) protein entity, 0.75% (w/w) lipophilic emulsifier (soybean lecithin) and 0.50% (w/w) hydrophilic emulsifier (poloxamer 188). In the present research effort, statistical optimization and production of protein derivatives encompassing full stabilization of their three-dimensional structure has been attempted by housing said molecular entities within biomimetic aqueous-core lipid nanoballoons integrating a multiple (W/O/W) emulsion.
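
For reference, the run enumeration of a 2³×3¹ full factorial design in coded levels can be generated as below; factor names follow the abstract, and treating homogenization speed as the three-level factor is an assumption made only for this illustration.

```python
# Hedged sketch: enumerating the runs of a 2^3 x 3^1 full factorial design in
# coded levels. Factor names mirror the abstract; which factor carries three
# levels is an assumption for illustration, not taken from the paper.
from itertools import product

two_level = [-1, +1]             # lower / higher levels
three_level = [-1, 0, +1]        # lower / central / higher levels

factors = ("protein", "lipophilic_emulsifier", "hydrophilic_emulsifier")
runs = [dict(zip(factors, levels), homogenization_speed=s)
        for levels in product(two_level, repeat=3)
        for s in three_level]

print(len(runs))                 # 2**3 * 3 = 24 distinct factor combinations
print(runs[0])                   # first coded run of the design matrix
```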

Relevance: 30.00%

Publisher:

Abstract:

This article presents and explores the axioms and core ideas, or idées-force, of the Fascist ideologies of the first third of the twentieth century. The aim is to identify the features that define the term "Classical Fascism" as a conceptual category in the study of politics and to uncover the core ideas of its political theory. This analysis requires an appraisal both of the idées-force themselves and of the political use that is made of them. If these appreciations are correct, Classical Fascism is characterized by a set of ideological and political aims and methods in which ideas, attitudes and behaviours are determined, first, by an anti-democratic palingenetic ultranationalism underpinned by a sacralized ideology; second, by the quest for a united, indissoluble society as a political system and, at the same time, as the collective myth that mobilizes and redeems the nation; and third, by violence as a political vehicle applied unchecked against internal opposition and against external enemies who challenge the nation's progression towards the dream of rebirth and the culmination of this progression in the form of an empire.

Relevance: 30.00%

Publisher:

Abstract:

We study two cooperative solutions of a market with indivisible goods modeled as a generalized assignment game: Set-wise stability and Core. We first establish that the Set-wise stable set is contained in the Core and that it contains the non-empty set of competitive equilibrium payoffs. We then state and prove three limit results for replicated markets. First, the sequence of Cores of replicated markets converges to the set of competitive equilibrium payoffs when the number of replicas tends to infinity. Second, the Set-wise stable set of a two-fold replicated market already coincides with the set of competitive equilibrium payoffs. Third, for any number of replicas there is a market with a Core payoff that is not a competitive equilibrium payoff.
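
For readers unfamiliar with the terminology, the core condition the abstract relies on can be stated in the usual TU-game notation (the paper's generalized assignment game formulation is not reproduced here):

\[
  C(v) \;=\; \Bigl\{\, x \in \mathbb{R}^{N} \;\Bigm|\;
     \sum_{i \in N} x_i = v(N)
     \;\text{ and }\;
     \sum_{i \in S} x_i \ge v(S) \ \ \forall\, S \subseteq N \,\Bigr\},
\]

i.e. the set of efficient payoff vectors that no coalition can improve upon.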

Relevance: 30.00%

Publisher:

Abstract:

The aim of this paper is to measure the returns to human capital. We use a unique data set consisting of matched employer-employee information. The data on individuals' human capital include a set of 26 competences that capture the utilization of workers' skills in a very detailed way. Thus, we can expand the concept of human capital and discuss which types of skills are more productive in the workplace and, hence, generate a higher payoff for the workers. The rich information on firm and workplace characteristics allows us to introduce a broad range of controls and to improve on previous research in this field. This paper gives evidence that the returns to generic competences differ depending on the position of the worker in the firm. Only numeracy skills are rewarded independently of the occupational status of the worker. The level of technology used by the firm in the production process does not directly increase workers' pay, but it influences the pay-off to some of the competences. JEL Classification: J24, J31.
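
A sketch of the kind of competence-augmented wage regression described here, written with the statsmodels formula API, is given below; the column names, the interactions chosen and the tiny synthetic data frame are assumptions, not the matched employer-employee data set used in the paper.

```python
# Hedged sketch of a competence-augmented wage regression (Mincer-style),
# using the statsmodels formula API. Column names and the synthetic data
# frame are illustrative assumptions, not the paper's matched data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "log_wage":  [2.1, 2.4, 2.2, 2.8, 2.6, 3.0, 2.3, 2.9],
    "numeracy":  [3, 4, 3, 5, 4, 5, 3, 5],      # competence scores (1-5)
    "teamwork":  [4, 3, 4, 4, 5, 4, 3, 5],
    "manager":   [0, 0, 0, 1, 0, 1, 0, 1],      # occupational status dummy
    "high_tech": [0, 1, 0, 1, 1, 1, 0, 1],      # firm technology level dummy
})

# Interactions let the pay-off to a competence differ with the worker's
# position in the firm and with the firm's technology, as in the abstract.
model = smf.ols("log_wage ~ numeracy + teamwork * manager + numeracy:high_tech",
                data=df).fit()
print(model.params)
```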

Relevance: 30.00%

Publisher:

Abstract:

To obtain a state-of-the-art benchmark potential energy surface (PES) for the archetypal oxidative addition of the methane C-H bond to the palladium atom, we have explored this PES using a hierarchical series of ab initio methods (Hartree-Fock, second-order Møller-Plesset perturbation theory, fourth-order Møller-Plesset perturbation theory with single, double and quadruple excitations, coupled cluster theory with single and double excitations (CCSD), and with triple excitations treated perturbatively [CCSD(T)]) and hybrid density functional theory using the B3LYP functional, in combination with a hierarchical series of ten Gaussian-type basis sets, up to g polarization. Relativistic effects are taken into account either through a relativistic effective core potential for palladium or through a full four-component all-electron approach. Counterpoise-corrected relative energies of stationary points are converged to within 0.1-0.2 kcal/mol as a function of the basis-set size. Our best estimate of kinetic and thermodynamic parameters is -8.1 (-8.3) kcal/mol for the formation of the reactant complex, 5.8 (3.1) kcal/mol for the activation energy relative to the separate reactants, and 0.8 (-1.2) kcal/mol for the reaction energy (zero-point vibrational energy-corrected values in parentheses). This agrees well with available experimental data. Our work highlights the importance of sufficient higher angular momentum polarization functions, f and g, for correctly describing metal d-electron correlation and, thus, for obtaining reliable relative energies. We show that standard basis sets, such as LANL2DZ+1f for palladium, are not sufficiently polarized for this purpose and lead to erroneous CCSD(T) results. B3LYP is associated with smaller basis set superposition errors and shows faster convergence with basis-set size, but yields relative energies (in particular, a reaction barrier) that are ca. 3.5 kcal/mol higher than the corresponding CCSD(T) values.
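
The counterpoise-corrected energies mentioned above presumably follow the standard Boys-Bernardi scheme, which for a complex AB evaluates every fragment in the full dimer basis:

\[
  \Delta E_{\mathrm{int}}^{\mathrm{CP}}
    \;=\; E_{AB}^{AB} \;-\; E_{A}^{AB} \;-\; E_{B}^{AB},
\]

where subscripts denote the system actually computed and superscripts the basis set employed (ghost functions placed on the absent fragment).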

Relevance: 30.00%

Publisher:

Abstract:

The set covering problem is an NP-hard combinatorial optimization problem that arises in applications ranging from crew scheduling in airlines to driver scheduling in public mass transport. In this paper we analyze search space characteristics of a widely used set of benchmark instances through an analysis of the fitness-distance correlation. This analysis shows that there exist several classes of set covering instances that have a largely different behavior. For instances with high fitness-distance correlation, we propose new ways of generating core problems and analyze the performance of algorithms exploiting these core problems.
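
Fitness-distance correlation is simply the Pearson correlation between solution quality and distance to the best known solution; the sketch below computes it for synthetic values, since the benchmark instances themselves are not reproduced here.

```python
# Hedged sketch: fitness-distance correlation (FDC) as the Pearson correlation
# between a sample of solution costs and their distances to the best known
# solution. The arrays below are synthetic, not the benchmark instances.
import numpy as np

def fdc(fitness, distance):
    """Pearson correlation between fitness values and distances to the optimum."""
    f, d = np.asarray(fitness, float), np.asarray(distance, float)
    return float(np.corrcoef(f, d)[0, 1])

# Synthetic local optima of a minimization problem (e.g. set covering costs)
# and their Hamming distances to the best known cover.
costs = [512, 520, 530, 518, 545, 560, 509, 533]
dists = [3, 6, 9, 5, 14, 17, 2, 10]
print(f"FDC = {fdc(costs, dists):.2f}")   # near +1: cost grows with distance
```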

Relevance: 30.00%

Publisher:

Abstract:

The monotonic core of a cooperative game with transferable utility (TU game) is the set formed by all its Population Monotonic Allocation Schemes. In this paper we show that this set always coincides with the core of a certain game associated with the initial game.
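
For completeness, a Population Monotonic Allocation Scheme in Sprumont's sense is a family \(x = (x_{S,i})_{S \subseteq N,\, i \in S}\) for a TU game \((N, v)\) satisfying the standard conditions below (included only to make the abstract self-contained):

\[
  \sum_{i \in S} x_{S,i} = v(S) \quad \forall\, S \subseteq N,
  \qquad
  x_{S,i} \le x_{T,i} \quad \forall\, S \subseteq T \subseteq N,\ \forall\, i \in S,
\]

i.e. every coalition allocates its worth exactly, and no player is paid less when the coalition grows.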