927 results for Capability Hierarchy


Relevance:

20.00%

Publisher:

Abstract:

The miniaturization race in the hardware industry, aimed at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes critical. Two primary aspects introduce a sophisticated trade-off. On the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth. On the other hand, it should hide the heterogeneous hardware structure from the end user in order to support feasible high-level programmability of the system. This thesis explores heterogeneous reconfigurable hardware architectures and presents possible solutions to the problem of memory organization and data structure. Using the MORPHEUS heterogeneous platform as an example, the discussion follows the complete design cycle, from decision making and justification through hardware realization. Particular emphasis is placed on methods to support high system performance, meet application requirements, and provide a user-friendly programmer interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which accomplishes its task by separating computation from communication, supplying the reconfigurable engines with computation and configuration data, and unifying the heterogeneous computational devices through local storage buffers.
It is distinguished from related solutions by its distributed data-flow organization, mechanisms specifically engineered to operate on data in local domains, a communication infrastructure based on a Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel technique to accelerate memory access was developed and implemented.
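The separation of computation from communication via local storage buffers is commonly realized with a double-buffered ("ping-pong") scheme: while an engine computes on one buffer, the interconnect fills the other. The sketch below illustrates the idea only; the function names, tile size, and kernel are assumptions for illustration, not taken from the MORPHEUS design, and the overlap of fetch and compute is merely conceptual in this serial simulation.

```python
def process(tile):
    """Stand-in for a reconfigurable engine's kernel (here: scale by 2)."""
    return [x * 2 for x in tile]

def stream_tiles(data, tile_size):
    """Split the input stream into fixed-size tiles, as a DMA engine would."""
    for i in range(0, len(data), tile_size):
        yield data[i:i + tile_size]

def double_buffered_run(data, tile_size=4):
    buffers = [None, None]       # two local storage buffers
    results = []
    fill, compute = 0, 1         # buffer roles, swapped every iteration
    tiles = stream_tiles(data, tile_size)
    buffers[fill] = next(tiles, None)              # prefetch the first tile
    while buffers[fill] is not None:
        fill, compute = compute, fill              # the fetched tile becomes the work tile
        buffers[fill] = next(tiles, None)          # "communication": fetch the next tile
        results.extend(process(buffers[compute]))  # "computation": overlaps the fetch
    return results

print(double_buffered_run(list(range(8))))  # → [0, 2, 4, 6, 8, 10, 12, 14]
```

In hardware, the two roles run concurrently, so a compute engine never stalls waiting for data as long as each fetch finishes within one compute pass.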

Relevance:

20.00%

Publisher:

Abstract:

This study aims to provide a theoretical framework encompassing the two approaches to entrepreneurial opportunity (opportunity discovery and opportunity creation) by outlining a trajectory from firm creation, to capability development, to firm performance in the short term (firm survival) and the medium/long term (growth rate). A set of empirically testable hypotheses is proposed and tested by performing qualitative analysis of interviews with a small sample of entrepreneurs and event history analysis on a large sample of firms founded in the United States in 2004.

Relevance:

20.00%

Publisher:

Abstract:

The present research aims to shed light on the puzzle of child undernutrition in India. The so-called 'Indian development paradox' denotes the phenomenon whereby rising income per capita is recorded alongside a sluggish reduction in the proportion of underweight children below three years of age. Between 2000 and 2005, real Gross Domestic Product per capita grew at 5.4% annually, whereas the proportion of underweight children declined from 47% to 46%, a mere one percentage point. This trend opens up space for questioning the traditionally assumed link between income poverty and undernutrition, as well as food intervention as the main focus of policies designed to fight child hunger. It also opens the door to evaluating an alternative economic approach to explaining undernutrition, the Capability Approach, which argues for widening the informational basis to account not only for resources, but also for variables related to liberties, opportunities and autonomy in pursuing what individuals value. The econometric analysis highlights the relevance of including behavioral factors when explaining child undernutrition. In particular, the ability of the mother to move freely in the community without needing to ask permission from her husband or mother-in-law is statistically significant when included in the model, which also accounts for confounding traditional variables such as economic wealth and food security. Focusing on agency, the results indicate the need to measure autonomy in different domains and to improve the measurement scale for agency data, especially with regard to the domain of household duties.
Finally, future research is required to investigate policy avenues for increasing agency among women and in the communities they live in as a viable strategy for reducing child undernutrition in India.
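The kind of binary-outcome specification described above can be sketched as a stylized logit. Everything below is an assumption for illustration: the coefficients, variable names, and values are made up and are not the thesis's estimates; the point is only the model form (underweight status regressed on wealth, food security, and maternal freedom of movement).

```python
import math

def p_underweight(wealth_z, food_secure, moves_freely,
                  b0=0.2, b_wealth=-0.4, b_food=-0.5, b_agency=-0.6):
    """Logit model: P(child underweight) given household covariates.
    All coefficients are invented, purely illustrative values."""
    logit = b0 + b_wealth * wealth_z + b_food * food_secure + b_agency * moves_freely
    return 1 / (1 + math.exp(-logit))

# Holding wealth and food security fixed, the agency variable shifts the
# predicted probability downward in this stylized model:
p_no_agency = p_underweight(wealth_z=0.0, food_secure=1, moves_freely=0)
p_agency = p_underweight(wealth_z=0.0, food_secure=1, moves_freely=1)
print(p_agency < p_no_agency)
```

Including the agency regressor alongside wealth and food security is what lets the model separate behavioral factors from the confounding economic variables.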

Relevance:

20.00%

Publisher:

Abstract:

Laser shock peening (LSP) is a technique similar to shot peening that imparts compressive residual stresses in materials to improve fatigue resistance. The ability to use a high-energy laser pulse to generate shock waves, inducing a compressive residual stress field in metallic materials, has applications in multiple fields such as turbo-machinery, airframe structures, and medical appliances. The transient nature of the LSP phenomenon and the extremely short time scale of the laser pulse make real-time in-situ measurement of laser/material interaction very challenging. For this reason, and because of the high cost of experimental tests, reliable analytical methods for predicting the detailed effects of LSP are needed to understand the potential of the process. The aim of this work was to predict the residual stress field after laser shock peening by means of finite element modeling. The work was carried out in the Stress Methods department of Airbus Operations GmbH (Hamburg) and includes an investigation of the compressive residual stresses induced by laser shock peening, a mesh sensitivity study, optimization and tuning of the model using physical and numerical parameters, and validation of the model against experimental results. The model was built with the Abaqus/Explicit commercial software, starting from considerations drawn from previous works. FE analyses are mesh sensitive: by increasing the number of elements and decreasing their size, the model can resolve finer details of the real phenomenon. However, these details may be mere numerical amplifications of the real phenomenon, so it was necessary to optimize the elements' size and number. A new model was created with a finer mesh in the through-thickness direction, since this is the direction most involved in the process deformations.
This increase in the global number of elements was offset by coarsening the in-plane mesh far from the peened area, to avoid excessive computational costs. The efficiency and stability of the analyses were improved by using bulk viscosity coefficients, a purely numerical parameter available in Abaqus/Explicit. A plastic rate sensitivity study was also carried out, and a new set of Johnson-Cook model coefficients was chosen. These investigations led to a more controllable and reliable model, valid even for more complex geometries. Moreover, the study of the material properties highlighted a gap in the model concerning the simulation of surface conditions. Modeling the ablative layer employed in the real process was used to fill this gap. In the real process, the ablative layer is a very thin sheet of pure aluminum stuck onto the workpiece; in the simulation it was reproduced simply as a 100 µm layer of a material with a yield point of 10 MPa. All these new settings were applied to a set of analyses with different geometry models to verify the robustness of the model. The calibration of the model against the experimental results was based on stress and displacement measurements carried out both on the surface and in depth. The good correlation between the simulation and the experimental results proved the model to be reliable.
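The rate-sensitivity study mentioned above concerns the standard Johnson-Cook flow-stress law, which multiplies strain hardening, strain-rate hardening, and thermal softening terms. The sketch below uses generic aluminium-alloy constants chosen purely for illustration; they are not the calibrated coefficient set from this work.

```python
import math

def johnson_cook_stress(eps_p, eps_rate, T,
                        A=350e6, B=440e6, n=0.42, C=0.015,
                        eps_rate_ref=1.0, T_room=293.0, T_melt=893.0, m=1.0):
    """Johnson-Cook flow stress [Pa]:
    (strain hardening) x (strain-rate hardening) x (thermal softening).
    Constants are generic illustrative values, not calibrated ones."""
    hardening = A + B * eps_p ** n
    # Clamp sub-reference rates to avoid a negative logarithm:
    rate = 1.0 + C * math.log(max(eps_rate / eps_rate_ref, 1.0))
    T_star = (T - T_room) / (T_melt - T_room)   # homologous temperature
    softening = 1.0 - T_star ** m
    return hardening * rate * softening

# Shock loading drives strain rates toward 1e6 1/s, so the rate-hardening
# term raises the flow stress relative to quasi-static loading:
quasi_static = johnson_cook_stress(eps_p=0.05, eps_rate=1.0, T=293.0)
shock = johnson_cook_stress(eps_p=0.05, eps_rate=1e6, T=293.0)
print(shock > quasi_static)
```

Because the rate term grows with the logarithm of the strain rate, the coefficient C chosen in the calibration directly controls how strongly the simulated shock raises the flow stress.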

Relevance:

20.00%

Publisher:

Abstract:

Mapping and monitoring are believed to provide an early warning sign to determine when to stop tumor removal to avoid mechanical damage to the corticospinal tract (CST). The objective of this study was to systematically compare subcortical monopolar stimulation thresholds (1-20 mA) with direct cortical stimulation (DCS)-motor evoked potential (MEP) monitoring signal abnormalities and to correlate both with new postoperative motor deficits. The authors sought to define a mapping threshold and DCS-MEP monitoring signal changes indicating a minimal safe distance from the CST.

Relevance:

20.00%

Publisher:

Abstract:

Class II cavities were prepared in extracted lower molars, then filled and cured in three 2-mm increments using a metal matrix. Three composites (Spectrum TPH A4, Ceram X mono M7 and Tetric Ceram A4) were cured with both the SmartLite PS LED LCU and the Spectrum 800 continuous-cure halogen LCU using curing cycles of 10, 20 and 40 seconds. Each increment was cured before adding the next. After a seven-day incubation period, the composite specimens were removed from the teeth, embedded in self-curing resin and ground to half the orofacial width. Knoop microhardness was determined 100, 200, 500, 1000, 1500, 2500, 3500, 4500 and 5500 µm from the occlusal surface, at distances of 150 µm and 1000 µm from the metal matrix. The total degree of polymerization of a composite specimen for any given curing time and curing light was determined by calculating the area under the hardness curve. Hardness values 150 µm from the metal matrix never reached maximum values and were generally lower than those 1000 µm from the matrix. The hardest composite was usually encountered between 200 µm and 1000 µm from the occlusal surface. For every composite and curing-time combination, microhardness increased at the top of each increment (measurements at 500, 2500 and 4500 µm) and decreased towards the bottom of each increment (measurements at 1500, 3500 and 5500 µm). Longer curing times generally produced harder composite samples. Spectrum TPH was the only composite showing a satisfactory degree of polymerization for all three curing times and both LCUs. Multiple linear regression showed that only curing time (p < 0.001) and composite material (p < 0.001) had a significant association with the degree of polymerization. The degree of polymerization achieved by the LED LCU was not significantly different from that achieved by the halogen LCU (p = 0.54).
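The area-under-the-hardness-curve score can be computed with the trapezoidal rule over the depth profile. In the sketch below, the depths are those used in the study, but the hardness values are invented solely to demonstrate the calculation.

```python
def area_under_curve(depths_um, hardness):
    """Trapezoidal integral of Knoop hardness over depth (µm)."""
    total = 0.0
    pairs = list(zip(depths_um, hardness))
    for (d0, h0), (d1, h1) in zip(pairs, pairs[1:]):
        total += (d1 - d0) * (h0 + h1) / 2.0
    return total

depths = [100, 200, 500, 1000, 1500, 2500, 3500, 4500, 5500]  # µm, as in the study
khn = [55, 60, 62, 58, 50, 57, 49, 55, 47]                    # illustrative hardness values
print(area_under_curve(depths, khn))
```

Integrating over depth, rather than comparing single-point readings, summarizes polymerization across all three increments in one number per specimen, which is what makes the curing-time and material comparisons possible.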

Relevance:

20.00%

Publisher:

Abstract:

Health and well-being derive directly from the active contribution of individuals and collectives. At the same time, health-related action can only be understood and influenced by taking the respective physical and social contexts into account. Accordingly, an orientation of modern health promotion toward people's health-relevant resources and scopes of action is proposed here. This requires theoretical foundations that coherently connect questions of social inequality with the practical approaches of empowerment and participation. To this end, the authors present Amartya Sen's Capability Approach (CA) and supplement it with insights from Pierre Bourdieu's theory of capital and interaction. Both approaches illuminate fundamental questions of social inequality and can be fruitfully combined with the guiding concepts of the Ottawa Charter. They thereby also provide guidance for new research directions investigating the complex interactions between social contexts and health-related action.