971 results for traditional design


Relevance:

30.00%

Publisher:

Abstract:

As the concept of renewable energy becomes increasingly important in modern society, a considerable amount of research has been conducted in the field of organic photovoltaics in recent years. Although organic solar cells have generally had lower efficiencies than silicon solar cells, they have the potential to be mass produced via solution processing. A common polymer solar cell architecture relies on a bulk heterojunction of P3HT (electron donor) and PCBM (electron acceptor). One of the main issues with this configuration is that, in order to compensate for the high exciton recombination rate, the photoactive layer is often made very thin (on the order of 100 nm). This results in low solar cell photocurrents due to low absorption. This thesis investigates a novel method of light trapping that couples surface plasmons at the electrode interface via surface relief gratings, leading to electromagnetic field enhancement and increased photoabsorption. Experimental work was first conducted on developing and optimizing a transparent oxide/metal/oxide (MAM) electrode to replace the traditional ITO electrode, since the azopolymer gratings cannot withstand the high-temperature processing of ITO films. It was determined that, given the right thickness profiles and deposition conditions, the MAM stack can achieve transmittance and conductivity similar to ITO films. Experimental work was also conducted on the fabrication and characterization of surface relief gratings, as well as verification of the surface plasmon generation. Surface relief gratings were fabricated easily and accurately via laser interference lithography on photosensitive azopolymer films. Laser diffraction studies confirmed the grating pitch, which depends on the incident angle and wavelength of the writing beam. AFM experiments were conducted to determine the surface morphology of the gratings before and after metallic film deposition. It was concluded that metallic film deposition does not significantly alter the grating morphology.
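
The dependence of the grating pitch on the writing geometry follows the standard two-beam interference relation Λ = λ / (2 sin θ); a minimal sketch, with illustrative values rather than the thesis's actual writing parameters:

```python
import math

def interference_pitch(wavelength_nm: float, half_angle_deg: float) -> float:
    """Grating pitch from two-beam laser interference: Lambda = lambda / (2 sin(theta)).

    wavelength_nm  -- wavelength of the writing beam (nm)
    half_angle_deg -- half-angle between the interfering beams (degrees)
    """
    theta = math.radians(half_angle_deg)
    return wavelength_nm / (2.0 * math.sin(theta))

# Example: a 532 nm writing beam at a 15 degree half-angle (illustrative values only)
print(f"pitch ≈ {interference_pitch(532, 15):.0f} nm")
```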

Relevance:

30.00%

Publisher:

Abstract:

Over the last couple of years, MOOCs have received a great deal of attention, and more and more universities have started offering them. Although the open dimension of the MOOC suggests openness in every respect, in most cases a MOOC is a course with a structure and a timeline within which learning activities are positioned. There is a contradiction there: the open aspect places MOOCs in the non-formal, professional learning domain, while the course structure pulls them into the formal, traditional education domain. Accordingly, there is no consensus yet on solid pedagogical approaches for MOOCs. Something similar can be said for learning analytics, another emerging concept that is receiving a lot of attention. Given its nature, learning analytics offers great potential to support learners, particularly in MOOCs. Learning analytics should then be applied to assist learners and teachers in understanding the learning process; it can predict learning, provide opportunities for proactive feedback, and should also result in interventions aimed at improving progress. This paper illustrates pedagogical and learning analytics approaches based on practices developed in formal online and distance teaching university education that have been fine-tuned for MOOCs and piloted in the context of the EU-funded MOOC projects ECO (Elearning, Communication, Open-Data: http://ecolearning.eu) and EMMA (European Multiple MOOC Aggregator: http://platform.europeanmoocs.eu).

Relevance:

30.00%

Publisher:

Abstract:

This study had three objectives: (1) to develop a comprehensive truck simulation that executes rapidly, has a modular program construction to allow variation of vehicle characteristics, and is able to realistically predict vehicle motion and the tire-road surface interaction forces; (2) to develop a model of doweled Portland cement concrete pavement that can be used to determine slab deflection and stress at predetermined nodes, and that allows for the variation of traditional thickness design factors; and (3) to implement these two models on a workstation with suitable menu-driven modules so that both existing and proposed pavements can be evaluated with respect to design life, given specific characteristics of the heavy vehicles that will be using the facility. This report summarizes the work performed during the first year of the study. Briefly, the following has been accomplished: a two-dimensional model of a typical 3-S2 tractor-trailer combination was created; a finite element structural analysis program, ANSYS, was used to model the pavement; and computer runs have been performed varying the parameters defining both vehicle and road elements. The resulting time-specific displacements for each node are plotted, and the displacement basin is generated for defined vehicles. Relative damage to the pavement can then be estimated; a damage function resulting from load replications must be assumed, and further pavement deterioration will reflect it. Comparison with actual damage on Interstate 80 will eventually allow verification of these procedures.
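
The report leaves the damage function to be assumed; one common assumption, shown here purely as an illustration (the study does not specify this form), is a fourth-power load-equivalency rule:

```python
def load_equivalency_factor(axle_load_kips: float,
                            reference_load_kips: float = 18.0,
                            exponent: float = 4.0) -> float:
    """Relative damage of one axle pass via a simple power-law damage function.

    Classic AASHTO-style rule of thumb: damage ~ (load / reference)^4.
    The exponent and reference load are assumptions, not values from the study.
    """
    return (axle_load_kips / reference_load_kips) ** exponent

# Example: relative damage of several single-axle loads vs. the 18-kip reference
for load in (12.0, 18.0, 20.0, 24.0):
    print(f"{load:4.0f} kip axle load -> {load_equivalency_factor(load):5.2f} x reference damage")
```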

Relevance:

30.00%

Publisher:

Abstract:

The first Cornell Institute for Healthy Futures (CIHF) roundtable, held in April 2016, brought together senior-level executives, educators, and leaders in senior housing and care to share experiences and exchange ideas. CIHF roundtables are purposely limited to approximately 25 to 30 participants “at the table” to foster discussion on a more intimate basis than traditional conferences. In addition to the formal participants, students, faculty, and guests observed and interacted during the event and attended a separate panel discussion and reception the evening before. Students, faculty, and industry leaders also met at a working luncheon session to brainstorm ideas for recruiting and training young talent for careers in the senior housing and care industry.

Relevance:

30.00%

Publisher:

Abstract:

Performance-based design (PBD), in a deterministic approach, characterizes performance objectives with respect to desired performance levels. Performance objectives are then associated with a damage state and an established level of seismic hazard. Despite this rational approach, its application remains difficult, so reliable tools for capturing the evolution, distribution and quantification of damage are needed. In addition, all phenomena related to nonlinearity (materials and deformations) must also be taken into account. This research shows how damage mechanics can help address this problem, through an adaptation of the Modified Compression Field Theory and other complementary theories. The proposed formulation, adapted for monotonic, cyclic and pushover loading, accounts for nonlinear shear effects coupled with flexural and axial-load mechanisms. It is specifically applied to the nonlinear analysis of concrete structural elements subjected to non-negligible shear effects. This new approach, implemented in EfiCoS (a damage-mechanics-based finite element program), is presented here together with its modelling criteria. The new approach was calibrated by comparing its predictions with experimental data for reinforced concrete shear walls as well as for bridge girders and piers in which shear effects must be taken into account. This improved version of EfiCoS proved capable of accurately evaluating parameters associated with global performance, such as displacements, system strength, cyclic-response effects, and the quantification, evolution and distribution of damage. Remarkable results were also obtained for the detection of engineering limit states such as cracking, unit strains, cover spalling, core crushing, local yielding of reinforcing bars and system degradation, among others. As a practical tool for applying PBD, relations between predicted damage indices and performance levels were obtained and expressed as charts and tables. These charts were developed as functions of relative displacement (drift) and displacement ductility. A particular table was developed to relate engineering limit states, damage, drift and traditional performance levels. The results showed excellent agreement with experimental data, making the proposed formulation and the new version of EfiCoS powerful tools for applying the PBD methodology in a deterministic approach.
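
As an illustration of the kind of chart/table lookup such relations enable, a minimal sketch mapping drift to a traditional performance level (the thresholds below are hypothetical placeholders, not the calibrated values produced with EfiCoS):

```python
def performance_level(interstorey_drift_pct: float) -> str:
    """Map an interstorey drift ratio (%) to a traditional performance level.

    Threshold values are purely hypothetical placeholders; the thesis derives
    its own charts linking damage indices, drift and displacement ductility.
    """
    if interstorey_drift_pct < 0.5:
        return "Operational"
    if interstorey_drift_pct < 1.5:
        return "Immediate Occupancy"
    if interstorey_drift_pct < 2.5:
        return "Life Safety"
    return "Collapse Prevention"

for drift in (0.3, 1.0, 2.0, 3.0):
    print(f"drift = {drift:.1f}% -> {performance_level(drift)}")
```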

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

The performance, energy efficiency and cost improvements due to traditional technology scaling have begun to slow down and present diminishing returns. Underlying reasons for this trend include fundamental physical limits of transistor scaling, the growing significance of quantum effects as transistors shrink, and a growing mismatch between transistors and interconnects regarding size, speed and power. Continued Moore's Law scaling will not come from technology scaling alone; it must involve improvements to design tools and the development of new disruptive technologies such as 3D integration. 3D integration offers potential improvements to interconnect power and delay by translating the routing problem into a third dimension, and facilitates transistor density scaling independent of technology node. Furthermore, 3D IC technology opens up a new architectural design space of heterogeneously integrated high-bandwidth CPUs. Vertical integration promises to provide the CPU architectures of the future by integrating high-performance processors with on-chip high-bandwidth memory systems and highly connected network-on-chip structures. Such techniques can overcome the well-known CPU performance bottlenecks referred to as the memory wall and the communication wall. However, the promising improvements to performance and energy efficiency offered by 3D CPUs do not come without cost, both in the financial investment to develop the technology and in the increased complexity of design. Two main limitations of 3D IC technology have been heat removal and through-silicon via (TSV) reliability. Transistor stacking increases power density, current density and thermal resistance in air-cooled packages. Furthermore, the technology introduces vertical TSVs that create new points of failure in the chip and require the development of new BEOL technologies. Although these issues can be controlled to some extent using thermal- and reliability-aware physical and architectural 3D design techniques, high-performance embedded cooling schemes, such as micro-fluidic (MF) cooling, are fundamentally necessary to unlock the true potential of 3D ICs. A new paradigm is being put forth which integrates the computational, electrical, physical, thermal and reliability views of a system. The unification of these diverse aspects of integrated circuits is called Co-Design. Independent design and optimization of each aspect leads to sub-optimal designs due to a lack of understanding of cross-domain interactions and their impacts on the feasibility region of the architectural design space. Co-Design enables optimization across layers with a multi-domain view and thus unlocks new high-performance and energy-efficient configurations. Although the co-design paradigm is becoming increasingly necessary in all fields of IC design, it is even more critical in 3D ICs where, as we show, the inter-layer coupling and higher degree of connectivity between components exacerbate the interdependence between architectural parameters, physical design parameters and the multitude of metrics of interest to the designer (i.e., power, performance, temperature and reliability). In this dissertation we present a framework for multi-domain co-simulation and co-optimization of 3D CPU architectures with both air and MF cooling solutions. Finally, we propose an approach for design space exploration and modeling within the new Co-Design paradigm, and discuss possible avenues for future improvement of this work.
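
A toy sketch of co-design-style design-space exploration, where a thermal constraint from the cooling domain trims the feasible region before an energy-efficiency objective is optimized (every model and number below is a placeholder, not the dissertation's framework):

```python
from itertools import product

def evaluate(layers: int, freq_ghz: float, mf_cooling: bool) -> dict:
    """Crude placeholder models for one 3D-CPU configuration."""
    perf  = layers * freq_ghz * 10.0                        # arbitrary performance units
    power = layers * freq_ghz ** 2 * 1.5                    # W, rough dynamic-power proxy
    r_th  = (0.8 if mf_cooling else 2.5) / layers ** 0.5    # K/W, rough thermal resistance
    temp  = 45.0 + power * r_th                             # estimated die temperature, C
    return {"layers": layers, "freq_ghz": freq_ghz, "mf_cooling": mf_cooling,
            "perf": perf, "power": power, "temp": temp}

designs  = [evaluate(l, f, c)
            for l, f, c in product((1, 2, 4), (1.0, 2.0, 3.0), (False, True))]
feasible = [d for d in designs if d["temp"] <= 85.0]          # thermal constraint
best     = max(feasible, key=lambda d: d["perf"] / d["power"])  # energy-efficiency objective
print(best)
```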

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents an investigation of endoscopic optical coherence tomography (OCT). As a noninvasive imaging modality, OCT has emerged as an increasingly important diagnostic tool for many clinical applications. Despite many of its merits, such as high resolution and depth resolvability, a major limitation is the relatively shallow penetration depth in tissue (about 2–3 mm), mainly due to tissue scattering and absorption. To overcome this limitation, many different endoscopic OCT systems have been developed. By utilizing a minimally invasive endoscope, the OCT probing beam can be brought into close vicinity of the tissue of interest, bypassing the scattering of intervening tissue, so that it can collect the reflected light signal from the desired depth and provide a clear image of the physiological structure of the region, which cannot be resolved by traditional OCT. In this thesis, three endoscope designs are studied. While they rely on vastly different principles, they all converge on solving this long-standing problem.

A hand-held endoscope with manual scanning is explored first. When a user holds a hand-held endoscope to examine samples, the movement of the device provides a natural scan. We proposed and implemented an optical tracking system to estimate and record the trajectory of the device. By registering the OCT axial scans with the spatial information obtained from the tracking system, one can simply ‘paint’ a desired volume and obtain any arbitrary scanning pattern by manually waving the endoscope over the region of interest. The accuracy of the tracking system was measured to be about 10 microns, which is comparable to the lateral resolution of most OCT systems. Targeted phantom and biological samples were manually scanned, and the reconstructed images verified the method.
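
A minimal sketch of how tracked positions and A-scans can be registered into a regular volume by nearest-voxel binning; it is a simplified stand-in for the registration described here, with synthetic data and hypothetical grid parameters:

```python
import numpy as np

def paint_volume(positions_xy, ascans, grid_shape=(64, 64), extent_mm=4.0):
    """Bin freehand A-scans into a regular volume using tracked lateral positions.

    positions_xy -- (N, 2) tracked lateral positions in mm
    ascans       -- (N, depth) OCT axial scans
    """
    positions_xy = np.asarray(positions_xy, dtype=float)
    ascans = np.asarray(ascans, dtype=float)
    depth = ascans.shape[1]
    vol = np.zeros(grid_shape + (depth,))
    hits = np.zeros(grid_shape)
    # Map each tracked position to its nearest voxel column
    idx = np.clip((positions_xy / extent_mm * (np.array(grid_shape) - 1)).round().astype(int),
                  0, np.array(grid_shape) - 1)
    for (ix, iy), a in zip(idx, ascans):
        vol[ix, iy] += a
        hits[ix, iy] += 1
    vol[hits > 0] /= hits[hits > 0][:, None]   # average where several scans landed
    return vol

# Example with synthetic data: 500 random hand positions over a 4 mm x 4 mm field
rng = np.random.default_rng(0)
pos = rng.uniform(0, 4.0, size=(500, 2))
scans = rng.normal(size=(500, 256))
print(paint_volume(pos, scans).shape)   # (64, 64, 256)
```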

Next, we investigated a mechanical way to steer the beam in an OCT endoscope, termed paired-angle rotation scanning (PARS). This concept was proposed by my colleague, and we further developed the technology by enhancing the longevity of the device, reducing the diameter of the probe, and shrinking the form factor of the hand-piece. Several families of probes have been designed and fabricated with varying optical performance. They have been applied to different applications, including collector channel examination for glaucoma stent implantation and vitreous remnant detection during live-animal vitrectomy.

Lastly, a novel scanning method with no moving parts was devised, based on the electro-optic (EO) effect of a KTN crystal. With Ohmic-contact electrodes, the KTN crystal can exhibit a special mode of the EO effect, termed the space-charge-controlled electro-optic effect, in which carrier electrons are injected into the material via the Ohmic contacts. By applying a high voltage across the material, a linear phase profile can be built up in this mode, which in turn deflects the light beam passing through. We constructed a relay telescope to adapt the KTN deflector into a bench-top OCT scanning system. One of the major technical challenges for this system is the strong chromatic dispersion of the KTN crystal within the wavelength band of the OCT system. We investigated its impact on the acquired OCT images and proposed a new approach to estimate and compensate the actual dispersion. Compared with traditional methods, the new method is more computationally efficient and accurate. Biological samples were scanned with this KTN-based system, and the acquired images demonstrated the feasibility of using it in an endoscopic setting. Above all, my research aims to provide solutions for implementing an OCT endoscope. As the technology evolves from manual, to mechanical, to electrical approaches, different solutions are presented. Since each has its own advantages and disadvantages, one has to determine the actual requirements and select the best fit for a specific application.
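
For context, a minimal numerical sketch of the conventional polynomial phase-correction approach to dispersion compensation (not the new estimation method proposed in the thesis); the coefficients and the synthetic single-reflector example are illustrative only:

```python
import numpy as np

def compensate_dispersion(spectrum: np.ndarray, k: np.ndarray,
                          a2: float, a3: float) -> np.ndarray:
    """Cancel 2nd/3rd-order dispersion in a spectral-domain OCT fringe.

    spectrum -- complex spectral interferogram sampled on wavenumber grid k (1/m)
    a2, a3   -- dispersion coefficients (rad*m^2, rad*m^3); in practice chosen
                to maximize the sharpness of the resulting A-scan
    """
    dk = k - k.mean()
    corrected = spectrum * np.exp(-1j * (a2 * dk**2 + a3 * dk**3))
    return np.abs(np.fft.ifft(corrected))        # depth profile (A-scan)

# Synthetic example: a single reflector blurred by quadratic dispersion, then restored
k = np.linspace(7.0e6, 8.0e6, 2048)              # 1/m, roughly the 800 nm band
z0, a2_true = 0.2e-3, 1.2e-9
fringe = np.cos(2 * k * z0) * np.exp(1j * a2_true * (k - k.mean()) ** 2)
blurred  = np.abs(np.fft.ifft(fringe))
restored = compensate_dispersion(fringe, k, a2=a2_true, a3=0.0)
print(blurred.max(), restored.max())             # restored peak is taller and sharper
```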

Relevance:

30.00%

Publisher:

Abstract:

Recent developments in micro- and nanoscale 3D fabrication techniques have enabled the creation of materials with a controllable nanoarchitecture that can have structural features spanning five orders of magnitude, from tens of nanometers to millimeters. These fabrication methods, in conjunction with nanomaterial processing techniques, permit a nearly unbounded design space through which new combinations of nanomaterials and architecture can be realized. In the course of this work, we designed, fabricated, and mechanically analyzed a wide range of nanoarchitected materials in the form of nanolattices made from polymer, composite, and hollow ceramic beams. Using a combination of two-photon lithography and atomic layer deposition, we fabricated samples with periodic and hierarchical architectures spanning densities over four orders of magnitude, ρ = 0.3–300 kg/m³, and with features as small as 5 nm. Uniaxial compression and cyclic loading tests performed on different nanolattice topologies revealed a range of novel mechanical properties: the constituent nanoceramics used here have size-enhanced strengths that approach the theoretical limit of material strength; hollow aluminum oxide (Al2O3) nanolattices exhibited ductile-like deformation and recovered nearly completely after compression to 50% strain when their wall thicknesses were reduced below 20 nm, due to the activation of shell buckling; hierarchical nanolattices exhibited enhanced recoverability and a near-linear scaling of strength and stiffness with relative density, with E ∝ ρ^1.04 and σ_y ∝ ρ^1.17 for hollow Al2O3 samples; periodic rigid and non-rigid nanolattice topologies were tested and showed a nearly uniform scaling of strength and stiffness with relative density, marking a significant deviation from traditional theories on “bending”- and “stretching”-dominated cellular solids; and the mechanical behavior across all topologies was highly tunable and was observed to correlate strongly with the slenderness λ and the wall thickness-to-radius ratio t/a of the beams. These results demonstrate the potential of nanoarchitected materials to create new, highly tunable mechanical metamaterials with previously unattainable properties.
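
The reported scaling exponents (e.g. E ∝ ρ^1.04) are the slopes of power-law fits in log-log space; a minimal sketch of such a fit, using made-up data points rather than the study's measurements:

```python
import numpy as np

# Illustrative data only -- not measurements from the study.
rel_density   = np.array([0.001, 0.003, 0.01, 0.03, 0.1])
stiffness_gpa = np.array([0.002, 0.0065, 0.023, 0.072, 0.25])

# Fit E = C * rho^n by linear regression in log-log space.
n, logC = np.polyfit(np.log(rel_density), np.log(stiffness_gpa), 1)
print(f"scaling exponent n ≈ {n:.2f}, prefactor C ≈ {np.exp(logC):.3g} GPa")
```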

Relevance:

30.00%

Publisher:

Abstract:

Traditional engineering design methods are based on Simon's (1969) use of the concept of function, and as such collectively suffer from both theoretical and practical shortcomings. Researchers in the field of affordance-based design have borrowed from ecological psychology in an attempt to address the blind spots of function-based design, developing alternative ontologies and design processes. This dissertation presents function and affordance theory as both compatible and complementary. We first present a hybrid approach to design for technology change, followed by a reconciliation and integration of function and affordance ontologies for use in design. We explore the integration of a standard function-based design method with an affordance-based design method, and demonstrate how affordance theory can guide the early application of function-based design. Finally, we discuss the practical and philosophical ramifications of embracing affordance theory's roots in ecology and ecological psychology, and explore the insights and opportunities made possible by an ecological approach to engineering design. The primary contribution of this research is the development of an integrated ontology for describing and designing technological systems using both function- and affordance-based methods.

Relevance:

30.00%

Publisher:

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions; the performance of SoC systems can therefore be improved if hardware acceleration is used to accelerate the elements that incur performance overheads. The concepts presented in this study can be easily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform, and two types of hardware acceleration methods, central-bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, and hardware optimization techniques are used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
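
Profiling-driven acceleration pays off only in proportion to the hotspot's share of total runtime; a minimal Amdahl's-law estimate of the overall speedup (the numbers are illustrative, not the profiling results from this work):

```python
def amdahl_speedup(hotspot_fraction: float, accel_speedup: float) -> float:
    """Overall speedup when only the profiled hotspot is accelerated (Amdahl's law)."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accel_speedup)

# Illustrative: a hotspot taking 70% of runtime, accelerated 10x in hardware
print(f"overall speedup ≈ {amdahl_speedup(0.70, 10.0):.2f}x")
```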

Relevance:

30.00%

Publisher:

Abstract:

The establishment of support platforms for the development of a new culture in design education, aimed at exploiting both research and its results as a way of approaching the industrial community, challenges higher education institutions to rethink how they operate, divided between research on their own initiative or on demand and its usefulness and practical application. At the same time, how can design education be the engine that aggregates these frequently antagonistic interests? Polytechnic institutes are predisposed to collaboration and interdisciplinarity. In our course of Technology and Design of Furniture, the availability of a production unit, testing laboratories, and expertise in engineering, design and marketing encourages the development of a holistic project. In order to develop such knowledge, we adapt three important ways of thinking about designing interactions, influenced by the traditional approach: (1) identifying and understanding a design problem, i.e. a market need; (2) defining the design process and knowing what can be used for design education, i.e. opportunities for design education; and (3) ensuring the sustainability of this framework and the alignment of design projects with education in the same field. We explain our approach by arguing from the perspective of academic-enterprise experiences. This concept is proposed as a way to achieve those three ways of thinking in design education. A set of interaction attributes is then defined to explain how engineering and product design education can build meaningful relations with manufacturers, stakeholders and society in general. A final discussion presents the implications and benefits of this approach. The results suggest that, through academic-enterprise partnerships in design, several goals such as student motivation, product design innovation and potential for knowledge transfer to industry can be achieved.

Relevance:

30.00%

Publisher:

Abstract:

Cemeterial units are places of everyday social practices and worship, and the tomb a place where nostalgia can be externalized and the memory of the deceased revered. In Western societies we can find a category of artifacts meant to evoke the memory of, or honor, the dead. In this paper we discuss three examples of products that prompted a reflection on the concepts that gave rise to their forms, and that arguably fit into a new "material culture", in that they may break with the traditional system of codes and standards shared by societies and its manifestations in the physical creation of this category of products. This work offers a reflection on product design. What probably makes it special is the field in which it is located: the design of products for post-mortem remembrance. Usually made of granite or marble, these objects take the form of a plate or tablet, an open book or a rolled sheet; on one side they carry a photograph of the person to be honored, together with inscriptions. The design thinking inherent in this work sets on one side the intricate set of emotions that this type of product can generate, and on the other the more tangible components concerning form, function, and the object's interactions with users and with use environments. In defining the problem, differentiation, added value and durability were regarded as mandatory requirements and key objectives. The first two should be manifested in the various components and attributes of the product. The aesthetic and material/structural durability of the product necessarily imply the introduction of qualifying terms and quantitative weights, which positively influence the generation and evaluation of concepts based on a set of 10 principles for the project; these originated a matrix used as a tool to aid in designing the products. The concrete definition of a target audience was equally important. At this stage, the collaboration of experts in psychology and sociology, disciplines with a particular ability to understand individuals and social phenomena respectively, was crucial. It was concluded that a product designed to honor someone post mortem should abandon the more traditional habits and customs and focus on identifying new audiences. Although at present this can be considered a niche market, it is believed that it may grow in the future, as may interest in this type of product.
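
A weighted evaluation matrix of the kind described can be sketched as follows; the criteria names echo the stated mandatory requirements, but the weights, candidate concepts and scores are hypothetical, not the paper's 10 principles or calibrated weights:

```python
# Hypothetical weighted evaluation matrix (weights sum to 1, scores on a 1-5 scale).
criteria = {"differentiation": 0.4, "added value": 0.3, "durability": 0.3}

concepts = {
    "open book":    {"differentiation": 3, "added value": 4, "durability": 5},
    "rolled sheet": {"differentiation": 5, "added value": 3, "durability": 3},
    "tablet":       {"differentiation": 2, "added value": 3, "durability": 5},
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of the criterion scores for one concept."""
    return sum(criteria[c] * scores[c] for c in criteria)

# Rank the candidate concepts from best to worst
for name, scores in sorted(concepts.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name:12s} -> {weighted_score(scores):.2f}")
```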

Relevance:

30.00%

Publisher:

Abstract:

Traditional knowledge associated with genetic resources (TKaGRs) is acknowledged as a valuable resource. Its value derives from economic, social, cultural, and innovative uses, and places TK at the heart of competing interests between the indigenous peoples who hold it and depend on it for their survival, and the profitable industries that seek to exploit it in the global marketplace. The latter group seek, inter alia, to advance and maintain their global competitiveness by exploiting TKaGR leads in research and development activities connected with modern innovation. Biopiracy remains an issue of central concern to the developing world and has emerged in this context as a label for the inequity arising from the misappropriation of TKaGRs located in the South by commercial interests usually located in the North. Significant attention and resources are being channeled into global efforts to design and implement effective mechanisms to protect TKaGRs against biopiracy. The emergence and recent entry into force of the Nagoya Protocol offers the latest example of a concluded multilateral effort in this regard. The Nagoya Protocol, adopted on the platform of the Convention on Biological Diversity (CBD), establishes an open-ended international access and benefit-sharing (ABS) regime which comprises the Protocol as well as several complementary instruments. By focusing on the trans-regime nature of biopiracy, this thesis argues that the intellectual property (IP) system forms a central part of the problem of biopiracy, and likewise of the very efforts to implement solutions, including through the Nagoya Protocol. The ongoing related work within the World Intellectual Property Organization (WIPO), aimed at developing an international instrument (or a series of instruments) to address the effective protection of TK, constitutes an essential complementary process to the Nagoya Protocol and, as such, forms a fundamental element within the Nagoya Protocol's evolving ABS regime-complex. By adopting a third world approach to international law, this thesis draws central significance from its reconceptualization of biopiracy as a trans-regime concept. By construing the instrument(s) being negotiated within WIPO as a central component of the Nagoya Protocol, this dissertation's analysis highlights the importance of third world efforts to secure an IP-based reinforcement of the Protocol for the effective eradication of biopiracy.