932 results for Designers
Abstract:
Rotational moulding promises designers attractive economics and a low-pressure process. The benefits of rotational moulding are compared here with those of other manufacturing methods such as injection and blow moulding.
Abstract:
Mechanical swivel seat adaptations are a key aftermarket disability modification for small- to medium-sized passenger vehicles. However, the crashworthiness of these devices is currently unregulated, and the existing 20g dynamic sled testing approach is prohibitively expensive for prototype assessment purposes. In this paper, an alternative quasi-static test method for swivel seat assessment is presented, and two approaches (a free-body diagram and a multibody model), validated against published experimental data, are developed to determine the appropriate loading conditions to apply in the quasi-static testing. Results show that the two theoretical approaches can give similar estimates of the quasi-static loading conditions, and that this depends on the seatbelt configuration. Application of the approach to quasi-static testing of both conventional seats and those with integrated seat belts showed it to be successful and easy to apply. It is proposed that this method be used by swivel seat designers to assess new prototypes prior to final validation via the traditional 20g sled test.
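The free-body-diagram style of load estimation described above can be illustrated in outline. The sketch below is not the paper's method: the masses, the 20g pulse level, the single equivalent belt line and its angle are all hypothetical assumptions used only to show the kind of calculation involved.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def quasi_static_belt_load(occupant_mass_kg, seat_mass_kg,
                           pulse_g=20.0, belt_angle_deg=45.0):
    """Toy free-body-diagram estimate of the quasi-static load on a
    swivel-seat anchorage under an equivalent 20g deceleration pulse.

    The combined inertial force of occupant plus seat is resolved along
    a single equivalent belt line at `belt_angle_deg` to the horizontal.
    All numbers and the single-belt simplification are illustrative.
    """
    inertial_force = (occupant_mass_kg + seat_mass_kg) * pulse_g * G  # N
    theta = math.radians(belt_angle_deg)
    # Fore-aft inertial component reacted through the belt line:
    belt_tension = inertial_force / math.cos(theta)
    return inertial_force, belt_tension

# Hypothetical 76 kg occupant and 25 kg swivel seat:
f, t = quasi_static_belt_load(76.0, 25.0)
```

Under these assumptions the quasi-static test rig would apply the computed belt tension statically, in place of the dynamic sled pulse.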
Abstract:
Shape memory NiTi alloys have been used extensively for medical device applications such as orthopedic, dental, vascular and cardiovascular devices on account of their unique shape memory effect (SME) and super-elasticity (SE). Laser welding is found to be the most suitable method for fabricating NiTi-based medical components. However, the performance of laser-welded NiTi alloys in corrosive environments is not fully understood, and a specific focus on their corrosion fatigue behaviour is not evident in the literature. This study presents a comparison of the corrosion fatigue behaviour of laser-welded and bare NiTi alloys using a bending rotation fatigue (BRF) test integrated with a specifically designed corrosion cell. The testing environment was Hanks' solution (simulated body fluid) at 37.5 °C. Electrochemical impedance spectroscopy (EIS) measurements were carried out to monitor changes in corrosion resistance at different periods during the BRF test. Experiments indicate that the laser-welded NiTi alloy is more susceptible to corrosion fatigue attack than the bare NiTi alloy. This finding can serve as a benchmark for product designers and engineers when determining the factor of safety of NiTi medical devices fabricated using laser welding.
Abstract:
In the digital age, the hyperspace of virtual reality systems stands out as a new spatial concept creating a parallel realm to "real" space. Virtual reality influences one's experience of and interaction with architectural space. This "otherworld" invites criticism of the existing conception of space, time and body. Hyperspaces are relatively new to designers but not to filmmakers. Their cinematic representations help the comprehension of the outcomes of these new spaces. Visualisation of futuristic ideas on the big screen turns film into a medium for spatial experimentation. Creating a possible future, The Matrix (Andy and Larry Wachowski, 1999) takes the concept of hyperspace to a level not yet realised but imagined. With a critical gaze at the existing norms of architecture, the film creates new horizons in terms of space. In this context, this study introduces science fiction cinema as a discussion medium for understanding the potential of virtual reality systems for the architecture of the twenty-first century. As a "role model", cinema helps to better understand technological and spatial shifts. It acts as a vehicle for going beyond the spatial theories and designs of the twentieth century, and for defining the conception of space in contemporary architecture.
Abstract:
Hardware designers and engineers typically need to explore a multi-parametric design space in order to find the best configuration for their designs, using simulations that can take weeks to months to complete. For example, designers of special purpose chips need to explore parameters such as the optimal bitwidth and data representation. This is the case for the development of complex algorithms such as Low-Density Parity-Check (LDPC) decoders used in modern communication systems. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to graphics processing units (GPUs) and FPGAs. Depending on the simulation requirements, the ideal architecture to use can vary. In this paper, we propose a new design flow based on OpenCL, a unified multiplatform programming model, which accelerates LDPC decoding simulations, thereby significantly reducing architectural exploration and design time. OpenCL-based parallel kernels are used without modifications or code tuning on multicore CPUs, GPUs and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, for mapping the simulations onto FPGAs. To the best of our knowledge, this is the first time that a single, unmodified OpenCL code has been used to target these three different platforms. We show that, depending on the design parameters to be explored in the simulation, and on the dimension and phase of the design, the GPU or the FPGA may suit different purposes more conveniently, providing different acceleration factors. For example, although simulations can typically execute more than 3x faster on FPGAs than on GPUs, the overhead of circuit synthesis often outweighs the benefits of FPGA-accelerated execution.
Abstract:
The focus of this paper is to outline a method for consolidating and implementing the work on performance-based specification and testing. The first part of the paper reviews the mathematical significance of the variables used in common service life models. The aim is to identify a set of significant variables that influence the ingress of chloride ions into concrete. These variables are termed Key Performance Indicators (KPIs). This will also help to reduce the complexity of some of the service life models and make them more appealing to practising engineers. The second part of the paper presents a plan for developing a database based on these KPIs so that relationships can then be drawn between common concrete mix parameters and KPIs. This will assist designers in specifying a concrete with adequate performance for a particular environment. Collectively, this is referred to as the KPI-based approach, and the concluding remarks outline how the authors envisage the KPI theory relating to performance assessment and monitoring.
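Service life models for chloride ingress are commonly built on the error-function solution of Fick's second law, whose inputs (apparent diffusion coefficient, surface concentration) are natural KPI candidates. A minimal sketch of that standard model follows; the parameter values are illustrative assumptions, not data from the paper.

```python
import math

def chloride_concentration(x_mm, t_years, D_mm2_per_year, C_s, C_0=0.0):
    """Error-function solution of Fick's second law, the basis of many
    chloride-ingress service life models:

        C(x, t) = C_0 + (C_s - C_0) * (1 - erf(x / (2 * sqrt(D * t))))

    x_mm : depth below the exposed surface (mm)
    D    : apparent chloride diffusion coefficient (mm^2/year),
           a typical KPI candidate
    C_s  : surface chloride concentration (% by mass of binder)
    C_0  : initial chloride content of the concrete
    """
    arg = x_mm / (2.0 * math.sqrt(D_mm2_per_year * t_years))
    return C_0 + (C_s - C_0) * (1.0 - math.erf(arg))

# Illustrative numbers only: 50 mm cover, D = 30 mm^2/year, C_s = 0.5 %
c = chloride_concentration(50.0, 50.0, 30.0, 0.5)
```

A designer would compare the predicted concentration at the rebar depth against a chloride threshold for corrosion initiation, which is how such a model links mix parameters to a specified service life.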
Abstract:
A series of small-scale tests was undertaken to verify whether granular anchors could be used as a slope stabilisation technique. The nature of the material used and the resulting loading configuration are described here. The work confirms that the inclusion of anchors within a slope mass, irrespective of their number or orientation, significantly enhances the capacity of the slope and the ductility of the failure mode. The small-scale nature of this research did influence the observed capacities, but the overarching hypothesis was confirmed. A simple analysis method is proposed that allows designers to remediate natural or man-made slopes accurately using existing analytical methods for slope stability.
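To illustrate the kind of existing analytical method such an approach would plug into, the sketch below computes a textbook infinite-slope factor of safety with a crude extra resisting term for an anchor contribution. This is not the paper's proposed method; the formula choice, the way the anchor force is idealised, and all the numbers are assumptions.

```python
import math

def infinite_slope_fos(c_kpa, phi_deg, gamma_kn_m3, depth_m, beta_deg,
                       anchor_kpa=0.0):
    """Textbook infinite-slope factor of safety (dry slope) with an
    optional stabilising stress from granular anchors, idealised here
    as an additional resisting shear stress on the slip plane.

        FoS = (c' + sigma_n * tan(phi') + anchor) / tau
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma_kn_m3 * depth_m * math.cos(beta) ** 2          # kPa
    tau = gamma_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)  # kPa
    return (c_kpa + sigma_n * math.tan(phi) + anchor_kpa) / tau

# Illustrative slope: c' = 5 kPa, phi' = 30 deg, 18 kN/m^3 soil,
# 3 m deep slip plane, 25 deg slope angle
base_fos = infinite_slope_fos(5.0, 30.0, 18.0, 3.0, 25.0)
```

Under this idealisation, any positive anchor contribution raises the factor of safety, which is the qualitative effect the small-scale tests confirm.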
Abstract:
This paper explores the production and post-production techniques and tensions in designing sound for film. Considering the films of Lucrecia Martel and Sofia Coppola, amongst others, Greene and Yang will discuss how the soundtrack takes on a primary role in these films and becomes a medium for symbolism, reflection, characterisation, as well as storytelling. There will be a close examination of the processes involved in creating character-orientated soundscapes. These processes are sensitive to the effects sound has on an audience. Exploring how these filmmakers (with their sound teams) utilise the listening experience, including attention to point of audition and sound perception, this paper will critically unpick how such creative decisions are arrived at during various stages of the production process. Outlining the use of diegetic and non-diegetic sound and the potential musicality of sound effect design, issues of reverberation, noise and intent are discussed to highlight the sonic framing of these creative teams. Greene will approach these soundtracks from a production/post-production perspective, while Yang will explore the composer’s/designer’s ear.
Abstract:
Identifying responsibility for classes in the object-oriented software design phase is a crucial task. This paper proposes an approach for producing high-quality, robust behavioural diagrams (e.g. sequence diagrams) through Class Responsibility Assignment (CRA). GRASP, or General Responsibility Assignment Software Pattern (or Principle), was used to direct the CRA process when deriving behavioural diagrams. A set of tools to support CRA was developed to provide designers and developers with a cognitive toolkit for analysing and designing object-oriented software. The tool developed is called Use Case Specification to Sequence Diagrams (UC2SD). UC2SD uses a new approach for developing Unified Modelling Language (UML) software designs from natural language, making use of a meta-domain oriented ontology, well-established software design principles and established Natural Language Processing (NLP) tools. UC2SD generates well-formed UML sequence diagrams as output.
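The core GRASP heuristic that directs CRA, Information Expert, can be sketched in a few lines: assign a responsibility to the candidate class whose attributes cover most of the information the responsibility needs. The classes and scoring below are hypothetical illustrations, not UC2SD internals.

```python
def assign_responsibility(needed_info, candidate_classes):
    """Toy GRASP 'Information Expert' heuristic: give the responsibility
    to the class whose attributes cover the most of the information the
    responsibility needs.

    needed_info       -- set of information items the responsibility uses
    candidate_classes -- dict mapping class name -> set of attributes
    """
    return max(candidate_classes,
               key=lambda cls: len(needed_info & candidate_classes[cls]))

# Hypothetical domain model for an ordering system:
classes = {
    "Order":    {"line_items", "customer_id", "date"},
    "LineItem": {"quantity", "unit_price"},
    "Customer": {"customer_id", "name"},
}
# "Compute line total" needs quantity and unit price, so the expert is:
expert = assign_responsibility({"quantity", "unit_price"}, classes)
```

In a tool like UC2SD, an assignment of this kind would determine which lifeline receives the corresponding message in the generated sequence diagram.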
Abstract:
One of the many definitions of inclusive design is that it is a user-led approach to design. To date its focus has been on 'critical' users, in particular disabled people. As such, there is pressure to design environments that meet the often urgent and complex demands of these users. Designers, uncertain of their knowledge, rely heavily on user input and guidance, often resulting in designs that are 'solution' driven (rather than solution seeking) and short term; users focus on what they need, not what they might need. This paper argues that design needs to reclaim an equal presence within inclusive design. It proposes that the 'weakness' of design lies in the uneasy and at times conflicting relationship between ethics and aesthetics. The paper itself is constructed around a dialogue between two academics, one concerned with critical user needs, the other with aesthetics, but both directed towards the support of design quality.
Abstract:
The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The exploitation of diverse target architectures is typically associated with developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, in order to introduce FPGAs as a potential platform for efficiently executing simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes that range from short/medium-length (e.g., 8,000-bit) to long (e.g., 64,800-bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated, and on the dimension and phase of the design, the GPU or FPGA may suit different purposes more conveniently, thus providing different acceleration factors over conventional multicore CPUs.
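The data-parallel kernel at the heart of such LDPC decoding simulations is the node-update computation. A scalar Python sketch of the standard min-sum check-node update is shown below for illustration; it is my own toy version, not the authors' OpenCL code, which would process thousands of nodes per work-group.

```python
def check_node_min_sum(llrs):
    """Min-sum check-node update used in LDPC decoding.

    For each edge i, the outgoing message is the product of the signs
    of all *other* incoming log-likelihood ratios (LLRs) times the
    minimum magnitude among them.
    """
    out = []
    for i in range(len(llrs)):
        others = llrs[:i] + llrs[i + 1:]
        sign = 1.0
        for v in others:
            if v < 0:
                sign = -sign
        out.append(sign * min(abs(v) for v in others))
    return out

# Four incoming edge messages (LLRs) at one check node:
msgs = check_node_min_sum([2.0, -1.5, 0.5, 3.0])
```

Because every check node performs this same independent computation, the update maps naturally onto one OpenCL work-item per node, which is what makes a single kernel portable across CPU, GPU and FPGA back-ends.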
Abstract:
It is acknowledged that one of the consequences of the ageing process is cognitive decline, which leads to an increase in the incidence of illnesses such as dementia. This has become ever more relevant due to the projected increase in the ageing demographic. Dementia affects visuo-spatial perception, causing difficulty with wayfinding, even during the early stages of the disease. The literature widely recognises the physical environment’s role in alleviating symptoms of dementia and improving quality of life for residents. It also identifies the lack of available housing options for older people with dementia and consequently the current stock is ill-equipped to provide adequate support.
Recent statistics indicate that 80% of those residing in nursing or residential care homes have some form of dementia or severe memory problems. The shift towards institutional care settings and the need for specialist support and care place a greater impetus on a person-centred approach to tackling issues related to wayfinding and dementia.
This thesis therefore aims to improve design for dementia in nursing and residential care settings in the context of Northern Ireland. It seeks to provide a better understanding of how people with dementia experience the physical environment and to highlight features of the design that assist with wayfinding. Currently there are limited guidelines on design for dementia, and many of those that exist are theoretical, anecdotal and not definitive. Hence greater verification to address the less recognised design issues is required. This is intended ultimately to improve quality of life, wellbeing and independence, and to uphold the dignity of people with dementia living in nursing or residential care homes.
The research design uses a mixed methods approach. Thorough preparation and consideration of ethical issues informed the methodology. The various facets were also trialled and piloted to identify any ethical, technological, methodological, data collection and analysis issues, and the protocol was then amended to improve or resolve them. Initially, a questionnaire based on leading design recommendations was conducted with home managers. Semi-structured interviews were developed from this and conducted with staff and residents' next of kin. An evidence-based approach was used to design a study employing ethnographic methods, including a wayfinding task. This followed a repeated measures design intended to actively engage residents with dementia in the research. Complementary to the wayfinding task, conversational and semi-structured interviews were used to promote dialogue and direct responses with the person with dementia. In addition, Space Syntax methodologies were used to examine the physical properties of the architectural layout; these were then cross-examined with interview responses and data from the wayfinding tasks.
A number of plan typologies were identified and matched to the types of decision points that had to be negotiated during the walks. The empirical work enabled the synthesis of environmental features which support wayfinding.
Results indicate that particular environmental features are associated with improved performance on the wayfinding tasks. By enhancing design for dementia, through identifying the attributes, challenges with wayfinding may be overcome and the benefits of the physical environment can be seen to promote wellbeing.
The environmental features highlighted by the project can be used to inform guidelines, thus adding to existing knowledge. Future work would involve the dissemination of this information and its potential development into design standards or regulations which champion design for dementia. These would increase awareness among designers and stakeholders undertaking new projects, extensions or refurbishments.
A person-centred, evidence-based design was emphasised throughout the project which guaranteed an in-depth study. There were limitations due to the available resources, time and funding. Future research would involve testing the identified environmental features within a specific environment to enable measured observation of improvements.
Abstract:
DRAM technology faces density and power challenges in increasing capacity because of the limitations of physical cell design. To overcome these limitations, system designers are exploring alternative solutions that combine DRAM with emerging NVRAM technologies. Previous work on heterogeneous memories focuses mainly on two system designs: PCache, a hierarchical, inclusive memory system, and HRank, a flat, non-inclusive memory system. We demonstrate that neither of these designs can universally achieve high performance and energy efficiency across a suite of HPC workloads. In this work, we investigate the impact of a number of multilevel memory designs on the performance, power, and energy consumption of applications. To achieve this goal and overcome the limited number of tools available for studying heterogeneous memories, we created HMsim, an infrastructure that enables n-level, heterogeneous memory studies by leveraging existing memory simulators. We then propose HpMC, a new memory controller design that combines the best aspects of existing management policies to improve performance and energy. Our energy-aware memory management system dynamically switches between PCache and HRank based on the temporal locality of applications. Our results show that HpMC reduces energy consumption by 13% to 45% compared with PCache and HRank, while providing the same bandwidth and higher capacity as a conventional DRAM system.
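The switching idea can be illustrated with a toy controller that estimates temporal locality from the recent address stream and selects the hierarchical mode under high locality and the flat mode otherwise. The locality metric (fraction of recent accesses that reuse a recently seen address) and the threshold are illustrative assumptions, not the actual HpMC design.

```python
from collections import deque

class HybridPolicySketch:
    """Toy sketch of an HpMC-style controller that switches between a
    hierarchical mode ("PCache") and a flat mode ("HRank") based on the
    temporal locality of the recent memory access stream."""

    def __init__(self, window=64, threshold=0.5):
        self.recent = deque(maxlen=window)  # recently seen addresses
        self.hits = deque(maxlen=window)    # 1 if access reused one
        self.threshold = threshold

    def access(self, addr):
        self.hits.append(1 if addr in self.recent else 0)
        self.recent.append(addr)

    def mode(self):
        # High temporal locality -> DRAM-as-cache pays off (PCache);
        # streaming access -> flat placement avoids migration (HRank).
        if not self.hits:
            return "PCache"
        locality = sum(self.hits) / len(self.hits)
        return "PCache" if locality >= self.threshold else "HRank"

ctrl = HybridPolicySketch(window=8)
for addr in [0, 1] * 10:      # heavy reuse of two addresses
    ctrl.access(addr)
```

A real controller would make this decision on hardware counters rather than address sets, but the sketch captures the policy-switching behaviour the abstract describes.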
Abstract:
As the designers of modern automotive turbochargers strive to increase map width and lower the mass flow rate at which compressor surge occurs, the recirculating flows at the impeller inlet are becoming a much more relevant aerodynamic feature. Compressors with relatively large map widths tend to have very large recirculating regions at the inlet when operating close to surge; these regions greatly affect the expected performance of the compressor.
This study analyses the inlet recirculation region numerically using several modern automotive turbocharger centrifugal compressors. Using 3D Computational Fluid Dynamics (CFD) and a single passage model, the point at which the recirculating flow begins to develop and the rate at which it grows are investigated. All numerical modelling has been validated using measurements taken from hot gas stand tests for all compressor stages. The paper improves upon an existing correlation between the rate of development of the recirculating region and the compressor stage, which is supported by results from the numerical analysis.
Abstract:
Prescribing tasks, which involve pharmacological knowledge, clinical decision-making and practical skill, take place within unpredictable social environments and involve interactions within and between endlessly changing health care teams. Despite this, curriculum designers commonly assume them to be simple to learn and perform. This research used mixed methods to explore how undergraduate medical students learn to prescribe in the 'real world'. It was informed by cognitive psychology, sociocultural theory, and systems thinking. We found that learning to prescribe occurs as a dynamic series of socially negotiated interactions within and between individuals, communities and environments. As well as a thematic analysis, we developed a framework of three conceptual spaces in which learning opportunities for prescribing occur. This illustrates a complex systems view of prescribing education and defines three major system components: the "social space", where the environmental conditions influence or bring about a learning experience; the "process space", describing what happens during the learning experience; and the intra-personal "cognitive space", where the learner may develop aspects of prescribing expertise. This conceptualisation broadens the scope of inquiry of prescribing education research by highlighting the complex interplay between individual and social dimensions of learning. This perspective is also likely to be relevant to students' learning of other clinical competencies.