49 results for "Desktop simulator"
The influence of wear paths produced by hip replacement patients during normal walking on wear rates
Abstract:
Variation in wear paths is known to greatly affect wear rates in vitro, with multidirectional paths producing much greater wear than unidirectional paths. This study investigated the relationship between multidirectional motion at the hip joint, as measured by aspect ratio, sliding distance, and wear rate for 164 hip replacements. Kinematic input from three-dimensional gait analysis was used to determine the wear paths. Activity cycles were determined for a subgroup of 100 patients using a pedometer study, and the relationship between annual sliding distance and wear rate was analyzed. Poor correlations were found between both aspect ratio and sliding distance and wear rate for the larger group and between annual sliding distance and wear rate for the subgroup. However, patients who experienced a wear rate <0.08 mm/year showed a strong positive correlation between the combination of sliding distance, activity levels, and aspect ratio and wear rate (adjusted r² = 55.4%). This group may represent those patients who experience conditions that most closely match those that prevail in simulator and laboratory tests. Although the shape of wear paths, their sliding distance, and the number of articulation cycles at the hip joint affect wear rates in simulator studies, this relationship was not seen in this clinical study. Other factors such as lubrication, loading conditions and roughness of the femoral head may influence the wear rate.
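A multiple-regression fit reporting an adjusted r², as in the subgroup analysis above, can be sketched as follows. The data here are synthetic stand-ins (uniform predictors, invented coefficients), not the study's patient measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: per-patient sliding distance, activity level and
# wear-path aspect ratio (columns of X) against wear rate y (mm/year).
n = 100
X = rng.uniform(size=(n, 3))
y = 0.02 * X[:, 0] + 0.03 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0.0, 0.01, n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta

# Adjusted R^2 penalises R^2 for the number of predictors p.
p = X.shape[1]
r2 = 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
print(f"R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")
```

The adjustment matters here because adding predictors (activity, aspect ratio) to a small clinical sample inflates plain R² even when the extra terms carry no signal.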
Abstract:
The present paper demonstrates the suitability of an artificial neural network (ANN) for modelling of a FinFET in nano-circuit simulation. The FinFET used in this work is designed using careful engineering of the source-drain extension, which simultaneously improves maximum frequency of oscillation f_max, because of lower gate-to-drain capacitance, and intrinsic gain A_V0 = g_m/g_ds, due to lower output conductance g_ds. The framework for the ANN-based FinFET model is a common-source equivalent circuit, where the dependence of intrinsic capacitances, resistances and dc drain current I_d on drain-source voltage V_ds and gate-source voltage V_gs is derived by a simple two-layered neural network architecture. All extrinsic components of the FinFET model are treated as bias independent. The model was implemented in a circuit simulator and verified by its ability to generate accurate responses to excitations not used during training. The model was used to design a low-noise amplifier. At low power (J_ds ≈ 10 μA/μm), improvement was observed in both third-order intercept IIP3 (≈10 dBm) and intrinsic gain A_V0 (≈20 dB), compared to a bulk MOSFET with similar effective channel length. This is attributed to a higher ratio of the first-order to third-order derivative of I_d with respect to gate voltage, and lower g_ds, in the FinFET compared to the bulk MOSFET. Copyright © 2009 John Wiley & Sons, Ltd.
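The two-layer structure described above (bias inputs, one tanh hidden layer, linear output for I_d) can be sketched as follows. The target I-V surface, layer size and training settings are illustrative assumptions, not the paper's extracted device data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: a smooth, MOSFET-like drain-current surface I_d(V_gs, V_ds).
# This is NOT the paper's FinFET data -- just a stand-in so the two-layer
# structure (inputs -> tanh hidden layer -> linear output) is concrete.
def toy_id(vgs, vds):
    return np.maximum(vgs - 0.3, 0.0) ** 2 * np.tanh(5.0 * vds)

vgs = rng.uniform(0.0, 1.0, 400)
vds = rng.uniform(0.0, 1.0, 400)
X = np.column_stack([vgs, vds])        # (N, 2) bias-point inputs
y = toy_id(vgs, vds)[:, None]          # (N, 1) drain current

# Two-layer network trained by full-batch gradient descent on squared error.
H, lr = 12, 0.05
W1 = rng.normal(0.0, 1.0, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)           # hidden activations
    err = h @ W2 + b2 - y              # prediction error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h ** 2) # backpropagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE = {mse:.2e}")
```

Once trained, such a network is cheap to evaluate inside a circuit simulator, which is the property the paper exploits.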
Abstract:
We study quantum information flow in a model comprised of a trapped impurity qubit immersed in a Bose-Einstein-condensed reservoir. We demonstrate how information flux between the qubit and the condensate can be manipulated by engineering the ultracold reservoir within experimentally realistic limits. We show that this system undergoes a transition from Markovian to non-Markovian dynamics, which can be controlled by changing key parameters such as the condensate scattering length. In this way, one can realize a quantum simulator of both Markovian and non-Markovian open quantum systems, the latter ones being characterized by a reverse flow of information from the background gas (reservoir) to the impurity (system).
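The Markovian-to-non-Markovian transition described above is commonly witnessed by information backflow. A minimal sketch using the trace-distance (Breuer-Laine-Piilo) criterion for a purely dephasing qubit follows; the two decay functions are invented examples, not the paper's BEC-derived ones:

```python
import numpy as np

# The BLP witness tracks the trace distance between a pair of system states;
# any increase signals information flowing back from the reservoir. For a
# purely dephasing qubit prepared in |+> or |->, the trace distance equals
# |f(t)|, the decoherence function.
t = np.linspace(0.0, 10.0, 2001)
f_markov = np.exp(-0.5 * t)                               # monotone decay
f_nonmarkov = np.exp(-0.5 * t) * np.abs(np.cos(2.0 * t))  # decay with revivals

def backflow(trace_dist):
    """BLP-style measure: total increase of the trace distance over time."""
    dD = np.diff(trace_dist)
    return float(np.sum(dD[dD > 0.0]))

print("Markovian backflow    :", backflow(f_markov))      # monotone -> 0
print("non-Markovian backflow:", backflow(f_nonmarkov))   # revivals -> > 0
```

In the paper's setting, tuning the condensate scattering length changes the shape of the effective decoherence function, moving the dynamics between these two regimes.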
Abstract:
A model system, HOOFS (Hierarchical Object Orientated Foraging Simulator), has been developed to study foraging by animals in a complex environment. The model is implemented using an individual-based object-orientated structure. Different species of animals inherit their general properties from a generic animal object, which inherits from the basic dynamic object class. Each dynamic object is a separate program thread under the control of a central scheduler. The environment is described as a map of small hexagonal patches, each with their own level of resources and a patch-specific rate of resource replenishment. Each group of seven patches (0th order) is grouped into a 1st-order super-patch, with seven nth-order super-patches making up an (n+1)th-order super-patch, for n up to a specified value. At any time each animal is associated with a single patch. Patch choice is made by combining the information on the resources available within different order patches and super-patches along with information on the spatial location of other animals. The degree of sociality of an animal is defined in terms of optimal spacing from other animals and by the weighting of patch choice based on social factors relative to that based on food availability. Information, available to each animal, about patch resources diminishes with distance from that patch. The model has been used to demonstrate that social interactions can constrain patch choice and result in a short-term reduction of intake and a greater degree of variability in the level of resources in patches. We used the model to show that the effect of this variability on the animal's intake depends on the pattern of patch replenishment. © 1998 Elsevier Science B.V. All rights reserved.
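The sevenfold patch grouping can be made concrete with a short sketch. Class and function names, patch counts and resource values here are illustrative, not taken from HOOFS itself:

```python
import random

random.seed(42)

class Patch:
    """One hexagonal cell with its own resource level and replenishment rate."""
    def __init__(self, resources, replenish_rate):
        self.resources = resources
        self.replenish_rate = replenish_rate

    def replenish(self):
        self.resources += self.replenish_rate

def build_hierarchy(patches, orders):
    """Group units in sevens: seven order-n units form one order-(n+1) super-patch.

    Returns levels[k] = list of order-k units, each a flat list of Patches.
    """
    levels = [[[p] for p in patches]]          # order 0: single patches
    for _ in range(orders):
        prev = levels[-1]
        levels.append([sum(prev[i:i + 7], []) for i in range(0, len(prev), 7)])
    return levels

def total_resources(unit):
    """An animal sees aggregated resource information for a whole super-patch."""
    return sum(p.resources for p in unit)

patches = [Patch(random.uniform(0.0, 1.0), 0.01) for _ in range(7 ** 3)]
levels = build_hierarchy(patches, orders=3)
print([len(lv) for lv in levels])              # 343 patches -> 49 -> 7 -> 1
print(f"order-3 resources: {total_resources(levels[3][0]):.2f}")
```

An animal choosing a patch would then score units at each order, trading aggregated resource totals against the positions of other animals, with the resource information degraded by distance as the abstract describes.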
Abstract:
The efficient development of multi-threaded software has, for many years, been an unsolved problem in computer science. Finding a solution to this problem has become urgent with the advent of multi-core processors. Furthermore, the problem has become more complicated because multi-cores are everywhere (desktop, laptop, embedded system). As such, they execute generic programs which exhibit very different characteristics than the scientific applications that have been the focus of parallel computing in the past.

Implicitly parallel programming is an approach to parallel programming that promises high productivity and efficiency and rules out synchronization errors and race conditions by design. There are two main ingredients to implicitly parallel programming: (i) a conventional sequential programming language that is extended with annotations that describe the semantics of the program and (ii) an automatic parallelizing compiler that uses the annotations to increase the degree of parallelization.

It is extremely important that the annotations and the automatic parallelizing compiler are designed with the target application domain in mind. In this paper, we discuss the Paralax approach to implicitly parallel programming and we review how the annotations and the compiler design help to successfully parallelize generic programs. We evaluate Paralax on SPECint benchmarks, which are a model for such programs, and demonstrate scalable speedups, up to a factor of 6 on 8 cores.
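The two ingredients, semantic annotations plus a tool that exploits them, can be loosely illustrated in Python. Paralax itself targets sequential C-like code with a parallelizing compiler; the `@pure` decorator below is an invented stand-in for its annotations, not its actual syntax:

```python
from concurrent.futures import ThreadPoolExecutor

# An invented @pure "annotation": the programmer promises the call has no
# side effects, so independent calls may run in any order or in parallel.
def pure(fn):
    fn.is_pure = True          # a promise supplied by the programmer
    return fn

@pure
def work(x):
    return x * x + 1

def parallel_map(fn, items):
    """Parallelise only when the annotation licenses it; else stay sequential."""
    if getattr(fn, "is_pure", False):
        with ThreadPoolExecutor() as pool:
            return list(pool.map(fn, items))
    return [fn(x) for x in items]

print(parallel_map(work, range(8)))   # -> [1, 2, 5, 10, 17, 26, 37, 50]
```

The point of the sketch is the division of labour: the annotation carries semantic knowledge the tool cannot infer, and the tool, not the programmer, decides how to exploit it, which is what rules out hand-written synchronization errors.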
Abstract:
OBJECTIVES: We assessed the effectiveness of transfer of training (ToT) from VR laparoscopic simulation training in 2 studies. In the second study, we also assessed the transfer effectiveness ratio (TER). ToT is a detectable performance improvement between equivalent groups, and TER is the observed percentage performance difference between 2 matched groups carrying out the same task but with 1 group pretrained on VR simulation. Concordance between simulated and in-vivo procedure performance was also assessed. DESIGN: Prospective, randomized, and blinded. PARTICIPANTS: In Study 1, experienced laparoscopic surgeons (n = 195) and, in Study 2, laparoscopic novices (n = 30) were randomized to either train on VR simulation before completing an equivalent real-world task or complete the real-world task only. RESULTS: Experienced laparoscopic surgeons and novices who trained on the simulator performed significantly better than their controls, thus demonstrating ToT. Their performance showed a TER between 7% and 42% from the virtual to the real tasks. Simulation training had its greatest impact on procedural error reduction in both studies (32%-42%). The correlation observed between the VR and real-world task performance was r > 0.96 (Study 2). CONCLUSIONS: VR simulation training offers a powerful and effective platform for training safer skills.
Abstract:
The document draws largely on the results of research carried out by Hugh McNally and Dominic Morris of McNally Morris Architects and Keith McAllister of Queen's University Belfast between 2012 and 2013. The objective of the study was to obtain a greater understanding of the impact that architecture and the built environment can have on people with autism spectrum disorder (ASD). The investigation centred on parents of young children with ASD, in the belief that they are most likely to have an intimate knowledge of the issues that affect their children and are relatively well positioned to communicate those issues.

The study comprised a number of components:

- focus group discussions with parents of children with ASD
- a postal questionnaire completed by parents of children with ASD
- a comprehensive desktop study of contemporary research into the relationship between ASD and aspects of the built environment.

Social stories are then used to help illustrate the world of a child with ASD to the reader and to identify a series of potential difficulties for the pupil with ASD in a primary school setting. Design considerations and mitigating measures are then proposed for each difficulty.

The intention is that the document will raise awareness of some of the issues affecting primary school children with ASD and generate discourse among those whose task it is to provide an appropriate learning environment for all children. This includes teachers, health professionals, architects, parents, carers, school boards, government bodies and those with ASD themselves.

While this document uses the primary school as a lens through which to view some of the issues associated with ASD, it is the authors' contention that the school can be seen as a microcosm of the wider world and that lessons taken from the learning environment can be applied elsewhere. The authors therefore hope that the document will help raise awareness of the myriad issues for those with ASD that are embedded in the vast landscape of urban configurations and building types making up the spatial framework of our society.
Abstract:
We study the entanglement of two impurity qubits immersed in a Bose-Einstein condensate (BEC) reservoir. This open-quantum-system model allows for interpolation between a common dephasing scenario and an independent dephasing scenario by modifying the wavelength of the superlattice superposed on the BEC; we examine how this influences the dynamical properties of the impurities. We demonstrate the existence of rich dynamics corresponding to different values of reservoir parameters, including phenomena such as entanglement trapping, revivals of entanglement, and entanglement generation. In the spirit of reservoir engineering, we present the optimal BEC parameters for entanglement generation and trapping, showing the key role of the ultracold-gas interactions. Copyright © EPLA, 2013
Abstract:
We perform an extensive study of the properties of global quantum correlations in finite-size one-dimensional quantum spin models at finite temperature. By adopting a recently proposed measure for global quantum correlations (Rulli and Sarandy 2011 Phys. Rev. A 84 042109), called global discord, we show that critical points can be neatly detected even for many-body systems that are not in their ground state. We consider the transverse Ising model, the cluster-Ising model where three-body couplings compete with an Ising-like interaction, and the nearest-neighbor XX Hamiltonian in transverse magnetic field. These models embody our canonical examples showing the sensitivity of global quantum discord close to criticality. For the Ising model, we find a universal scaling of global discord with the critical exponents pertaining to the Ising universality class.
Abstract:
Modern business practices in engineering are increasingly turning to post-manufacture service provision in an attempt to generate additional revenue streams and ensure commercial sustainability. Maintainability has always been a consideration during the design process, but in the past it has generally been considered of tertiary importance behind manufacturability and primary product function in terms of design priorities. The need to draw whole-life considerations into concurrent engineering (CE) practice has encouraged companies to address issues such as maintenance earlier in the design process, giving equal importance to all aspects of the product lifecycle. The consideration of design for maintainability (DFM) early in the design process has the potential to significantly reduce maintenance costs and improve overall running efficiencies as well as safety levels. However, a lack of simulation tools still hinders the adaptation of CE to include practical elements of design, and therefore further research is required to develop methods by which hands-on activities such as maintenance can be fully assessed and optimised as concepts develop. Virtual reality (VR) has the potential to address this issue, but the application of these traditionally high-cost systems can require complex infrastructure, and their use has typically focused on aesthetic aspects of mature designs. This paper examines the application of cost-effective VR technology to the rapid assessment of aircraft interior inspection during conceptual design. It focuses on the integration of VR hardware with a typical desktop engineering system and examines the challenges of data transfer, graphics quality and the development of practical user functions within the VR environment.

Conclusions drawn to date indicate that the system has the potential to improve maintenance planning through the provision of a usable environment for inspection which is available as soon as preliminary structural models are generated as part of the conceptual design process. Challenges still exist in the efficient transfer of data between the CAD and VR environments, as well as in the quantification of any benefits that result from the proposed approach. The results of this research will help to improve product maintainability, reduce product development cycle times and lower maintenance costs.
Abstract:
The commonly used British Standard constant head triaxial permeability test for testing of fine-grained soils is relatively time consuming. A reduction in the required time for soil permeability testing would provide potential cost savings to the construction industry, particularly in the construction quality assurance of landfill clay liners. The purpose of this paper is to evaluate an alternative approach of measuring the permeability of fine-grained soils, benefiting from accelerated time scaling for seepage flow when testing specimens in elevated gravity conditions provided by a centrifuge. As part of the investigation, an apparatus was designed and produced to measure water flow through soil samples under conditions of elevated gravitational acceleration using a small desktop laboratory centrifuge. A membrane was used to hydrostatically confine the test sample. A miniature data acquisition system was designed and incorporated in the apparatus to monitor and record changes in head and flow throughout the tests. Under enhanced gravity in the centrifuge, the flow through the sample was under 'variable head' conditions, as opposed to the 'constant head' conditions of the classic constant head permeability tests conducted at 1 g. A mathematical model was developed for analysis of Darcy's coefficient of permeability under conditions of elevated gravitational acceleration and verified using the results obtained. The test data compare well with the results on analogous samples obtained using the classical British Standard constant head permeability tests.
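The 'variable head' analysis can be sketched as follows, assuming the common centrifuge-modelling result that the N-fold amplified driving head at N g introduces a factor 1/N into the classic falling-head formula. The function name and sample dimensions are illustrative, not the paper's apparatus:

```python
import math

def falling_head_k(a, A, L, h1, h2, dt, N=1.0):
    """Darcy coefficient of permeability k (m/s) from a falling-head test.

    a  -- standpipe cross-sectional area (m^2)
    A  -- sample cross-sectional area (m^2)
    L  -- sample length (m)
    h1 -- head at the start of the interval (m); h2 -- head at the end (m)
    dt -- elapsed time (s)
    N  -- g-level in the centrifuge (N = 1 recovers the classic 1 g formula)
    """
    return (a * L) / (A * N * dt) * math.log(h1 / h2)

# Illustrative numbers only (not the paper's apparatus): at 100 g the same
# head drop is reached in 1/100 of the time needed at 1 g.
k = falling_head_k(a=5e-5, A=2e-3, L=0.05, h1=0.40, h2=0.25, dt=360.0, N=100.0)
print(f"k = {k:.2e} m/s")
```

The time saving is visible directly in the formula: the product N·dt is what fixes a given head drop, so a hundredfold increase in g-level cuts the test duration by the same factor.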
Abstract:
Variations in the phase angle difference between a remote 11 kV connected wind farm and the centre of Belfast during a typical working day are investigated in the paper. The results obtained using phasor measurement units (PMUs) are compared with the data generated using a PSS/E simulator configured to model the Northern Ireland network. The study investigates the effect of changes in the load demand and the wind farm output power on the phase angles at various locations on the network. The paper finally describes how a major system disturbance on the All-Ireland network was monitored and analysed using PMUs located at Queen's University Belfast and University College Dublin. © 2007 IEEE.
Abstract:
The simulation of open quantum dynamics has recently allowed the direct investigation of the features of system-environment interaction and of their consequences on the evolution of a quantum system. Such interaction threatens the quantum properties of the system, spoiling them and causing the phenomenon of decoherence. Sometimes, however, a coherent exchange of information takes place between system and environment, memory effects arise and the dynamics of the system becomes non-Markovian. Here we report the experimental realisation of a non-Markovian process where system and environment are coupled through a simulated transverse Ising model. By engineering the evolution in a photonic quantum simulator, we demonstrate the role played by system-environment correlations in the emergence of memory effects.
Abstract:
Background: Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take more than 2 h to complete using sscMap, a popular Java application that runs on standard CPUs (central processing units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (graphics processing units) to greatly reduce processing times for connectivity mapping.

Results: cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU-assisted performance and CPU executions as the computational load increases for high-accuracy evaluation of statistical significance.

Conclusion: Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy-duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
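The scoring step that cudaMap accelerates can be illustrated with a simplified signed-rank connection score (not sscMap's exact statistic). Gene counts, signature size and the permutation test below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

def connection_score(ref_ranks, sig_genes, sig_signs):
    """Simplified signed-rank connection score in [-1, 1].

    ref_ranks -- centred rank of every gene in the reference profile
                 (most up-regulated near +G/2, most down-regulated near -G/2)
    sig_genes -- indices of the signature genes
    sig_signs -- +1 for up-regulated, -1 for down-regulated signature genes
    """
    raw = np.sum(sig_signs * ref_ranks[sig_genes])
    m = len(sig_genes)
    max_raw = np.sum(np.sort(np.abs(ref_ranks))[-m:])  # best attainable |raw|
    return raw / max_raw

G = 10_000
ref_ranks = rng.permutation(G) - (G - 1) / 2.0     # centred ranks of G genes
sig_genes = rng.choice(G, size=30, replace=False)
sig_signs = np.where(ref_ranks[sig_genes] > 0, 1.0, -1.0)  # concordant signature

obs = connection_score(ref_ranks, sig_genes, sig_signs)
print(f"concordant signature score: {obs:.3f}")

# Empirical significance: compare against random signatures of the same size.
null = np.array([connection_score(ref_ranks,
                                  rng.choice(G, size=30, replace=False),
                                  rng.choice([-1.0, 1.0], size=30))
                 for _ in range(1000)])
print(f"empirical p <= {(np.sum(null >= obs) + 1) / (len(null) + 1):.4f}")
```

The permutation loop is the expensive part: repeating it across many reference profiles and many signatures, at permutation counts high enough for accurate significance estimates, is exactly the embarrassingly parallel workload that maps well onto a GPU.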
Abstract:
An unusual application of hydrological understanding to a police search is described. The lacustrine search for a missing person produced reports of bottom-water currents in the lake and contradictory indications from cadaver dogs. A hydrological model of the area was developed using pre-existing information from side-scan sonar, a desktop hydrogeological study and deployment of water-penetrating radar (WPR). These provided a hydrological theory for the initial search involving subaqueous groundwater flow, focused on an area of bedrock surrounded by sediment on the lake floor. The work shows the value a hydrological explanation has to a police search operation (and equally to search and rescue). With hindsight, the desktop study should have preceded the search, allowing better understanding of water conditions. The ultimate reason for lacustrine flow in this location is still not proven, but the hydrological model explained the problems encountered in the initial search.