49 results for Desktop simulator
Abstract:
The burial of objects (human remains, explosives, weapons) below or behind concrete, brick, plaster or tiling may be associated with serious crime, and such locations are difficult to search. These are quite common forensic search scenarios, but little has been published on them to date. Most documented discoveries are accidental or result from suspect/witness testimony. The difficulty of locating such hidden objects means a random or chance-based approach is not advisable. A preliminary strategy is presented here, based on previous studies and augmented by primary research where new technology or applications are required. This blend allows a rudimentary search workflow, from remote desktop study to non-destructive investigation, through to recommendations as to how the above may inform excavation, demonstrated here with a case study from a homicide investigation. Published case studies on the search for human remains demonstrate the problems encountered when trying to find and recover sealed-in and sealed-over locations. Established methods include desktop study, photography, geophysics and search dogs: these are integrated with new technology (LiDAR and laser scanning; photographic rectification; close-quarter aerial imagery; ground-penetrating radar on walls; and gamma-ray/neutron activation radiography) to propose this possible search strategy.
Abstract:
The current study focuses on the effect of the material type and the lubricant on the abrasive wear behaviour of two important commercially available ceramic-on-ceramic prosthetic systems, namely Biolox(R) forte and Biolox(R) delta (CeramTec AG, Germany). A standard microabrasion wear apparatus was used to produce '3-body' abrasive wear scars with three different lubricants: ultrapure water, 25 vol% new-born calf serum solution and 1 wt% carboxymethyl cellulose sodium salt (CMC-Na) solution. 1 μm alumina particles were used as the abrasive. The morphology of the wear scar was examined in detail using Atomic Force Microscopy (AFM) and Scanning Electron Microscopy (SEM). Subsurface damage accumulation was investigated by Focused Ion Beam (FIB) cross-sectional milling and Transmission Electron Microscopy (TEM). The effect of the lubricant on the '3-body' abrasive wear mechanisms is discussed and the effect of material properties compared.
Abstract:
This paper presents the results of a measurement campaign aimed at characterizing and modeling the indoor radio channel between two hypothetical cellular handsets. The device-to-device channel measurements were made at 868 MHz and investigated a number of different everyday scenarios, such as the devices being held at the users' heads, placed in a pocket, and one device placed on a desktop. The recently proposed shadowed k-μ fading model was used to characterize these channels and was shown to provide a good description of the measured data. It was also evident from the experiments that the device-to-device communications channel is susceptible to shadowing caused by the human body.
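As a rough illustration of how the shadowed k-μ model can be exercised in practice, the Python sketch below draws synthetic received-power samples from the model's physical construction (μ scattering clusters whose dominant components are shadowed by a Nakagami-m amplitude process). The function name and all parameter values are illustrative assumptions, not fitted values from this measurement campaign.

```python
import numpy as np

def shadowed_k_mu_power(kappa, mu, m, n, rng):
    """Draw n received-power samples from the shadowed k-mu fading model
    with unit mean power: mu scattering clusters, dominant components
    shadowed by a Nakagami-m amplitude process."""
    sigma2 = 1.0 / (2.0 * mu * (1.0 + kappa))   # per-dimension scatter power
    p = np.sqrt(kappa / ((1.0 + kappa) * mu))   # dominant amplitude per cluster
    xi = np.sqrt(rng.gamma(shape=m, scale=1.0 / m, size=n))  # Nakagami-m shadowing
    w = np.zeros(n)
    for _ in range(int(mu)):
        x = rng.normal(0.0, np.sqrt(sigma2), n)  # in-phase scatter
        y = rng.normal(0.0, np.sqrt(sigma2), n)  # quadrature scatter
        w += (x + xi * p) ** 2 + y ** 2          # cluster power with shadowed LOS
    return w

rng = np.random.default_rng(1)
samples = shadowed_k_mu_power(kappa=2.0, mu=2, m=1.5, n=100_000, rng=rng)
print(f"mean power: {samples.mean():.3f}")  # ~1.0 by construction
```

An empirical histogram of such samples is what the measured data would be fitted against when estimating the model parameters.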
Abstract:
The properties of metasurfaces composed of doubly periodic arrays of interwoven quadrifilar spiral conductors on magnetized ferrite substrates have been investigated with the aid of a full-wave electromagnetic simulator. The effects of incident wave polarization and ferrite magnetization on the scattering characteristics have been analysed at both normal and in-plane dc magnetic bias. The features of the fundamental topological resonances in the interwoven spiral arrays on ferrite substrates are illustrated by the simulation results, and the effects of ferrite gyrotropy and dispersion on the array resonance response and fractional bandwidth are discussed.
Abstract:
High Fidelity Simulation, or Human Patient Simulation, is an educational strategy embedded within nursing curricula throughout many healthcare educational institutions. This paper reports on an evaluative study that investigated the views of a group of Year 2 undergraduate nursing students from the mental health and the learning disability fields of nursing (n = 75) in relation to simulation as a teaching pedagogy. The study took place in the simulation suite within a School of Nursing and Midwifery in the UK. Two patient scenarios were used for the session and participants completed a 22-item questionnaire consisting of three biographical information questions and a 19-item Likert scale. Descriptive statistics were employed to illustrate the data and non-parametric testing (Mann-Whitney U test) was employed to test a number of hypotheses. Overall, students were positive about the introduction of patient scenarios using the human patient simulator into the undergraduate nursing curriculum. This study used a small, convenience sample in one institution and therefore the results obtained cannot be generalised to nursing education until further research is conducted with larger samples and a mixed-methods research approach. However, these results provide encouraging evidence to support the use of simulation within the mental health and the learning disability fields of nursing, and the development and implementation of further simulations to complement the students' practicum.
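For readers unfamiliar with the non-parametric testing mentioned above, a minimal Python sketch of a Mann-Whitney U test on Likert-scale responses follows; the response values are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical 5-point Likert responses to one questionnaire item,
# split by field of nursing (values are illustrative, not study data).
mental_health = np.array([5, 4, 4, 5, 3, 4, 5, 4])
learning_disability = np.array([4, 3, 4, 4, 5, 3, 4, 3])

# Two-sided test of whether the two groups rate the item differently.
stat, p = mannwhitneyu(mental_health, learning_disability, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```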
Abstract:
The principal feature in the evolution of the internet has been its ever-growing reach, to include old and young, rich and poor. The internet's ever-encroaching presence has transported it from our desktop to our pocket and into our glasses. This is illustrated in the Internet Society Questionnaire on Multistakeholder Governance, which found the main factors affecting change in the Internet governance landscape were more users online from more countries and the influence of the internet over daily life. The omnipresence of the internet is self-perpetuating; its usefulness grows with every new user and every new piece of data uploaded. The advent of social media and the creation of a virtual presence for each of us, even when we are not physically present or 'logged on', means we are fast approaching the point where we are all connected, to everyone else, all the time. We have moved far beyond the point where governments can claim to represent our views, which evolve constantly rather than being measured in electoral cycles.

The shift, which has seen citizens become creators of content rather than consumers of it, has undermined the centralist view of democracy and created an environment of wiki democracy or crowd-sourced democracy. This is at the heart of what is generally known as Web 2.0, and widely considered to be a positive, democratising force. However, we argue, there are worrying elements here too. Government does not always deliver on the promise of the networked society as it involves citizens and others in the process of government. Also, a number of key internet companies have emerged as powerful intermediaries, harnessing the efforts of the many and re-using and re-selling the products and data of content providers in the Web 2.0 environment. A discourse about openness and transparency has been offered as a democratising rationale, but much of this masks an uneven relationship where the value of online activity flows not to the creators of content but to those who own the channels of communication and the metadata that they produce.

In this context the state is just one stakeholder in the mix of influencers and opinion formers impacting on our behaviours, and indeed our ideas of what is public. The question of what it means to create or own something, and how all these new relationships are to be ordered and governed, is subject to fundamental change. While government can often appear slow, unwieldy and even irrelevant in much of this context, there remains a need for some sort of political control to deal with the challenges that technology creates but cannot by itself control. In order for the internet to continue to evolve successfully, both technically and socially, it is critical that the multistakeholder nature of internet governance be understood and acknowledged, and perhaps, to an extent, re-balanced. Stakeholders can no longer be classified under the broad headings of government, private sector and civil society, and their roles seen as some sort of benign and open co-production. Each user of the internet has a stake in its efficacy, and each by their presence and participation is contributing to the experience, positive or negative, of other users, as well as to the commercial success or otherwise of various online service providers. However, stakeholders have neither an equal role nor an equal share.
The unequal relationship between the providers of content and those who simply package up and transmit that content - while harvesting the valuable data thus produced - needs to be addressed. Arguably this suggests a role for government that involves it moving beyond simply celebrating and facilitating the on-going technological revolution. This paper reviews the shifting landscape of stakeholders and their contribution to the efficacy of the internet. It looks to critically evaluate the primacy of the individual as the key stakeholder and their supposed developing empowerment within the ever-growing sea of data. It also looks at the role of individuals in wider governance roles. Governments in a number of jurisdictions have sought to engage, consult or empower citizens through technology, but in general these attempts have had little appeal. Citizens have been too busy engaging, consulting and empowering each other to pay much attention to what their governments are up to. George Orwell's view of the future has not come to pass; in fact, the internet has ensured the opposite scenario. There is no Big Brother, but we are all looking over each other's shoulders all the time, while at the same time a number of big corporations are capturing and selling all this collective endeavour back to us.
Abstract:
Burial grounds are commonly surveyed and searched by both police/humanitarian search teams and archaeologists. One aspect of an efficient search is to establish areas free of recent interments to allow the concentration of assets in suspect terrain. While 100% surety in locating remains can never be achieved, the deployment of a red, amber, green (RAG) system for assessment has proven invaluable to our surveys. The RAG system is based on a desktop study (including burial ground records), visual inspection (mounding, collapses) and use of geophysics (in this case, ground penetrating radar or GPR) for a multi-proxy assessment that provides search authorities with an assessment of the state of inhumations and a level of legal backup for decisions they make on excavation or not ('exit strategy'). The system is flexible and will be built upon as research continues.
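A toy sketch of how the three proxies might be combined into a traffic-light flag is given below; the simple counting logic is an assumption for illustration, not the authors' published scoring scheme.

```python
def rag_assessment(records_clear: bool, surface_anomaly: bool, gpr_anomaly: bool) -> str:
    """Combine the three proxies (burial-ground records, visual inspection,
    GPR) into a red/amber/green flag. Illustrative logic only."""
    anomalies = sum([not records_clear, surface_anomaly, gpr_anomaly])
    if anomalies == 0:
        return "GREEN"   # no proxy suggests a recent interment
    if anomalies == 1:
        return "AMBER"   # a single anomaly: further assessment advised
    return "RED"         # converging anomalies: treat as suspect terrain

print(rag_assessment(records_clear=True, surface_anomaly=False, gpr_anomaly=True))  # AMBER
```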
Abstract:
Today there is a growing interest in the integration of health monitoring applications in portable devices, necessitating the development of methods that improve the energy efficiency of such systems. In this paper, we present a systematic approach that enables energy-quality trade-offs in spectral analysis systems for bio-signals, which are useful in monitoring various health conditions such as those associated with the heart rate. To enable such trade-offs, the processed signals are expressed initially in a basis in which significant components that carry most of the relevant information can be easily distinguished from the parts that influence the output to a lesser extent. Such a classification allows the pruning of operations associated with the less significant signal components, leading to power savings with minor quality loss, since only less useful parts are pruned under the given requirements. To exploit the attributes of the modified spectral analysis system, thresholding rules are determined and adopted at design- and run-time, allowing the static or dynamic pruning of less-useful operations based on the accuracy and energy requirements. The proposed algorithm is implemented on a typical sensor node simulator and results show up to 82% energy savings when static pruning is combined with voltage and frequency scaling, compared to the conventional algorithm in which such trade-offs were not available. In addition, experiments with numerous cardiac samples from various patients show that such energy savings come with a 4.9% average accuracy loss, which does not affect the system's capability to detect sinus arrhythmia, which was used as a test case.
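A minimal Python sketch of the static-pruning idea follows: transform the signal, zero the least significant spectral coefficients, and measure the resulting quality loss. The function name, toy signal and keep fraction are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def pruned_spectrum(signal, keep_fraction=0.1):
    """Zero all but the largest-magnitude spectral coefficients and
    reconstruct, mimicking static pruning of less significant components."""
    coeffs = np.fft.rfft(signal)
    k = max(1, int(keep_fraction * coeffs.size))
    drop = np.argsort(np.abs(coeffs))[:-k]  # least significant coefficients
    pruned = coeffs.copy()
    pruned[drop] = 0.0
    return np.fft.irfft(pruned, n=signal.size)

t = np.linspace(0.0, 10.0, 1000)
sig = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
approx = pruned_spectrum(sig)
err = np.linalg.norm(sig - approx) / np.linalg.norm(sig)
print(f"relative error with 90% of coefficients pruned: {err:.2%}")
```

Operations tied to the dropped coefficients need never be computed, which is where the energy savings come from.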
Abstract:
A low cost solar collector was developed by using polymeric components as opposed to the metal and glass components of traditional solar collectors. In order to utilize polymers for the absorber of the solar collector, Carbon Nanotubes (CNT) were added as a filler to improve the thermal conductivity and the solar absorptivity of the polymers. The solar collector was designed as a multi-layer construction with economical manufacturing in mind. Through mathematical heat transfer analysis, the performance and characteristics of the designed solar collector were estimated. Furthermore, prototypes of the proposed system were built and tested at a state-of-the-art solar simulator facility to evaluate the actual performance of the developed solar collector. The cost-effective polymer-CNT solar collector, which achieved an efficiency comparable to that of a conventional glazed flat-plate solar panel, was successfully developed.
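As a sketch of the kind of heat-transfer estimate involved, the standard Hottel-Whillier-Bliss efficiency relation for a flat-plate collector is evaluated below in Python; all parameter values are assumptions for illustration, not measured properties of the polymer-CNT prototype.

```python
def collector_efficiency(fr_tau_alpha, fr_ul, t_in, t_amb, irradiance):
    """Hottel-Whillier-Bliss flat-plate collector efficiency:
    eta = FR*(tau*alpha) - FR*UL*(T_in - T_amb)/G."""
    return fr_tau_alpha - fr_ul * (t_in - t_amb) / irradiance

# Illustrative values: optical gain 0.72, loss coefficient 5.5 W/(m^2 K),
# inlet 315 K, ambient 298 K, irradiance 1000 W/m^2.
eta = collector_efficiency(0.72, 5.5, 315.0, 298.0, 1000.0)
print(f"predicted efficiency: {eta:.2f}")  # ~0.63 under these assumptions
```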
Abstract:
The Copney Stone Circle Complex, Co. Tyrone, N. Ireland, is an important Bronze Age site forming part of the Mid-Ulster Stone Circle Complex. The Environment Service: Historic Monuments and Buildings (ESHMB) initiated a program of bog-clearance in August 1994 to excavate the stone circles. This work was completed by October 1994 and the excavated site was surveyed in August 1995. Almost immediately, the rate at which the stones forming the circles were breaking down was noted, and a program of study was initiated to make recommendations on the conservation of this important site. Digital photogrammetric techniques were applied to aerial images of the stone circles and digital terrain models created from the images at a range of scales. These provide base data sets for comparison with identical surveys to be completed in successive years, and will allow the rate of deterioration of the circles, and the areas most affected, to be determined. In addition, a 2D analysis of the stones provides an accurate record of their absolute 2D dimensions for rapid desktop computer analysis by researchers remote from the digital photogrammetric workstation used in the survey.

The products of this work are readily incorporated into web sites, educational packages and databases. The technique provides a rapid and user-friendly method of presentation of a large body of information and measurements, and a reliable method of storage of the information from Copney should it become necessary to re-cover the site.
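A minimal sketch of the year-on-year comparison described above, assuming two co-registered digital terrain models stored as height grids, is given below in Python; the grids and threshold are illustrative, not the survey's data.

```python
import numpy as np

def dtm_change(dtm_old, dtm_new, drop_threshold=0.01):
    """Per-cell elevation change between two co-registered DTMs (metres)
    and the fraction of cells that dropped by more than drop_threshold,
    a rough proxy for stone breakdown between surveys."""
    diff = dtm_new - dtm_old
    degraded_fraction = float(np.mean(diff < -drop_threshold))
    return diff, degraded_fraction

rng = np.random.default_rng(2)
dtm_year1 = rng.random((50, 50))                       # stand-in survey grid
dtm_year2 = dtm_year1 - rng.random((50, 50)) * 0.02    # simulated deterioration
_, frac = dtm_change(dtm_year1, dtm_year2)
print(f"cells lower by >1 cm: {frac:.1%}")
```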
Abstract:
Power, and consequently energy, has recently attained first-class system resource status, on par with conventional metrics such as CPU time. To reduce energy consumption, many hardware- and OS-level solutions have been investigated. However, application-level information - which can provide the system with valuable insights unattainable otherwise - was only considered in a handful of cases. We introduce OpenMPE, an extension to OpenMP designed for power management. OpenMP is the de facto standard for programming parallel shared memory systems, but does not yet provide any support for power control. Our extension exposes (i) per-region multi-objective optimization hints and (ii) application-level adaptation parameters, in order to create energy-saving opportunities for the whole system stack. We have implemented OpenMPE support in a compiler and runtime system, and empirically evaluated its performance on two architectures, mobile and desktop. Our results demonstrate the effectiveness of OpenMPE, with geometric-mean energy savings of 15% across 9 use cases while maintaining full quality of service.
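OpenMPE itself is a pragma-level OpenMP extension; the Python sketch below only illustrates the underlying idea of a per-region optimization hint, picking the frequency setting that minimizes energy within a time budget. The profile table and function are invented for illustration, not OpenMPE's actual API.

```python
# Offline-profiled (time_s, energy_J) per frequency level for one parallel
# region; the numbers are invented for illustration.
REGION_PROFILE = {1.0: (2.0, 10.0), 1.5: (1.4, 11.0), 2.0: (1.0, 14.0)}

def pick_frequency_ghz(time_budget_s):
    """Return the lowest-energy frequency that still meets the time budget,
    falling back to the fastest setting if none does."""
    feasible = {f: te for f, te in REGION_PROFILE.items() if te[0] <= time_budget_s}
    if not feasible:
        return max(REGION_PROFILE)
    return min(feasible, key=lambda f: feasible[f][1])

print(pick_frequency_ghz(1.5))  # -> 1.5 under these illustrative numbers
```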
Abstract:
A low cost flat plate solar collector was developed by using polymeric components as opposed to the metal and glass components of traditional flat plate solar collectors. In order to improve the thermal and optical properties of the polymer absorber of the solar collector, Carbon Nanotubes (CNT) were added as a filler. The solar collector was designed as a multi-layer construction with an emphasis on low manufacturing costs. Through mathematical heat transfer analysis, the thermal performance of the collector and the characteristics of the design parameters were analyzed. Furthermore, prototypes of the proposed collector were built and tested at a state-of-the-art solar simulator facility to evaluate its actual performance. The inclusion of CNT significantly improved the properties of the polymer absorber. The key design parameters and their effects on the thermal performance were identified via the heat transfer analysis. Based on the experimental and analytical results, the cost-effective polymer-CNT solar collector, which achieved a high thermal efficiency similar to that of a conventional glazed flat plate solar panel, was successfully developed.
Abstract:
The continued use of traditional lecturing across Higher Education as the main teaching and learning approach in many disciplines must be challenged. An increasing number of studies suggest that this approach, compared to more active learning methods, is the least effective. In counterargument, the use of traditional lectures is often justified as necessary given a large student population. By analysing the implementation of a web-based broadcasting approach which replaced the traditional lecture within a programming-based module, and thereby removed the student population rationale, it was hoped that the student learning experience would become more active and ultimately enhance learning on the module. The implemented model replaces the traditional approach of students attending an on-campus lecture theatre with a web-based live broadcast approach that focuses on students being active learners rather than passive recipients. Students 'attend' by viewing a live broadcast of the lecturer, presented as a talking head, and the lecturer's desktop, via a web browser. Video and audio communication is primarily from tutor to students, with text-based comments used to provide communication from students to tutor. This approach promotes active learning by allowing students to perform activities on their own computers rather than the passive viewing and listening commonly encountered in large lecture classes. Analysis of this approach over two years (n = 234 students) indicates that 89.6% of students rated the approach as offering a highly positive learning experience. Comparing student performance across three academic years also indicates a positive change. A small-scale data analysis of student participation levels suggests that the student cohort's willingness to engage with the broadcast lecture material is high.
Abstract:
Static timing analysis provides the basis for setting the clock period of a microprocessor core, based on its worst-case critical path. However, depending on the design, this critical path is not always excited and therefore dynamic timing margins exist that can theoretically be exploited for the benefit of better speed or lower power consumption (through voltage scaling). This paper introduces predictive instruction-based dynamic clock adjustment as a technique to trim dynamic timing margins in pipelined microprocessors. To this end, we exploit the different timing requirements for individual instructions during the dynamically varying program execution flow without the need for complex circuit-level measures to detect and correct timing violations. We provide a design flow to extract the dynamic timing information for the design using post-layout dynamic timing analysis and we integrate the results into a custom cycle-accurate simulator. This simulator allows annotation of individual instructions with their impact on timing (in each pipeline stage) and rapidly derives the overall code execution time for complex benchmarks. The design methodology is illustrated at the microarchitecture level, demonstrating the performance and power gains possible on a 6-stage OpenRISC in-order general purpose processor core in a 28 nm CMOS technology. We show that employing instruction-dependent dynamic clock adjustment leads on average to an increase in operating speed by 38% or to a reduction in power consumption by 24%, compared to traditional synchronous clocking, which at all times has to respect the worst-case timing identified through static timing analysis.
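A toy Python sketch of the per-instruction clocking idea follows: annotate each instruction class with the clock period its worst path requires, then compare total execution time against fixed worst-case clocking. The period table and instruction trace are invented for illustration; the paper derives real values from post-layout dynamic timing analysis.

```python
# Clock period (ns) each instruction class needs in its slowest pipeline
# stage; values are invented for illustration.
PERIOD_NS = {"add": 0.9, "mul": 1.4, "load": 1.2, "branch": 1.0}
WORST_CASE_NS = max(PERIOD_NS.values())  # what static timing analysis imposes

trace = ["load", "add", "add", "mul", "branch"] * 1000  # toy instruction trace

dynamic_ns = sum(PERIOD_NS[op] for op in trace)   # instruction-dependent clocking
static_ns = WORST_CASE_NS * len(trace)            # fixed worst-case clocking
print(f"speedup: {static_ns / dynamic_ns:.2f}x")  # ~1.30x for this toy mix
```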
Abstract:
The X-parameter based nonlinear modelling tools have been adopted as the foundation for the advanced methodology of experimental characterisation and design of passive nonlinear devices. Based upon the formalism of the X-parameters, it provides a unified framework for co-design of antenna beamforming networks, filters, phase shifters and other passive and active devices of the RF front-end, taking into account the effect of their nonlinearities. The equivalent circuits of the canonical elements are readily incorporated in the models, thus enabling evaluation of the PIM effect on the performance of individual devices and their assemblies. An important advantage of the presented methodology is its compatibility with the industry-standard commercial RF circuit simulator Agilent ADS.

The major challenge in practical implementation of the proposed approach concerns experimental retrieval of the X-parameters for canonical passive circuit elements. To the best of our knowledge, commercial PIM testers and practical laboratory test instruments are inherently narrowband and do not allow for simultaneous vector measurements at the PIM and harmonic frequencies. Alternatively, existing nonlinear vector network analysers (NVNA) support X-parameter measurements over broad frequency bands with a range of stimuli, but their dynamic range is insufficient for PIM characterisation in practical circuits. Further opportunities for adaptation of the X-parameters methodology to the PIM characterisation of passive devices using the existing test instruments are explored.
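For reference, the scattered waves in the standard X-parameter (polyharmonic distortion) formalism of Root and Verspecht take the form below, where B_pm is the scattered wave at port p and harmonic m, A_11 is the large-signal stimulus and A_qn are small perturbation waves around it; this is the textbook formalism, not notation specific to this paper:

$$
B_{pm} = X^{(F)}_{pm}\big(|A_{11}|\big)\,P^{m}
+ \sum_{q,n} X^{(S)}_{pm,qn}\big(|A_{11}|\big)\,P^{m-n}\,A_{qn}
+ \sum_{q,n} X^{(T)}_{pm,qn}\big(|A_{11}|\big)\,P^{m+n}\,A^{*}_{qn},
\qquad P = e^{j\angle A_{11}}.
$$

The X^(S) and X^(T) terms capture the linearised response to the perturbation waves around the large-signal operating point, which is what makes cascading nonlinear passive elements, and hence PIM budgeting across assemblies, tractable.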