945 results for execution traces
Abstract:
An interactive graphics package for modeling with Petri Nets has been implemented. It uses the VT-11 graphics terminal supported on the PDP-11/35 computer to draw, execute, analyze, edit and redraw a Petri Net. Each of the above-mentioned tasks can be performed by selecting appropriate items from a menu displayed on the screen. Petri Nets with a reasonably large number of nodes can be created and analyzed using this package, and the number of nodes supported may be increased by making simple changes in the program. Being interactive, the program seeks information from the user after displaying appropriate messages on the terminal. After the Petri Net is complete, it may be executed step by step, and the changes in the number of tokens at each place may be observed on the screen. Properties of Petri Nets such as safety, boundedness, conservation and redundancy can be checked using this package. The package can be used very effectively for modeling asynchronous (concurrent) systems with Petri Nets and simulating the model by “graphical execution.”
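A minimal sketch may help make the step-by-step token play concrete. The names below (`Transition`, `enabled`, `fire`) are illustrative only and assume the ordinary place/transition firing semantics with unit-weight arcs, not the package's actual data structures.

```python
# Minimal Petri Net execution step: fire an enabled transition and update
# the token count at each place (illustrative; assumes 1-weight arcs).
from dataclasses import dataclass

@dataclass
class Transition:
    name: str
    inputs: list   # names of input places
    outputs: list  # names of output places

def enabled(marking, t):
    # A transition is enabled when every input place holds >= 1 token.
    return all(marking[p] >= 1 for p in t.inputs)

def fire(marking, t):
    # Firing consumes one token per input place, produces one per output.
    m = dict(marking)
    for p in t.inputs:
        m[p] -= 1
    for p in t.outputs:
        m[p] += 1
    return m

marking = {"p1": 1, "p2": 0}
t1 = Transition("t1", inputs=["p1"], outputs=["p2"])
if enabled(marking, t1):
    marking = fire(marking, t1)
print(marking)  # {'p1': 0, 'p2': 1}
```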
Abstract:
In this thesis, the genetic variation of human populations from the Baltic Sea region was studied in order to elucidate population history as well as evolutionary adaptation in this region. The study provided novel understanding of how the complex population-level processes of migration, genetic drift, and natural selection have shaped genetic variation in North European populations. Results from genome-wide, mitochondrial DNA and Y-chromosomal analyses suggested that the genetic background of the populations of the Baltic Sea region lies predominantly in Continental Europe, which is consistent with earlier studies and archaeological evidence. The late settlement of Fennoscandia after the Ice Age and the subsequent small population size have led to pronounced genetic drift, especially in Finland and Karelia but also in Sweden, evident especially in genome-wide and Y-chromosomal analyses. Consequently, these populations show striking genetic differentiation, as opposed to the much more homogeneous pattern of variation in Central European populations. Additionally, the eastern side of the Baltic Sea was observed to have experienced eastern influence in the genome-wide data as well as in mitochondrial DNA and Y-chromosomal variation, consistent with linguistic connections. However, Slavic influence in the Baltic Sea populations appears minor at the genetic level. While the genetic diversity of the Finnish population overall was low, genome-wide and Y-chromosomal results showed pronounced regional differences. The genetic distance between Western and Eastern Finland was larger than that for many geographically distant population pairs, and provinces also showed genetic differences. This is probably mainly due to the late settlement of Eastern Finland and local isolation, although differences in ancestral migration waves may contribute to this, too. In contrast, mitochondrial DNA and Y-chromosomal analyses of the contemporary Swedish population revealed a much less pronounced population structure and a fusion of the traces of ancient admixture, genetic drift, and recent immigration. Genome-wide datasets also provide a resource for studying the adaptive evolution of human populations. This study revealed tens of loci with strong signs of recent positive selection in Northern Europe. These results provide interesting targets for future research on evolutionary adaptation, and may be important for understanding the background of disease-causing variants in human populations.
Abstract:
Information exchange (IE) is a critical component of the complex collaborative medication process in residential aged care facilities (RACFs). Designing information and communication technology (ICT) to support complex processes requires a profound understanding of the IE that underpins their execution. There is little existing research that investigates the complexity of IE in RACFs and its impact on ICT design. The aim of this study was thus to undertake an in-depth exploration of the IE process involved in medication management to identify its implications for the design of ICT. The study was undertaken at a large metropolitan facility in NSW, Australia. A total of three focus groups, eleven interviews and two observation sessions were conducted between July and August 2010. Process modelling was undertaken by translating the qualitative data via in-depth iterative inductive analysis. The findings highlight the complexity and collaborative nature of IE in RACF medication management. The resulting models emphasize the need to: a) deal with temporal complexity; b) rely on an interdependent set of coordinative artefacts; and c) use synchronous communication channels for coordination. Taken together, these are crucial aspects of the IE process in RACF medication management that need to be catered for when designing ICT in this critical area. This study provides important new evidence of the advantages of viewing a process as part of a system rather than as segregated tasks, as a means of identifying the latent requirements for an ICT design that is able to support complex collaborative processes like medication management in RACFs. © 2012 IEEE.
Abstract:
Background Poor clinical handover has been associated with inaccurate clinical assessment and diagnosis, delays in diagnosis and test ordering, medication errors and decreased patient satisfaction in the acute care setting. Research on the handover process in the residential aged care sector is very limited. Purpose The aims of this study were to: (i) develop an in-depth understanding of the handover process in aged care by mapping all the key activities and their information dynamics; (ii) identify gaps in information exchange in the handover process and analyze their implications for resident safety; and (iii) develop practical recommendations on how information and communication technology (ICT) can improve the process and resident safety. Methods The study was undertaken at a large metropolitan facility in NSW with more than 300 residents and a staff including 55 registered nurses (RNs) and 146 assistants in nursing (AINs). A total of 3 focus groups, 12 interviews and 3 observation sessions were conducted from July to October 2010. Process mapping was undertaken by translating the qualitative data via a five-category code book that was developed prior to the analysis. Results Three major sub-processes were identified and mapped: handover process (HOP) I, “Information gathering by RN”; HOP II, “Preparation of preliminary handover sheet”; and HOP III, “Execution of handover meeting”. Inefficiencies were identified in relation to the handover, including duplication of information, utilization of multiple communication modes and information sources, and lack of standardization. Conclusion By providing a robust process model of handover, this study has made two critical contributions to research in aged care: (i) a means to identify important, possibly suboptimal practices; and (ii) valuable evidence to plan and improve ICT implementation in residential aged care. The mapping of this process enabled analysis of gaps in information flow and their potential impact on resident safety. In addition, it offers the basis for further studies into a process that, despite its importance for securing resident safety and continuity of care, lacks research.
Abstract:
Emerging embedded applications are based on evolving standards (e.g., MPEG2/4, H.264/265, IEEE 802.11a/b/g/n). Since most of these applications run on handheld devices, there is an increasing need for a single-chip solution that can dynamically interoperate between different standards and their derivatives. In order to achieve high resource utilization and low power dissipation, we propose REDEFINE, a polymorphic ASIC in which specialized hardware units are replaced with basic hardware units that can create the same functionality by runtime re-composition. It is a “future-proof” custom hardware solution for multiple applications and their derivatives in a domain. In this article, we describe a compiler framework and supporting hardware comprising compute, storage, and communication resources. Applications described in a high-level language (e.g., C) are compiled into application substructures. For each application substructure, a set of compute elements (CEs) on the hardware is interconnected during runtime to form a pattern that closely matches the communication pattern of that particular application. The advantage is that the bound CEs are neither processor cores nor logic elements as in FPGAs. Hence, REDEFINE offers the power and performance advantage of an ASIC along with the hardware reconfigurability and programmability of an FPGA or instruction set processor. In addition, the hardware supports custom instruction pipelining. Existing instruction-set extensible processors determine a sequence of instructions that repeatedly occurs within the application to create custom instructions at design time to speed up the execution of this sequence. We extend this scheme further: a kernel is compiled into custom instructions that bear a strong producer-consumer relationship (and are not limited to frequently occurring sequences of instructions). Custom instructions, realized as hardware compositions effected at runtime, allow several instances of the same custom instruction to be active in parallel. A key distinguishing factor in the majority of emerging embedded applications is stream processing. To reduce the overheads of data transfer between custom instructions, direct communication paths are employed among them. In this article, we present an overview of the hardware-aware compiler framework, which determines the NoC-aware schedule of transports of the data exchanged between the custom instructions on the interconnect. The results for the FFT kernel indicate a 25% reduction in the number of loads/stores, and throughput improves by log(n) for an n-point FFT when compared to a sequential implementation. Overall, REDEFINE offers flexibility and runtime reconfigurability at the expense of 1.16x in power and 8x in area when compared to an ASIC. The REDEFINE implementation consumes 0.1x the power of an FPGA implementation, and the configuration overhead of the FPGA implementation is 1,000x more than that of REDEFINE.
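The central compiler idea here, fusing operations that bear a strong producer-consumer relationship into one custom instruction, can be pictured as chain grouping over a dataflow DAG. The single-producer/single-consumer heuristic below is an assumption for illustration; the actual REDEFINE compiler framework is far more elaborate.

```python
# Group dataflow operations into producer-consumer chains, each of which
# could become one runtime-composed custom instruction (illustrative;
# assumes an acyclic dataflow graph).
def group_chains(succ):
    # succ: op -> list of ops consuming its result (a dataflow DAG).
    pred = {u: [] for u in succ}
    for u, vs in succ.items():
        for v in vs:
            pred[v].append(u)
    groups, seen = [], set()
    for u in succ:
        # Start only from chain heads: ops not absorbed by a unique producer.
        if u in seen or (len(pred[u]) == 1 and len(succ[pred[u][0]]) == 1):
            continue
        chain = [u]
        seen.add(u)
        # Extend while the tail has exactly one consumer with one producer.
        while len(succ[chain[-1]]) == 1:
            v = succ[chain[-1]][0]
            if len(pred[v]) != 1 or v in seen:
                break
            chain.append(v)
            seen.add(v)
        groups.append(chain)
    return groups

g = {"load": ["mul"], "mul": ["add"], "add": ["store"], "store": []}
print(group_chains(g))  # [['load', 'mul', 'add', 'store']]
```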
Abstract:
Biosensors have gained immense acceptance in the field of medical diagnostics, besides environmental, food safety and biodefence applications, due to their attributes of real-time and rapid response. This synergistic combination of biotechnology and microelectronics comprises a biological recognition element coupled with a compatible transducer device. Diabetes is a disease of major concern, since the proportion of the world population suffering from it is increasing at an alarming rate, and therefore the need for development of accurate and stable glucose biosensors is evident. There are many commercial glucose biosensors available, yet some limitations need attention. This review presents a detailed account of polypyrrole-based amperometric glucose biosensors. The polymer polypyrrole is used extensively as a matrix for immobilization of the glucose oxidase enzyme owing to its favourable features, such as stability under ambient conditions, conductivity that allows it to be used as an electron relay, and the ability to be polymerized under neutral, aqueous, mild conditions. Simple one-step electrodeposition on the electrode surface allows easy entrapment of the enzyme. The review is structured into three categories: (a) first-stage biosensors, covering studies from the inception of the use of polypyrrole in glucose biosensors, during which time the role of the polymer and the use of mediators were established; this period saw extensive work by the two separate groups of Schuhmann and Koopal, who contributed a great deal to understanding the electron transfer pathways in polypyrrole-based glucose biosensors; (b) second-stage biosensors, highlighting the shift of polypyrrole from a conventional matrix to composite matrices, with extensive use of mediators aimed at improving the selectivity of response; and (c) third-stage biosensors, where the remarkable properties of nanoparticles and carbon nanotubes and their outstanding ability to mediate electron transfer have made them indispensable in conjugation with polypyrrole for the development of glucose biosensors with improved sensitivity and stability. The review thus traces the evolution of polypyrrole from a conventional matrix, to composites, and thence to the form of nanotube arrays, with the objective of addressing the vital issue of diabetes management through the development of stable and reliable glucose biosensors.
Abstract:
Although various strategies have been developed for scheduling parallel applications with independent tasks, very little work exists on scheduling tightly coupled parallel applications in cluster environments. In this paper, we compare four strategies based on performance models of tightly coupled parallel applications for scheduling such applications on clusters. In addition to algorithms based on existing popular optimization techniques, we propose a new algorithm called Box Elimination, which searches the space of performance model parameters to determine the best schedule of machines. By means of real and simulated experiments, we evaluated the algorithms on single-cluster and multi-cluster setups. We show that our Box Elimination algorithm generates schedules up to 80% more efficient than those of the other algorithms, and that the execution times of the schedules it produces are more robust against performance modeling errors.
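The Box Elimination algorithm itself is not reproduced here, but a toy example of model-driven schedule selection, assuming a simple compute-plus-communication performance model and an exhaustive scan of machine counts, illustrates what such a search optimizes.

```python
# Pick the machine allocation that minimizes a modeled execution time for
# a tightly coupled application (toy model; Box Elimination prunes the
# performance-model parameter space instead of scanning exhaustively).
def predicted_time(n, work=3600.0, comm_cost=1.5):
    # Compute shrinks with more machines; communication grows with them.
    return work / n + comm_cost * (n - 1)

best = min(range(1, 65), key=predicted_time)
print(best, round(predicted_time(best), 1))  # 49 145.5
```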
Abstract:
An instrument for simultaneous measurement of dynamic strain and temperature in a thermally unstable ambience has been proposed, based on fiber Bragg grating technology. The instrument can function as a compact, stand-alone broadband thermometer and dynamic strain gauge. It employs a source wavelength tracking procedure for linear dependence of the output on the measurand, offering a high dynamic range. Two schemes have been demonstrated, with their relative merits. As a thermometer, the present instrumental configuration can offer a linear response in excess of 500 °C, which can be easily extended by adding a suitable grating and source without any alteration to the procedure. Temperature sensitivity is about 0.06 °C for a bandwidth of 1 Hz. For the current grating, the upper limit of strain measurement is about 150 με with a sensitivity of about 80 nε/√Hz. The major source of uncertainty associated with dynamic strain measurement is the laser source intensity noise, which is of broad spectral band. A low-noise source device or the use of optical power regulators can offer improved performance. The total harmonic distortion is less than 0.5% up to about 50 με, 1.2% at 100 με and about 2.3% at 150 με. Calibrated results of temperature and strain measurement with the instrument have been presented. Traces of ultrasound signals recorded by the system at 200 kHz, in an ambience of 100–200 °C temperature fluctuation, have been included. Also, the vibration spectrum and engine temperature of a running internal combustion engine have been recorded as a realistic application of the system.
Abstract:
Security in a mobile communication environment is always a matter of concern, even after deploying many security techniques at the device, network, and application levels. End-to-end security for mobile applications can be made robust by developing dynamic schemes at the application level that make use of existing security techniques varying in space, time, and attack complexity. In this paper we present a security technique selection scheme for mobile transactions, called the Transactions-Based Security Scheme (TBSS). The TBSS uses intelligence to study and analyze the security implications of transactions under execution, based on criteria such as user behavior, transaction sensitivity levels, credibility factors computed over the user's previous transactions, network vulnerability, and device characteristics. The TBSS identifies a suitable level of security techniques from a repository, which consists of symmetric and asymmetric security algorithms arranged in three complexity levels, covering various encryption/decryption techniques, digital signature schemes, and hashing techniques. From this identified level, one of the techniques is deployed randomly. The results show a considerable reduction in security cost compared to static schemes, which employ pre-fixed security techniques to secure the transaction data.
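A small sketch of the selection logic the abstract outlines: score the transaction's risk from the listed criteria, map the score to one of three complexity levels, and draw a technique from that level at random. The weights, thresholds and repository contents below are invented placeholders, not the TBSS's actual values.

```python
# Risk-scored selection of a security technique from a three-level
# repository (illustrative weights and contents, not the TBSS's own).
import random

REPOSITORY = {
    1: ["AES-128", "SHA-256"],                 # low complexity
    2: ["AES-256", "RSA-2048", "SHA-384"],     # medium complexity
    3: ["RSA-4096", "ECDSA-P521", "SHA-512"],  # high complexity
}

def risk_score(sensitivity, credibility, vulnerability):
    # All inputs in [0, 1]; higher user credibility lowers the risk.
    return 0.5 * sensitivity + 0.3 * vulnerability + 0.2 * (1 - credibility)

def select_technique(sensitivity, credibility, vulnerability):
    score = risk_score(sensitivity, credibility, vulnerability)
    level = 1 if score < 0.34 else 2 if score < 0.67 else 3
    return random.choice(REPOSITORY[level])  # random pick within the level

print(select_technique(sensitivity=0.9, credibility=0.2, vulnerability=0.6))
```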
Abstract:
The artists at Studio REV-, along with their allies in the broader non-profit sector, address domestic workers’ rights in the United States. As a social practice art project, NannyVan works to improve how information about those rights is disseminated to domestic workers, whether nannies, elder caregivers or others. As part of a larger project named CareForce, the NannyVan project shows an ethics of care by using design traces as tactics and transversal methods as strategies.
Abstract:
This chapter traces the history of evidence-based practice (EBP) from its roots in evidence-based medicine to contemporary thinking about its usefulness to public health practice. It defines EBP and differentiates it from ‘evidence-based medicine’, ‘evidence-based policy’ and ‘evidence-based healthcare’. As it is important to understand the subjective nature of knowledge and the research process, the chapter describes the nature and production of knowledge. It then considers the skills necessary for EBP and the processes of attaining the necessary evidence. We examine the barriers and facilitators to identifying and implementing ‘best practice’, and when EBP is appropriate to use. The chapter concludes with a discussion of the limitations of EBP, the use of other information sources to guide practice, and the application of evidence to guide policy and practice.
Abstract:
Previous studies have shown that buffering packets in DRAM is a performance bottleneck. In order to understand the impediments in accessing the DRAM, we developed a detailed Petri net model of an IP forwarding application on the IXP2400 that models the different levels of the memory hierarchy. The cell-based interface used to receive and transmit packets in a network processor leads to some small DRAM accesses. Such narrow accesses to the DRAM expose the bank access latency, reducing the bandwidth that can be realized. With real traces, up to 30% of the accesses are smaller than the cell size, resulting in a 7.7% reduction in DRAM bandwidth. To overcome this problem, we propose buffering these small chunks of data in the on-chip scratchpad memory. This scheme also exploits a greater degree of parallelism between different levels of the memory hierarchy. Using real traces from the Internet, we show that the transmit rate can be improved by an average of 21% over the base scheme without the use of additional hardware. Further, the impact of different traffic patterns on the network processor resources is studied. Under real traffic conditions, we show that the data bus, which connects the off-chip packet buffer to the micro-engines, is the obstacle to achieving higher throughput.
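The proposed fix can be pictured as a simple routing policy: any chunk narrower than the cell size is diverted to on-chip scratchpad instead of DRAM. The cell size and trace below are illustrative numbers, not IXP2400 measurements.

```python
# Route packet chunks: sub-cell-size chunks go to on-chip scratchpad so
# narrow DRAM accesses (which expose bank latency) are avoided.
CELL_SIZE = 64  # bytes per cell-based interface transfer (illustrative)

def route_chunks(chunk_sizes):
    dram, scratchpad = [], []
    for size in chunk_sizes:
        (scratchpad if size < CELL_SIZE else dram).append(size)
    return dram, scratchpad

# E.g., the trailing chunk of a 150-byte packet is 150 - 2*64 = 22 bytes.
dram, spm = route_chunks([64, 64, 22, 64, 40])
small_frac = len(spm) / (len(spm) + len(dram))
print(f"{small_frac:.0%} of accesses diverted to scratchpad")  # 40%
```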
Abstract:
A major concern of embedded system architects is design for low power. We address one aspect of the problem in this paper, namely the effect of executable code compression. There are two benefits of code compression: firstly, a reduction in the memory footprint of embedded software, and secondly, a potential reduction in memory bus traffic and power consumption. Since decompression has to be performed at run time, it is achieved in hardware. We describe a tool called COMPASS which can evaluate a range of strategies for any given set of benchmarks and display compression ratios. Also, given an execution trace, it can compute the effect on bus toggles and cache misses for a range of compression strategies. The tool is interactive and allows the user to vary a set of parameters and observe their effect on performance. We describe an implementation of the tool and demonstrate its effectiveness. To the best of our knowledge this is the first tool proposed for such a purpose.
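The headline metric the tool displays, the compression ratio per strategy per benchmark, is easy to illustrate. The sketch below substitutes zlib at two effort levels for COMPASS's hardware-oriented compression strategies, and the benchmark images are synthetic stand-ins.

```python
# Compression ratio (compressed size / original size) per strategy per
# benchmark, as a stand-in for the ratios a tool like COMPASS reports.
import zlib

def compression_ratio(code: bytes, level: int) -> float:
    return len(zlib.compress(code, level)) / len(code)

benchmarks = {"fir": bytes(range(256)) * 16, "crc": b"\x00\x01" * 2048}
for name, image in benchmarks.items():
    for level in (1, 9):  # two illustrative "strategies"
        print(name, f"level {level}: {compression_ratio(image, level):.2f}")
```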
Abstract:
Parallel programming and effective partitioning of applications for embedded many-core architectures require optimization algorithms. However, these algorithms have to quickly evaluate thousands of different partitions. We present a fast performance estimator embedded in a parallelizing compiler for streaming applications. The estimator combines a single execution-based simulation with an analytic approach. Experimental results demonstrate that the estimator has a mean error of 2.6% and computes its estimate 2848 times faster than a cycle-accurate simulator.
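The combination described, one profiling simulation followed by cheap analytic evaluation of each candidate partition, might look like the following. The cost table, partition encoding, and max-load estimate are assumptions for illustration, not the estimator's actual model.

```python
# Hybrid estimation: profile actor costs once (via simulation), then score
# each candidate partition analytically instead of re-simulating it.
actor_cost = {"src": 120, "fir": 540, "fft": 890, "sink": 60}  # one profiling run

def estimate(partition):
    # partition: core id -> list of actors mapped to it. A pipelined
    # streaming app's throughput is bounded by its busiest core.
    return max(sum(actor_cost[a] for a in actors) for actors in partition.values())

p1 = {0: ["src", "fir"], 1: ["fft", "sink"]}
p2 = {0: ["src", "fft"], 1: ["fir", "sink"]}
print(estimate(p1), estimate(p2))  # 950 1010 -> the first mapping balances better
```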
Abstract:
Emissions of coal combustion fly ash through real-scale electrostatic precipitators (ESPs) were studied under different coal combustion and operation conditions. Sub-micron fly-ash aerosol emissions from a power plant boiler and the ESP were determined, and from these the aerosol penetration, based on electrical mobility measurements, thereby giving an estimate of the size and the maximum extent to which small particles can escape. The experiments indicate a maximum penetration of 4% to 20% of the small particles, counted on a number basis instead of the normally used mass basis, while the ESP is simultaneously operating at nearly 100% collection efficiency on a mass basis. Although the size range as such appears independent of the coal, of the boiler and even of the device used for emission control, the maximum penetration level on a number basis depends on the ESP operating parameters. The measured emissions were stable during stable boiler operation for a given fired coal, and the emissions of different coals differed from each other, indicating that the sub-micron size distribution of the fly ash could be used as a specific characteristic for recognition, for instance for authenticity, provided there is an indication of known stable operation. Consequently, the results on the emissions suggest an optimum particle size range for environmental monitoring with respect to the probability of finding traces in the samples. The current work also embodies an authentication system for aerosol samples for post-inspection from any macroscopic sample piece. The system can comprise newly introduced devices, for mutually independent use or in combination with each other, arranged to extend the sampling operation length and/or the tag selection diversity. The tag for the samples can be based on naturally occurring measures and/or added measures of authenticity in a suitable combination. The method involves not only military-related applications but also those in civil industries. Besides samples, the system can be applied to ink for printing banknotes or other papers of monetary value, and to filter manufacturing for marking fibrous filters.