Abstract:
Aspect orientation is an important approach to address the complexity of cross-cutting concerns in Information Systems. This approach encapsulates these concerns separately and composes them with the main module when needed. Although different works have shown how this separation should be performed in process models, their composition remains an open area. In this paper, we demonstrate the semantics of a service which enables this composition. The result can also be used as a blueprint to implement the service to support aspect orientation in the Business Process Management area.
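To illustrate the kind of composition such a service performs, the minimal sketch below weaves a cross-cutting concern (logging) around activities of a main process at join points matched by a pointcut. All names and the weaving mechanism are illustrative assumptions, not the service semantics defined in the paper.

```python
# Minimal sketch of aspect composition: advice encapsulating a
# cross-cutting concern is woven into a main process at join points.
# Names and mechanism are illustrative only.

def logging_advice(activity):
    """Cross-cutting concern kept separate from the main process."""
    def wrapped(*args, **kwargs):
        print(f"before: {activity.__name__}")
        result = activity(*args, **kwargs)
        print(f"after: {activity.__name__}")
        return result
    return wrapped

def compose(process, pointcut, advice):
    """Weave the advice around every activity matched by the pointcut."""
    return [advice(a) if pointcut(a) else a for a in process]

def check_credit(): pass
def approve_order(): pass

main_process = [check_credit, approve_order]
woven = compose(main_process,
                lambda a: a.__name__ == "approve_order",
                logging_advice)
for activity in woven:
    activity()
```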
Abstract:
The most common software analysis tools available for measuring fluorescence images are for two-dimensional (2D) data that rely on manual settings for inclusion and exclusion of data points, and on computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, which then provide a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed beyond the approximation and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are composed of dendrites, axons and spines (a tree-like structure). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell. This was a scientific project with the Eidgenössische Technische Hochschule, developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous, because ideally it builds the cell surface without void spaces. To our knowledge, at present no user-modifiable automated approach has been developed that provides morphometric information from 3D fluorescence images and achieves cellular spatial information of an undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.).
These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.
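The platform itself is built on Imaris XT and MATLAB; purely as an illustration of the underlying idea of measuring amorphous 3D structures without a predefined shape, a minimal Python sketch using scipy.ndimage might look as follows. The threshold and synthetic volume are assumptions, not the authors' pipeline.

```python
# Illustrative sketch (not the authors' Imaris XT/MATLAB pipeline):
# segment an amorphous 3D fluorescence volume and measure each
# connected component without assuming a predefined cell shape.
import numpy as np
from scipy import ndimage

volume = np.random.rand(64, 64, 32)   # stand-in for a confocal stack
mask = volume > 0.95                  # intensity threshold (assumed)

labels, n = ndimage.label(mask)       # connected components in 3D
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))      # voxel counts
centroids = ndimage.center_of_mass(mask, labels, index=range(1, n + 1))

for i, (size, centroid) in enumerate(zip(sizes, centroids), start=1):
    print(f"object {i}: {int(size)} voxels, centroid {centroid}")
```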
Abstract:
Effective risk management is crucial for any organisation. One of its key steps is risk identification, but few tools exist to support this process. Here we present a method for the automatic discovery of a particular type of process-related risk, the danger of deadline transgressions or overruns, based on the analysis of event logs. We define a set of time-related process risk indicators, i.e., patterns observable in event logs that highlight the likelihood of an overrun, and then show how instances of these patterns can be identified automatically using statistical principles. To demonstrate its feasibility, the approach has been implemented as a plug-in module to the process mining framework ProM and tested using an event log from a Dutch financial institution.
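The paper's indicators are defined over event logs and instantiated using statistical principles; as a toy illustration of that idea (not the authors' definitions), the sketch below flags in-progress cases whose elapsed time already exceeds a percentile of historical case durations. Field names and the threshold are assumptions.

```python
# Toy illustration of a time-related process risk indicator: flag
# running cases whose elapsed time exceeds the 90th percentile of
# completed-case durations. Names and threshold are assumptions.
import statistics

completed_durations = [3.2, 4.1, 2.8, 5.0, 3.9, 4.4, 6.1, 3.5]  # days

# statistics.quantiles with n=10 returns nine cut points; the last
# one approximates the 90th percentile.
threshold = statistics.quantiles(completed_durations, n=10)[-1]

running_cases = {"case_17": 5.8, "case_23": 2.1}  # elapsed days so far
at_risk = [c for c, t in running_cases.items() if t > threshold]
print(f"overrun risk (> {threshold:.1f} days): {at_risk}")
```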
Abstract:
Food insecurity is the limited access to, or availability of, nutritious, culturally appropriate and safe foods, or the inability to access these foods by socially acceptable means. In Australia, the monitoring of food insecurity is limited to the use of a single item included in the three-yearly National Health Survey (NHS). The current research comprised a) a review of the literature and available tools to measure food security, b) piloting and adaptation of the more comprehensive 16-item United States Department of Agriculture (USDA) Food Security Survey Module (FSSM), and c) a cross-sectional study comparing this more comprehensive tool, and its 10- and 6-item short forms, with the current single item used in the NHS, among a sample of households in disadvantaged urban areas of Brisbane, Australia. Findings have shown that, internationally, the 16-item USDA-FSSM is the most widely used tool for the measurement of food insecurity. Furthermore, among the validated tools that exist to measure food insecurity, sensitivity and reliability decline as the number of questions in a tool decreases. Among an Australian sample, the current single measure utilised in the NHS yielded a significantly lower prevalence of food insecurity than the 16-item USDA-FSSM and its two shorter forms (four and two percentage points lower, respectively). These findings suggest that the current prevalence of food insecurity (estimated at 6% in the most recent NHS) may have been underestimated, and they have important implications for the development of an effective means of monitoring food security within the context of a developed country.
Abstract:
Objective: Food insecurity may be associated with a number of adverse health and social outcomes; however, our knowledge of its public health significance in Australia has been limited by the use of a single-item measure in the Australian National Health Surveys (NHS) and, more recently, the exclusion of food security items from these surveys. The current study compares prevalence estimates of food insecurity in disadvantaged urban areas of Brisbane using the one-item NHS measure with three adaptations of the United States Department of Agriculture Food Security Survey Module (USDA-FSSM). Design: Data were collected by postal survey (n = 505; 53% response). Food security status was ascertained by the measure used in the NHS and by the 6-, 10- and 18-item versions of the USDA-FSSM. Demographic characteristics of the sample, prevalence estimates of food insecurity and the different levels of food insecurity estimated by each tool were determined. Setting: Disadvantaged suburbs of Brisbane city, Australia, 2009. Subjects: Individuals aged ≥ 18 years. Results: Food insecurity was prevalent in socioeconomically disadvantaged urban areas, estimated as 19.5% using the single-item NHS measure. This was significantly less than the 24.6% (P < 0.01), 22.0% (P = 0.01) and 21.3% (P = 0.03) identified using the 18-item, 10-item and 6-item versions of the USDA-FSSM, respectively. The proportions of the sample reporting more severe levels of food insecurity were 10.7%, 10% and 8.6% for the 18-, 10- and 6-item USDA measures, respectively; however, this degree of food insecurity could not be ascertained using the NHS measure. Conclusions: The measure of food insecurity employed in the NHS may underestimate its prevalence and public health significance. Future monitoring and surveillance efforts should seek to employ a more accurate measure.
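For context on how the multi-item USDA tools yield severity levels, classification proceeds by counting affirmative responses against published cut-points. The sketch below uses the commonly cited cut-points for the 6-item short form (0-1 high/marginal, 2-4 low, 5-6 very low food security); these are assumptions here and should be verified against the official USDA documentation.

```python
# Sketch of raw-score classification for the 6-item USDA-FSSM short
# form, using commonly cited cut-points (verify against the official
# USDA documentation before use).
def classify_6item(affirmative_count: int) -> str:
    if not 0 <= affirmative_count <= 6:
        raise ValueError("6-item raw score must be between 0 and 6")
    if affirmative_count <= 1:
        return "high or marginal food security"
    if affirmative_count <= 4:
        return "low food security"
    return "very low food security"

print(classify_6item(3))  # -> "low food security"
```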
Abstract:
This tutorial is designed to help new users become familiar with using the PicoBlaze microcontroller with the Spartan-3E board. The tutorial gives a brief introduction to the PicoBlaze microcontroller, and then steps through the following: - Writing a small PicoBlaze assembly language (.psm) file, and stepping through the process of assembling the .psm file using KCPSM3; - Writing a top level VHDL module to connect the PicoBlaze microcontroller (KCPSM3 component) and the program ROM, and to connect the required input and output ports; - Connecting the top level module inputs and outputs to the switches, buttons and LEDs on the Spartan-3E board; - Downloading the program to the Spartan-3E board using the Project Navigator software.
Abstract:
Modern applications comprise multiple components, such as browser plug-ins, often of unknown provenance and quality. Statistics show that failure of such components accounts for a high percentage of software faults. Enabling isolation of such fine-grained components is therefore necessary to increase the robustness and resilience of security-critical and safety-critical computer systems. In this paper, we evaluate whether such fine-grained components can be sandboxed through the use of the hardware virtualization support available in modern Intel and AMD processors. We compare the performance and functionality of such an approach to two previous software-based approaches. The results demonstrate that hardware isolation minimizes the difficulties encountered with software-based approaches, while also reducing the size of the trusted computing base, thus increasing confidence in the solution's correctness. We also show that our relatively simple implementation has equivalent run-time performance, with overheads of less than 34%, does not require custom tool chains and provides enhanced functionality over software-only approaches, confirming that hardware virtualization technology is a viable mechanism for fine-grained component isolation.
Abstract:
The use of the Trusted Platform Module (TPM) is becoming increasingly popular in many security systems. To access objects protected by the TPM (such as cryptographic keys), several cryptographic protocols, such as the Object Specific Authorization Protocol (OSAP), can be used. Given the sensitivity and the importance of the objects protected by the TPM, the security of this protocol is vital. Formal methods allow a precise and complete analysis of cryptographic protocols, such that their security properties can be asserted with high assurance. Unfortunately, formal verification of these protocols is limited, despite the abundance of formal tools that one can use. In this paper, we demonstrate the use of Coloured Petri Nets (CPNs), a type of formal technique, to formally model the OSAP. Using this model, we then verify the authentication property of this protocol using the state space analysis technique. The results of the analysis demonstrate that, as reported by Chen and Ryan, the authentication property of OSAP can be violated.
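State space analysis enumerates all reachable markings of the model and checks properties on them. As a toy illustration of the technique (not the authors' CPN model of OSAP), the sketch below exhaustively explores a small transition system in which an attacker can replay a stale message, and reports terminal states violating the authentication property. The transition system is entirely hypothetical.

```python
# Toy illustration of state space analysis (not the authors' CPN model
# of OSAP): enumerate all reachable states of a transition system and
# report those violating the desired authentication property.
from collections import deque

def transitions(state):
    """Hypothetical protocol step function: state -> successor states."""
    phase, authenticated = state
    if phase == "init":
        yield ("challenge", False)
    elif phase == "challenge":
        yield ("response", True)    # honest run
        yield ("response", False)   # attacker replays a stale message
    elif phase == "response":
        yield ("done", authenticated)

def explore(initial):
    """Breadth-first enumeration of the full reachable state space."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for successor in transitions(state):
            if successor not in seen:
                seen.add(successor)
                queue.append(successor)
    return seen

states = explore(("init", False))
violations = [s for s in states if s[0] == "done" and not s[1]]
print("authentication violated in states:", violations)
```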
Abstract:
The residence time distribution (RTD) is a crucial parameter when treating engine exhaust emissions with a Dielectric Barrier Discharge (DBD) reactor. In this paper, the residence time of such a reactor is investigated using finite element-based software, COMSOL Multiphysics 4.3. Non-thermal plasma (NTP) discharge is being introduced as a promising method for pollutant emission reduction, and DBD is one of the most advantageous NTP technologies. In a two-cylinder co-axial DBD reactor, tubes are placed between two electrodes and flow passes through the annulus between these barrier tubes. If the mean residence time increases in a DBD reactor, there will be a corresponding increase in reaction time and, consequently, the pollutant removal efficiency can increase. However, pollutant formation can also occur during an increased mean residence time, and so the proportion of fluid that may remain for periods significantly longer than the mean residence time is of great importance. In this study, first, the residence time distribution is calculated for the standard reactor used by the authors for ultrafine particle (10-500 nm) removal. Then, different geometries and various inlet velocities are considered. Finally, for selected cases, roughness elements are added inside the reactor and the residence time is recalculated. These results will form the basis for a COMSOL plasma and CFD module investigation.
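For readers unfamiliar with RTD analysis, the distribution E(t) is typically obtained by normalising a tracer concentration curve C(t), with mean residence time t_mean = integral of t E(t) dt. The sketch below computes E(t) and its first two moments from a synthetic tracer curve; it illustrates the textbook definitions only, not the paper's COMSOL results.

```python
# Sketch of residence time distribution (RTD) analysis from a tracer
# pulse: E(t) = C(t) / ∫C dt, t_mean = ∫ t E(t) dt. Synthetic data;
# not the paper's COMSOL results.
import numpy as np

t = np.linspace(0.0, 10.0, 200)                # time (s)
c = t * np.exp(-t)                             # synthetic tracer curve

E = c / np.trapz(c, t)                         # normalise to unit area
t_mean = np.trapz(t * E, t)                    # mean residence time
variance = np.trapz((t - t_mean) ** 2 * E, t)  # spread about the mean

print(f"mean residence time ≈ {t_mean:.2f} s, variance ≈ {variance:.2f} s^2")
```

The variance matters here because, as noted above, it is the long tail of E(t), fluid remaining far longer than the mean, that governs unwanted pollutant formation.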
Abstract:
In 2008 a move away from medical staff providing nursing education in Vietnam saw the employment of many new nurse academics. To assist in the instruction of these novice academics and provide them with sound teaching and learning practice as well as curriculum design and implementation skills, Queensland University of Technology (QUT) successfully tendered for an international grant. One of QUT's initiatives in educating the Vietnamese academics was a distance learning programme. Developed specifically for Vietnamese nurse academics, the programme was designed for Australian-based delivery to academics in Vietnam. This paper will present an overview of why four separate modules were utilised for the delivery of content (modules were delivered at a rate of one per semester). It will address the bilingual online discussion boards used in each of the modules, and the process of moderating these given that comments were posted in both Vietnamese and English. It will describe how content was scaffolded across the four modules and how the modules themselves modelled new teaching delivery strategies. Lastly, it will discuss the considerations of programme delivery given the logistics of an Australian-based delivery. Feedback from the Vietnamese nurse academics across their involvement in the programme (and at the conclusion of their fourth and final module) has been overwhelmingly positive. Feedback suggests the programme has altered the teaching and assessment approaches used by some Vietnamese nurse academics. Additionally, Vietnamese nurse academics report that they are engaging more with the application of their content, indicating a cultural shift in the approach taken in Vietnamese nurse education.
Abstract:
Background: Extracorporeal membrane oxygenation (ECMO) is a complex rescue therapy used to provide cardiac and/or respiratory support for critically ill patients who have failed maximal conventional medical management. ECMO is based on a modified cardiopulmonary bypass (CPB) circuit, and can provide cardiopulmonary support for up to several months. It can be used in a veno-venous configuration (VV-ECMO) for isolated respiratory failure, or in a veno-arterial configuration (VA-ECMO) where support is necessary for cardiac +/- respiratory failure. The ECMO circuit consists of five main components: large-bore access cannulae for drainage of the venous system, with return cannulae to either the venous (in VV-ECMO) or arterial (in VA-ECMO) system; an oxygenator with a vast surface area of hollow filaments, which allows addition of oxygen and removal of carbon dioxide; a centrifugal blood pump, which propels blood through the circuit at up to 10 L/minute; a control module; and a thermoregulatory unit, which allows exact temperature control of the extracorporeal blood. Methods: The first successful use of ECMO for ARDS in adults occurred in 1972, and its use has become more commonplace over the last 30 years, supported by improvements in the design and biocompatibility of the equipment, which have reduced the morbidity associated with this modality. Whilst the use of ECMO in the neonatal population has been supported by numerous studies, the evidence upon which ECMO was integrated into adult practice was substantially less robust. Results: Recent data, including the CESAR study (Conventional ventilatory support versus extracorporeal membrane oxygenation for severe respiratory failure), have added a degree of evidence to the role of ECMO in such a patient population. The CESAR study analysed 180 patients and confirmed that ECMO was associated with an improved rate of survival. More recently, ECMO has been utilized in numerous situations within the critical care area, including support for high-risk percutaneous interventions in the cardiac catheter lab, the operating room and the emergency department, as well as in specialized inter-hospital retrieval services. The increased understanding of the risk:benefit profile of ECMO, along with a reduction in the morbidity associated with its use, will doubtless lead to a substantial rise in the utilisation of this modality. As with all extracorporeal circuits, ECMO opposes the basic premises of the mammalian inflammation and coagulation cascades: when blood comes into contact with a foreign circuit, both of these cascades are activated. Anticoagulation is readily dealt with through the use of agents such as heparin, but the inflammatory excess, whilst less macroscopically obvious, continues unabated. Platelet consumption and neutrophil activation occur rapidly, and the clinician is faced with balancing the need for anticoagulation of the circuit against haemostasis in an acutely bleeding patient. Alterations in pharmacokinetics may result in inadequate levels of disease-modifying therapeutics, such as antibiotics, hence paradoxically delaying recovery from conditions such as pneumonia. Key elements of nutrition and the innate immune system may similarly be affected. Summary: This presentation will discuss the basic features of ECMO for the non-specialist, and review the clinical conundrum faced by the team treating these most complex cases.
Abstract:
Our daily lives are becoming more and more dependent upon smartphones due to their increased capabilities. Smartphones are used in various ways, e.g. for payment systems or for assisting the lives of elderly or disabled people. Security threats to these devices are becoming more and more dangerous, since there is still a lack of proper security tools for protection. Android emerges as an open smartphone platform which allows modification even at the operating system level, and where third-party developers for the first time have the opportunity to develop kernel-based low-level security tools. Android quickly gained popularity among smartphone developers and even beyond, since it is based on Java on top of "open" Linux, in comparison to former proprietary platforms with very restrictive SDKs and corresponding APIs. Symbian OS, holding the greatest market share among all smartphone OSs, even closed critical APIs to common developers and introduced application certification, since this OS was the main target for smartphone malware in the past. In fact, more than 290 malware variants designed for Symbian OS appeared from July 2004 to July 2008. Android, in turn, promises to be completely open source. Together with the Linux-based smartphone OS OpenMoko, open smartphone platforms may attract malware writers to create malicious applications endangering critical smartphone applications and owners' privacy. Since signature-based approaches mainly detect known malware, anomaly-based approaches can be a valuable addition to these systems. They are based on mathematical algorithms processing data that describe the state of a certain device. For gaining these data, a monitoring client is needed that extracts usable information (features) from the monitored system. Our approach follows a dual system for analyzing these features. On the one hand, functionality for on-device light-weight detection is provided. But since most algorithms are resource exhaustive, remote feature analysis is provided on the other hand. Having this dual system enables event-based detection that can react to the current detection need. In our ongoing research we aim to investigate the feasibility of light-weight on-device detection for certain occasions. On other occasions, whenever significant changes are detected on the device, the system can trigger remote detection with heavy-weight algorithms for better detection results. In the absence of the server, or as a supplementary approach, we also consider a collaborative scenario. Here, mobile devices sharing a common objective are enabled by a collaboration module to share information, such as intrusion detection data and results. This is based on an ad-hoc network mode that can be provided by the WiFi or Bluetooth adapter nearly every smartphone possesses.
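A minimal sketch of the dual scheme described above, assuming a simple z-score anomaly measure over monitored features: cheap scoring runs on the device, and only significant deviations trigger the (here, stubbed-out) remote heavy-weight analysis. Feature names, baseline statistics and the trigger threshold are all illustrative assumptions.

```python
# Minimal sketch of the dual detection scheme: light-weight on-device
# scoring, with remote heavy-weight analysis triggered only when a
# feature deviates significantly. All names/thresholds are assumptions.
baseline = {"sms_per_hour": (2.0, 1.0),      # feature: (mean, std dev)
            "bytes_sent_kb": (150.0, 40.0)}

def on_device_score(features):
    """Cheap z-score check suitable for resource-limited devices."""
    return max(abs(features[k] - mean) / std
               for k, (mean, std) in baseline.items())

def remote_analysis(features):
    """Placeholder for a heavy-weight server-side algorithm."""
    print("uploading features for remote detection:", features)

observed = {"sms_per_hour": 9.5, "bytes_sent_kb": 160.0}
if on_device_score(observed) > 3.0:          # assumed trigger threshold
    remote_analysis(observed)
```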
Abstract:
Advances in solid-state switches and power electronics techniques have led to the development of compact, efficient and more reliable pulsed power systems. Although the power rating and operating speed of new solid-state switches have increased considerably, their low blocking voltage limits pulsed power operation. This paper demonstrates the advantages of parallel and series configurations of pulsed power modules in obtaining high voltage levels with fast rise times (dv/dt) using only conventional switches. The proposed configuration is based on two flyback modules. The effectiveness of the proposed approach is verified by numerical simulations, and the advantages of each configuration are indicated in comparison with a single module.
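To make the configuration trade-off concrete: with n identical modules each rated at V volts and I amps, a series stack ideally delivers n·V at I, while a parallel bank delivers V at n·I. The sketch below illustrates this ideal arithmetic only, assuming lossless, perfectly synchronised modules; real flyback modules deviate from it.

```python
# Idealised illustration of stacking pulsed power modules: series adds
# voltage, parallel adds current. Assumes lossless, perfectly
# synchronised modules; real flyback modules deviate from this.
def series(modules):
    return sum(v for v, _ in modules), min(i for _, i in modules)

def parallel(modules):
    return min(v for v, _ in modules), sum(i for _, i in modules)

module = (1_000.0, 50.0)    # (blocking voltage V, pulse current A)
bank = [module] * 4

print("series:   %.0f V, %.0f A" % series(bank))    # 4000 V, 50 A
print("parallel: %.0f V, %.0f A" % parallel(bank))  # 1000 V, 200 A
```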
Abstract:
For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high-quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step, but rather require the computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence in interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, approximation of φ(A)b, where φ(z) = (e^z - 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations. The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators, not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations in which typical dual-scale mechanisms occur.
In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
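As a concrete, small-scale illustration of the exponential Euler step y_{n+1} = y_n + h φ(hJ) f(y_n), with φ(z) = (e^z - 1)/z as defined above, the sketch below evaluates φ(hJ)b directly via the identity φ(A)b = A^{-1}(e^A - I)b. This dense evaluation is for clarity only; the thesis's variable-stepsize Krylov algorithm approximates this product without ever forming the matrix exponential.

```python
# Dense small-scale illustration of the exponential Euler method (EEM):
# y_{n+1} = y_n + h * phi(h*J) * f(y_n), with phi(z) = (exp(z) - 1)/z.
# A Krylov method would approximate phi(hJ)b without forming expm(hJ);
# here we form it explicitly for clarity.
import numpy as np
from scipy.linalg import expm, solve

def phi_times(A, b):
    """phi(A) @ b = A^{-1} (e^A - I) b, valid for nonsingular A."""
    return solve(A, (expm(A) - np.eye(A.shape[0])) @ b)

J = np.array([[-2.0, 1.0],
              [1.0, -3.0]])          # Jacobian of a stiff linear test ODE
f = lambda y: J @ y                  # linear RHS: EEM is exact here
y, h = np.array([1.0, 0.5]), 0.1

for _ in range(10):                  # ten exponential Euler steps to t = 1
    y = y + h * phi_times(h * J, f(y))

print(y)                             # matches the exact solution below
print(expm(1.0 * J) @ np.array([1.0, 0.5]))
```

Because the test problem is linear, each EEM step reduces to y_{n+1} = e^{hJ} y_n, so the ten-step result agrees with the exact flow; for the nonlinear TransPore equations the method is no longer exact, which is where the error control discussed above matters.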
Abstract:
This paper proposes a unique and innovative approach to integrating transit signal priority control into a traffic-adaptive signal control strategy. The proposed strategy was named OSTRAC (Optimized Strategy for integrated TRAffic and TRAnsit signal Control). The cornerstones of OSTRAC are an online microscopic traffic flow prediction model and a Genetic Algorithm (GA) based traffic signal timing module. A sensitivity analysis was conducted to determine the critical GA parameters. The developed traffic flow model demonstrated reliable prediction results in testing. OSTRAC was evaluated by comparing its performance to three other signal control strategies. The evaluation results revealed that OSTRAC efficiently and effectively reduced the delay time of both general traffic and transit vehicles.
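As a toy illustration of GA-based signal timing (not OSTRAC's actual module), the sketch below evolves the green split of a fixed-cycle, two-phase intersection to minimise a crude delay proxy. The delay model and the GA parameters (population size, generations, mutation scale) are assumptions chosen only to show the mechanics of selection, crossover and mutation.

```python
# Toy GA for two-phase signal timing (not OSTRAC's actual module):
# evolve the main-street green share of a fixed cycle to minimise a
# crude delay proxy. Delay model and GA parameters are assumptions.
import random

CYCLE = 90.0                 # cycle length (s)
DEMAND = (0.7, 0.4)          # flow ratios for the two phases

def delay(green_share):
    """Crude proxy: demand divided by green time, summed over phases."""
    g1 = max(5.0, min(CYCLE - 5.0, green_share * CYCLE))
    g2 = CYCLE - g1
    return DEMAND[0] / g1 + DEMAND[1] / g2

def evolve(pop_size=30, generations=50, mutation=0.05):
    pop = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=delay)                   # fitness = lower delay
        parents = pop[: pop_size // 2]        # truncation selection
        children = [(random.choice(parents) + random.choice(parents)) / 2
                    + random.gauss(0, mutation)   # crossover + mutation
                    for _ in parents]
        pop = parents + children
    return min(pop, key=delay)

best = evolve()
print(f"best main-street green share: {best:.2f} ({best * CYCLE:.0f} s)")
```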