986 results for Sievert Pressure Absorption Desorption Hydrogen Volume Software Kinetics PCI Composition
Abstract:
A vertex-centred finite volume method (FVM) for the Cahn-Hilliard (CH) and recently proposed Cahn-Hilliard-reaction (CHR) equations is presented. Information at control volume faces is computed using a high-order least-squares approach based on Taylor series approximations. This least-squares problem explicitly includes the variational boundary condition (VBC), ensuring that the discrete equations satisfy all of the boundary conditions. We use this approach to solve the CH and CHR equations in one and two dimensions and show that our scheme satisfies the VBC to at least second order. For the CH equation we show evidence of conservative, gradient-stable solutions; for the CHR equation, however, strict gradient stability is more challenging to achieve.
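To make the setting concrete, the sketch below is a minimal, hedged illustration of a finite volume treatment of the Cahn-Hilliard equation in one dimension: a cell-centred grid, explicit Euler time stepping, and zero-flux boundaries. It is not the vertex-centred, least-squares scheme of the abstract, and the parameter values (gamma, mobility, grid size, time step) are arbitrary choices for demonstration; the conserved total mass is printed as a basic sanity check.

```cpp
// Minimal 1D Cahn-Hilliard sketch (illustrative only, not the paper's
// vertex-centred least-squares FVM): cell-centred control volumes, explicit
// Euler time stepping, zero-flux boundaries via mirrored ghost cells.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const int N = 64;               // number of control volumes on [0, 1]
    const double h = 1.0 / N;       // cell width
    const double gamma = 1e-3;      // interface energy parameter (assumed)
    const double M = 1.0;           // mobility (assumed)
    const double dt = 1e-7;         // explicit step, well below the 4th-order limit
    const int steps = 20000;
    const double PI = std::acos(-1.0);

    std::vector<double> c(N), mu(N);
    for (int i = 0; i < N; ++i)     // small perturbation about c = 0
        c[i] = 0.1 * std::cos(2.0 * PI * (i + 0.5) * h);

    // Discrete Laplacian with zero-flux (mirrored) boundary treatment.
    auto lap = [&](const std::vector<double>& f, int i) {
        const double fm = (i == 0)     ? f[0]     : f[i - 1];
        const double fp = (i == N - 1) ? f[N - 1] : f[i + 1];
        return (fp - 2.0 * f[i] + fm) / (h * h);
    };

    double mass0 = 0.0;
    for (double ci : c) mass0 += ci * h;

    for (int n = 0; n < steps; ++n) {
        // chemical potential: mu = f'(c) - gamma * c_xx, with f(c) = (c^2 - 1)^2 / 4
        for (int i = 0; i < N; ++i)
            mu[i] = c[i] * c[i] * c[i] - c[i] - gamma * lap(c, i);
        std::vector<double> cn(N);
        // conservative update: c_t = div(M grad mu)
        for (int i = 0; i < N; ++i)
            cn[i] = c[i] + dt * M * lap(mu, i);
        c.swap(cn);
    }

    double mass1 = 0.0;
    for (double ci : c) mass1 += ci * h;
    std::printf("mass before: %.3e  after: %.3e (should match)\n", mass0, mass1);
    return 0;
}
```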
Abstract:
OBJECTIVE: Acute traumatic coagulopathy occurs early in hemorrhagic trauma and is a major contributor to mortality and morbidity. Our aim was to examine the effect of small-volume 7.5% NaCl adenocaine (adenosine and lidocaine) and Mg on hypotensive resuscitation and coagulopathy in a rat model of severe hemorrhagic shock. DESIGN: Prospective randomized laboratory investigation. SUBJECTS: A total of 68 male Sprague-Dawley rats. INTERVENTION: Post-hemorrhagic shock treatment for acute traumatic coagulopathy. MEASUREMENTS AND METHODS: Nonheparinized male Sprague-Dawley rats (300-450 g, n = 68) were randomly assigned to one of five groups: 1) untreated; 2) 7.5% NaCl; 3) 7.5% NaCl adenocaine; 4) 7.5% NaCl Mg; or 5) 7.5% NaCl adenocaine/Mg. Hemorrhagic shock was induced by phlebotomy to a mean arterial pressure of 35-40 mm Hg for 20 mins (~40% blood loss), and animals were left in shock for 60 mins. A bolus (0.3 mL) was injected into the femoral vein and hemodynamics were monitored. Blood was collected in Na citrate (3.2%) tubes and centrifuged, and the plasma was snap frozen in liquid N2 and stored at -80°C. Coagulation was assessed using activated partial thromboplastin times and prothrombin times. RESULTS: Small-volume 7.5% NaCl adenocaine and 7.5% NaCl adenocaine/Mg were the only two groups that gradually increased mean arterial pressure 1.6-fold, from 38-39 mm Hg to 52 and 64 mm Hg, respectively, at 60 mins (p < .05). Baseline plasma activated partial thromboplastin time was 17 ± 0.5 secs and increased to 63 ± 21 secs after the bleed and 217 ± 32 secs after 60-min shock. At 60-min resuscitation, activated partial thromboplastin time values for untreated, 7.5% NaCl, 7.5% NaCl/Mg, and 7.5% NaCl adenocaine rats were 269 ± 31 secs, 262 ± 38 secs, 150 ± 43 secs, and 244 ± 38 secs, respectively. In contrast, activated partial thromboplastin time for 7.5% NaCl adenocaine/Mg was 24 ± 2 secs (p < .05). Baseline prothrombin time was 28 ± 0.8 secs (n = 8) and followed a similar pattern of correction. CONCLUSIONS: Plasma activated partial thromboplastin time and prothrombin time increased over 10-fold during the bleed and shock periods prior to resuscitation, and a small-volume (~1 mL/kg) IV bolus of 7.5% NaCl adenocaine/Mg was the only treatment that raised mean arterial pressure into the permissive range and returned activated partial thromboplastin time and prothrombin time clotting times to baseline at 60 mins.
Abstract:
The most common software analysis tools available for measuring fluorescence images handle two-dimensional (2D) data, rely on manual settings for inclusion and exclusion of data points, and use computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed beyond the approximations and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of complex changes in cell morphology, protein localization and receptor trafficking. Current tools available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the MeasurementPro feature, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are composed of dendrites, axons and spines (a tree-like structure). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell. This module, developed in a scientific project with the Eidgenössische Technische Hochschule, calculates the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be used to analyze fluorescence data that are not continuous, because it builds the cell surface without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images and captures cellular spatial information for objects of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to quantify morphological changes in cell dynamics.
Abstract:
This is volume 1 in a series of four volumes about the origins of Australian football as it evolved in Victoria between 1858 and 1896. This volume addresses its very beginnings as an amateur sport and the rise of the first clubs. Invented by a group of Melbourne cricketers and sports enthusiasts, Australian Rules football was developed through games played on Melbourne's park lands and was originally known as "Melbourne Football Club Rules". This formative period of the game saw the birth of the first 'amateur heroes' of the game. Players such as T.W. Wills, H.C.A. Harrison, Jack Conway, George O'Mullane and Robert Murray Smith emerged as warriors engaged in individual rugby-type scrimmages. The introduction of Challenge Cups was an important spur for this burgeoning sport. Intense competition and growing rivalries between clubs such as Melbourne, South Yarra, Royal Park, and Geelong began to flourish and the game developed as a result. By the 1870s the game "Victorian Rules" had become the most popular outdoor winter sport across the state. In subsequent decades, rapid growth in club football occurred and the game attracted increasing media attention.
Abstract:
Purpose: To assess the effects of pre-cooling volume on neuromuscular function and performance in free-paced intermittent-sprint exercise in the heat. Methods: Ten male team-sport athletes completed four randomized trials involving an 85-min free-paced intermittent-sprint exercise protocol at 33°C and 33% relative humidity. Pre-cooling conditions included whole body (WB), head + hand (HH), head (H) and no cooling (CONT), applied for 20 min pre-exercise and 5 min mid-exercise. Maximal voluntary contractions (MVC) were assessed pre- and post-intervention and mid- and post-exercise. Exercise performance was assessed with sprint times, % decline and distances covered during free-paced bouts. Measures of core (Tc) and skin (Tsk) temperatures, heart rate, perceptual exertion and thermal stress were monitored throughout. Venous and capillary blood was analyzed for metabolite, muscle damage and inflammatory markers. Results: WB pre-cooling facilitated the maintenance of sprint times during the exercise protocol with reduced % decline (P=0.04). Mean and total hard-running distances increased with pre-cooling by 12% compared to CONT (P<0.05); specifically, WB was 6-7% greater than HH (P=0.02) and H (P=0.001), respectively. No change was evident in mean voluntary or evoked force pre- to post-exercise with WB and HH cooling (P>0.05). WB and HH cooling reduced Tc by 0.1-0.3°C compared to the other conditions (P<0.05). WB Tsk was suppressed for the entire session (P=0.001). Heart rate responses following WB cooling were reduced (P=0.05; d=1.07) compared to CONT during exercise. Conclusion: A relationship between pre-cooling volume and exercise performance seems apparent, as larger surface-area coverage augmented subsequent free-paced exercise capacity, in conjunction with greater suppression of physiological load. Maintenance of MVC with pre-cooling, despite increased work output, suggests a role for centrally mediated mechanisms in exercise pacing regulation and subsequent performance.
Abstract:
Purpose: Endotracheal suctioning causes significant lung derecruitment. Closed suction (CS) minimizes lung volume loss during suction, and therefore volumes are presumed to recover more quickly post-suctioning. Conflicting evidence exists regarding this. We examined the effects of open suction (OS) and CS on lung volume loss during suctioning, and on recovery of end-expiratory lung volume (EELV) up to 30 minutes post-suction. Material and Methods: Randomized crossover study examining 20 patients post-cardiac surgery. CS and OS were performed in random order, 30 minutes apart. Lung impedance was measured during suction, and end-expiratory lung impedance was measured at baseline and post-suctioning using electrical impedance tomography. Oximetry, the partial pressure of oxygen in the alveoli/fraction of inspired oxygen ratio, and compliance were also collected. Results: Reductions in lung impedance during suctioning were smaller for CS than for OS (mean difference, −905 impedance units; 95% confidence interval [CI], −1234 to −587; P < .001). However, at all points post-suctioning, EELV recovered more slowly after CS than after OS. There were no statistically significant differences in the other respiratory parameters. Conclusions: Closed suctioning minimized lung volume loss during suctioning but, counterintuitively, resulted in slower recovery of EELV post-suction compared with OS. Therefore, the use of CS cannot be assumed to be protective of lung volumes post-suctioning. Consideration should be given to restoring EELV after either suction method via a recruitment maneuver.
Abstract:
Conference curatorial outline: The focus of this symposium was to question whether interior design is changing relative to local conditions, and what effect globalization has on the performance of regional, particularly Southern hemisphere, identities. The intention was to understand how theory and practice are transposed to 'distant lands', and how ideas shift from one place to another. To this extent the symposium invited papers on the export, translation and adoption of theories and practices of interior design to differing climates, cultures, and landscapes. This process, sometimes referred to as a shift from 'the centre to the margins', sought new perspectives on the adoption of European and US design ideas abroad, as well as their return to their place of origin. Papers were invited from a range of perspectives, including the export of ideas/attitudes to interior spaces, the history of interior spaces abroad, and the adaptation of ideas/processes to new conditions. Paralleling this trafficking of ideas are broader observations about interior space that emerge through the specificity of place. These include new and emerging directions and differences in our understanding of interiority, both real and virtual, and an ever-changing relationship to city, suburb and country. Keeping within the symposium theme, the intention was to examine other places, particularly those on the margins of the discipline's domain. Semantic slippage aside, there is a range of approaches that engage outside events and practices, enabling a transdisciplinary practice that draws from other philosophical and theoretical frameworks. Moreover, as the field expands and new territories are opened up, the virtual worlds of computer gaming, animation, and interactive environments both rely on and produce new forms of expression. This raises questions about the extent to which such spaces adopt or translate existing theory and practice, that is, the transposition from one area to another and their return to the discipline.
Abstract:
We consider the space fractional advection–dispersion equation, which is obtained from the classical advection–diffusion equation by replacing the spatial derivatives with a generalised derivative of fractional order. We derive a finite volume method that utilises fractionally-shifted Grünwald formulae for the discretisation of the fractional derivative, to numerically solve the equation on a finite domain with homogeneous Dirichlet boundary conditions. We prove that the method is stable and convergent when coupled with an implicit timestepping strategy. Results of numerical experiments are presented that support the theoretical analysis.
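As a concrete, hedged illustration of the shifted Grünwald formulae mentioned above (and not the paper's full finite volume scheme), the sketch below evaluates the Grünwald weights by their standard recurrence and applies the shift-one approximation to the left Riemann-Liouville derivative of u(x) = x², for which the exact value Γ(3)/Γ(3 − α) x^(2−α) is known; the fractional order, grid size, and test function are illustrative choices.

```cpp
// Illustrative sketch (not the paper's finite volume scheme): shifted
// Grünwald approximation of a left Riemann-Liouville fractional derivative.
// Weights: g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1)/k)  (= (-1)^k binom(alpha, k)).
// Shift-one formula: D^alpha u(x_i) ~= h^{-alpha} * sum_{k=0}^{i+1} g_k u(x_{i-k+1}).
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const double alpha = 1.8;                 // fractional order in (1, 2]
    const double x = 0.5;                     // evaluation point in (0, 1)
    const int N = 1 << 12;                    // grid intervals on [0, 1]
    const double h = 1.0 / N;
    const int i = static_cast<int>(std::lround(x / h));

    std::vector<double> g(i + 2);
    g[0] = 1.0;
    for (int k = 1; k <= i + 1; ++k)
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k);

    auto u = [](double s) { return s * s; };  // test function u(x) = x^2, u(0) = 0
    double approx = 0.0;
    for (int k = 0; k <= i + 1; ++k)
        approx += g[k] * u((i - k + 1) * h);
    approx /= std::pow(h, alpha);

    // Exact left Riemann-Liouville derivative of x^2: Gamma(3)/Gamma(3-alpha) x^(2-alpha)
    const double exact = std::tgamma(3.0) / std::tgamma(3.0 - alpha)
                         * std::pow(x, 2.0 - alpha);
    std::printf("approx = %.6f  exact = %.6f  |error| = %.2e\n",
                approx, exact, std::fabs(approx - exact));
    return 0;
}
```

Halving h should roughly halve the error, consistent with the first-order accuracy of the shifted Grünwald approximation on its own; the higher-order behaviour reported in such schemes comes from the full spatial and temporal discretisation.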
Abstract:
In this study, natural convection heat transfer and buoyancy-driven flows have been investigated in a right-angled triangular enclosure. The heater is located on the bottom wall, the inclined wall is cooled, and the remaining walls are maintained as adiabatic. The governing equations of natural convection are solved using a finite volume approach, in which buoyancy is modeled via the Boussinesq approximation (the standard relations are recalled below). The effects of different parameters such as Rayleigh number, aspect ratio, Prandtl number and heater location are considered. Results show that heat transfer increases when the heater is moved toward the right corner of the enclosure. It is also revealed that increasing the Rayleigh number strengthens the free convection regime and consequently increases the heat transfer rate. Moreover, enclosures with larger aspect ratios have larger Nusselt numbers. To provide better insight, streamlines and isotherms are also presented.
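For context, the textbook forms of the Boussinesq approximation and of the governing dimensionless groups are recalled below; the symbols (ρ₀ and T₀ reference density and temperature, β thermal expansion coefficient, ν kinematic viscosity, α thermal diffusivity, T_h and T_c hot- and cold-wall temperatures, L characteristic length) are standard choices, not notation taken from the paper.

```latex
% Boussinesq approximation: density variations enter only through the buoyancy term
\rho \approx \rho_0\left[\,1 - \beta\,(T - T_0)\,\right], \qquad
\mathbf{f}_{\mathrm{buoyancy}} = (\rho - \rho_0)\,\mathbf{g} = -\rho_0\,\beta\,(T - T_0)\,\mathbf{g}

% Governing dimensionless groups
\mathrm{Ra} = \frac{g\,\beta\,(T_h - T_c)\,L^{3}}{\nu\,\alpha}, \qquad
\mathrm{Pr} = \frac{\nu}{\alpha}
```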
Abstract:
The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centred algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure and readability, while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
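To illustrate the Rcpp/Armadillo workflow described above, the sketch below implements a basic linear-Gaussian Kalman filter in C++ with Armadillo and exposes it to R via Rcpp attributes. It is a minimal sketch, not the benchmark code from the paper; the interface, initial state, and state-space dimensions are illustrative assumptions.

```cpp
// kalman.cpp -- illustrative Rcpp/Armadillo sketch (not the paper's benchmark).
// Compile and load from R with: Rcpp::sourceCpp("kalman.cpp")
// [[Rcpp::depends(RcppArmadillo)]]
#include <RcppArmadillo.h>

// Basic linear-Gaussian Kalman filter: x_t = A x_{t-1} + w, y_t = C x_t + v.
// y holds one observation per column; returns the filtered state estimates.
// [[Rcpp::export]]
arma::mat kalman_filter(const arma::mat& y, const arma::mat& A,
                        const arma::mat& C, const arma::mat& Q,
                        const arma::mat& R) {
  const arma::uword n = A.n_rows, T = y.n_cols;
  arma::vec x = arma::zeros<arma::vec>(n);   // state estimate (starts at zero)
  arma::mat P = arma::eye<arma::mat>(n, n);  // state covariance
  arma::mat xs(n, T);                        // filtered states, one column per step
  for (arma::uword t = 0; t < T; ++t) {
    x = A * x;                               // predict
    P = A * P * A.t() + Q;
    arma::mat S = C * P * C.t() + R;         // innovation covariance
    arma::mat K = arma::solve(S, C * P).t(); // gain K = P C' S^{-1} (P symmetric)
    x += K * (y.col(t) - C * x);             // update
    P = (arma::eye<arma::mat>(n, n) - K * C) * P;
    xs.col(t) = x;
  }
  return xs;
}
```

From R, after Rcpp::sourceCpp("kalman.cpp"), the exported function can be called on ordinary numeric matrices, with conversions between R matrices and Armadillo objects handled automatically by RcppArmadillo.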
Abstract:
Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that uses such a technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by Stomp. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could instil similar positive effects.
Abstract:
Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the "gold standard" for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
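The per-beam combination step mentioned above amounts to a monitor-unit-weighted sum of the individual beam dose grids. The sketch below is a minimal, hypothetical illustration of that bookkeeping; it is not part of MCDTK, and the flattened-grid layout and normalisation to dose per monitor unit are assumptions made for the example.

```cpp
// Illustrative sketch only: combine per-beam Monte Carlo dose grids (stored
// flattened, normalised to dose per monitor unit) using the monitor units
// from the exported plan. Not part of MCDTK; layout and units are assumed.
#include <cstdio>
#include <stdexcept>
#include <vector>

// Weighted sum: D_total(voxel) = sum over beams b of MU_b * D_b(voxel)
std::vector<double> combine_beams(const std::vector<std::vector<double>>& dosePerMU,
                                  const std::vector<double>& monitorUnits) {
    if (dosePerMU.size() != monitorUnits.size() || dosePerMU.empty())
        throw std::invalid_argument("one dose grid per beam is required");
    std::vector<double> total(dosePerMU.front().size(), 0.0);
    for (std::size_t b = 0; b < dosePerMU.size(); ++b) {
        if (dosePerMU[b].size() != total.size())
            throw std::invalid_argument("dose grids must share one voxel grid");
        for (std::size_t v = 0; v < total.size(); ++v)
            total[v] += monitorUnits[b] * dosePerMU[b][v];
    }
    return total;
}

int main() {
    // Two toy beam grids (flattened, 4 voxels) with 100 and 50 monitor units.
    std::vector<std::vector<double>> beams = {{0.01, 0.02, 0.02, 0.01},
                                              {0.03, 0.01, 0.00, 0.02}};
    std::vector<double> mu = {100.0, 50.0};
    for (double d : combine_beams(beams, mu)) std::printf("%.2f ", d);
    std::printf("\n");   // expected output: 2.50 2.50 2.00 2.00
    return 0;
}
```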
Abstract:
Smartphones are steadily gaining popularity, creating new application areas as their capabilities increase in terms of computational power, sensors and communication. These emerging features of mobile devices also open the door to new threats. Android is one of the newer operating systems targeting smartphones. While based on a Linux kernel, Android has unique properties and specific limitations due to its mobile nature, which makes it harder to detect and react to malware attacks using conventional techniques. In this paper, we propose an Android Application Sandbox (AASandbox) that is able to perform both static and dynamic analysis on Android programs to automatically detect suspicious applications. Static analysis scans the software for malicious patterns without installing it. Dynamic analysis executes the application in a fully isolated environment, i.e. a sandbox, which intervenes and logs low-level interactions with the system for further analysis. Both the sandbox and the detection algorithms can be deployed in the cloud, providing fast and distributed detection of suspicious software in a mobile software store akin to Google's Android Market. Additionally, AASandbox might be used to improve the efficiency of classical anti-virus applications available for the Android operating system.
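To give a flavour of the static-analysis idea described above, the sketch below is a deliberately naive, hypothetical pattern scan over decoded application files; it is not AASandbox, and the pattern list and file handling are illustrative placeholders rather than a real signature set.

```cpp
// Illustrative sketch only (not AASandbox): a naive static scan that flags
// files containing string patterns often associated with suspicious behaviour.
// The pattern list and file handling are hypothetical placeholders; a real
// scanner would work on decompiled DEX/manifest content with curated signatures.
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::cerr << "usage: staticscan <decoded-file> [more files...]\n";
        return 1;
    }
    const std::vector<std::string> patterns = {
        "Runtime.exec", "/system/bin/su", "SEND_SMS", "chmod 777"};

    for (int i = 1; i < argc; ++i) {
        std::ifstream in(argv[i]);
        std::stringstream buf;
        buf << in.rdbuf();                       // read whole file into memory
        const std::string content = buf.str();
        for (const auto& p : patterns)
            if (content.find(p) != std::string::npos)
                std::cout << argv[i] << ": suspicious pattern \"" << p << "\"\n";
    }
    return 0;
}
```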