934 results for "Data anonymization and sanitization"
Abstract:
One of the challenges in scientific visualization is to generate software libraries suitable for the large-scale data emerging from tera-scale simulations and instruments. We describe the efforts currently under way at SDSC and NPACI to address these challenges. The scope of the SDSC project spans data handling, graphics, visualization, and scientific application domains. Components of the research focus on the following areas: intelligent data storage, layout, and handling, using an associated “Floor-Plan” (metadata); performance optimization on parallel architectures; extension of SDSC’s scalable, parallel, direct volume renderer to allow perspective viewing; and interactive rendering of fractional images (“imagelets”), which facilitates the examination of large datasets. These concepts are coordinated within a data-visualization pipeline, which operates on component data blocks sized to fit within the available computing resources. A key feature of the scheme is that the metadata, which tag the data blocks, can be propagated and applied consistently: at the disk level; in distributing the computations across parallel processors; in “imagelet” composition; and in feature tagging. The work reflects the emerging challenges and opportunities presented by the ongoing progress in high-performance computing (HPC) and the deployment of the data, computational, and visualization Grids.
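The block-and-metadata idea in this pipeline can be sketched in a few lines. This is a hypothetical illustration only: the `DataBlock` class, the dict of "Floor-Plan" tags, and the `render_stage` function are invented names, not part of the SDSC software. The point is that each stage transforms a block's payload while carrying its metadata forward unchanged.

```python
from dataclasses import dataclass

# Hypothetical sketch: a data block tagged with "Floor-Plan" metadata that is
# propagated unchanged through each pipeline stage.
@dataclass
class DataBlock:
    payload: list   # raw sample values for this block
    meta: dict      # metadata tags (layout, extent, tagged features)

def render_stage(block: DataBlock) -> DataBlock:
    # Illustrative stage: transform the payload, carry the metadata forward.
    shaded = [v * 0.5 for v in block.payload]
    return DataBlock(payload=shaded, meta=dict(block.meta))

def pipeline(blocks):
    # Apply the stage block-by-block, so each block fits available resources.
    return [render_stage(b) for b in blocks]

blocks = [DataBlock([1.0, 2.0], {"extent": (0, 0, 0), "feature": "vortex"})]
out = pipeline(blocks)
```

Because the tags travel with each block, a later compositing or feature-tagging stage can consult them without any global state.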
Abstract:
This study describes the pedagogical impact of real-world experimental projects undertaken as part of an advanced undergraduate Fluid Mechanics subject at an Australian university. The projects have been organised to complement traditional lectures and introduce students to the challenges of professional design, physical modelling, data collection and analysis. The physical model studies combine experimental, analytical and numerical work in order to develop students’ abilities to tackle real-world problems. A first study illustrates the differences between ideal and real fluid flow force predictions based upon model tests of buildings in a large wind tunnel used for research and professional testing. A second study introduces the complexity arising from unsteady non-uniform wave loading on a sheltered pile. The teaching initiative is supported by feedback from undergraduate students. The pedagogy of the course and projects is discussed with reference to experiential, project-based and collaborative learning. The practical work complements traditional lectures and tutorials, and provides learning opportunities that cannot be gained in the classroom, real or virtual. Student feedback demonstrates a strong interest in the project phases of the course. This interest was associated with greater motivation for the course, leading in turn to lower failure rates. In terms of learning outcomes, the primary aim is to enable students to deliver a professional report as the final product, where physical model data are compared to ideal-fluid flow calculations and real-fluid flow analyses. Thus the students are exposed to a professional design approach involving a high level of expertise in fluid mechanics, with sufficient academic guidance to achieve carefully defined learning goals, while retaining sufficient flexibility for students to construct their own learning goals.
The overall pedagogy is a blend of problem-based and project-based learning, which reflects academic research and professional practice. The assessment is a mix of peer-assessed oral presentations and written reports that aims to maximise student reflection and development. Student feedback indicated a strong motivation for courses that include a well-designed project component.
Abstract:
Every day trillions of dollars circulate the globe in a digital data space and new forms of property and ownership emerge. Massive corporate entities with a global reach are formed and disappear with breathtaking speed, making and breaking personal fortunes the size of which defy imagination. Fictitious commodities abound. The genomes of entire nations have become corporately owned. Relationships have become the overt basis of economic wealth and political power. Hypercapitalism explores the problems of understanding this emergent form of global political economic organization by focusing on the internal relations between language, new media networks, and social perceptions of value. Taking an historical approach informed by Marx, Phil Graham draws upon writings in political economy, media studies, sociolinguistics, anthropology, and critical social science to understand the development, roots, and trajectory of the global system in which every possible aspect of human existence, including imagined futures, has become a commodity form.
Abstract:
The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
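The core calibration idea, fitting a pressure model to simulated data and then inverting it to recover an air data parameter, can be illustrated with a one-port linear model. Everything below is synthetic: the real HYFLEX system relates nine port pressures to angle of attack, sideslip, and dynamic pressure through a far richer model.

```python
# Hypothetical sketch of CFD-based calibration: fit a linear pressure model
# p = c0 + c1 * alpha for one port from simulated (alpha, pressure) pairs,
# then invert it to recover angle of attack from a measured pressure.

def fit_linear(xs, ys):
    # Closed-form least-squares fit of y = c0 + c1 * x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    c1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    c0 = my - c1 * mx
    return c0, c1

# "CFD" training data: port pressure (kPa, synthetic) vs. angle of attack (deg).
alphas = [0.0, 5.0, 10.0, 15.0]
pressures = [100.0, 110.0, 120.0, 130.0]

c0, c1 = fit_linear(alphas, pressures)

def alpha_from_pressure(p):
    # Invert the calibrated model to produce an air data parameter.
    return (p - c0) / c1

print(alpha_from_pressure(115.0))  # 7.5 deg for this synthetic model
```

In practice the inversion is a nonlinear least-squares solve over all nine ports, but the fit-then-invert structure is the same.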
Abstract:
Gauging data are available from numerous streams throughout Australia, and these data provide a basis for historical analysis of geomorphic change in stream channels in response to both natural phenomena and human activities. We present a simple method for analysis of these data, and a brief case study of an application to channel change in the Tully River, in the humid tropics of north Queensland. The analysis suggests that this channel has narrowed and deepened, rather than aggraded: channel aggradation was expected, given the intensification of land use in the catchment, upstream of the gauging station. Limitations of the method relate to the time periods over which stream gauging occurred; the spatial patterns of stream gauging sites; the quality and consistency of data collection; and the availability of concurrent land-use histories on which to base the interpretation of the channel changes.
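One simple way to interrogate gauging records for channel change, in the spirit of the approach above though not the authors' exact procedure, is to compare the stage recorded at the same discharge in two rating periods: a falling stage at a given discharge is consistent with deepening rather than aggradation. The rating tables below are invented for illustration.

```python
# Illustrative sketch (not the paper's method): interpolate gauge stage for a
# given discharge from a rating table, then compare two rating periods.

def stage_at(discharge, rating):
    # rating: list of (discharge m3/s, stage m) pairs sorted by discharge.
    for (q0, s0), (q1, s1) in zip(rating, rating[1:]):
        if q0 <= discharge <= q1:
            t = (discharge - q0) / (q1 - q0)
            return s0 + t * (s1 - s0)   # linear interpolation
    raise ValueError("discharge outside rating range")

rating_1950s = [(10.0, 1.2), (100.0, 3.0), (500.0, 6.5)]  # synthetic
rating_1990s = [(10.0, 0.8), (100.0, 2.4), (500.0, 6.0)]  # synthetic

change = stage_at(100.0, rating_1990s) - stage_at(100.0, rating_1950s)
# change < 0: same discharge passes at a lower stage, suggesting deepening
```

The limitations noted in the abstract (gauging period, site placement, data consistency) apply directly to how far such a comparison can be trusted.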
Abstract:
The tissue distribution kinetics of a highly bound solute, propranolol, was investigated in a heterogeneous organ, the isolated perfused limb, using the impulse-response technique and destructive sampling. The propranolol concentration in muscle, skin, and fat as well as in outflow perfusate was measured up to 30 min after injection. The resulting data were analysed (1) assuming the vascular, muscle, skin, and fat compartments to be well mixed (compartmental model) and (2) using a distributed-in-space model which accounts for the noninstantaneous intravascular mixing and tissue distribution processes but consists only of a vascular and an extravascular phase (two-phase model). The compartmental model adequately described propranolol concentration-time data in the three tissue compartments and the outflow concentration-time curve (except for the early mixing phase). In contrast, the two-phase model better described the outflow concentration-time curve but is limited in accounting only for the distribution kinetics in the dominant tissue, the muscle. The two-phase model well described the time course of propranolol concentration in muscle tissue, with parameter estimates similar to those obtained with the compartmental model. The results suggest, first, that the uptake kinetics of propranolol into skin and fat cannot be analysed on the basis of outflow data alone and, second, that the assumption of well-mixed compartments is a valid approximation from a practical point of view (as, e.g., in physiologically based pharmacokinetic modelling). The steady-state distribution volumes of skin and fat were only 16 and 4%, respectively, of that of muscle tissue (16.7 ml), with a higher partition coefficient in fat (6.36) than in skin (2.64) and muscle (2.79). (C) 2000 Elsevier Science B.V. All rights reserved.
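The well-mixed compartmental assumption can be sketched with a minimal two-compartment exchange model integrated by forward Euler. The rate constants and dose below are invented, not the paper's fitted values, and the real model has separate muscle, skin, and fat compartments; the sketch only shows the structure of the approximation.

```python
# Minimal sketch of a well-mixed compartmental model: drug exchanges between a
# vascular compartment and one tissue compartment (illustrative parameters).

def simulate(dose, k_in=0.5, k_out=0.2, dt=0.01, t_end=30.0):
    a_vasc, a_tiss = dose, 0.0   # amounts in each compartment
    t = 0.0
    while t < t_end:
        flux = k_in * a_vasc - k_out * a_tiss   # net vascular -> tissue flow
        a_vasc -= flux * dt
        a_tiss += flux * dt
        t += dt
    return a_vasc, a_tiss

a_vasc, a_tiss = simulate(100.0)
# At equilibrium the tissue:vascular ratio approaches k_in / k_out
```

Total drug amount is conserved by construction, and the distribution equilibrium depends only on the ratio of the two rate constants, which is what a partition coefficient summarizes.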
Abstract:
There is a widely held paradigm that mangroves are critical for sustaining production in coastal fisheries through their role as important nursery areas for fisheries species. This paradigm frequently forms the basis for important management decisions on habitat conservation and restoration of mangroves and other coastal wetlands. This paper reviews the current status of the paradigm and synthesises the information on the processes underlying these potential links. In the past, the paradigm has been supported by studies identifying correlations between the areal and linear extent of mangroves and fisheries catch. This paper goes beyond the correlative approach to develop a new framework on which future evaluations can be based. First, the review identifies what types of marine animals use mangroves and at what life stages. These species can be categorised as estuarine residents, marine-estuarine species and marine stragglers. The marine-estuarine category includes many commercial species that use mangrove habitats as nurseries. The second stage is to determine why these species are using mangroves as nurseries. The three main proposals are that mangroves provide a refuge from predators, high levels of nutrients and shelter from physical disturbances. The recognition of the important attributes of mangrove nurseries then allows an evaluation of how changes in mangroves will affect the associated fauna. Surprisingly few studies have addressed this question. Consequently, it is difficult to predict how changes in any of these mangrove attributes would affect the faunal communities within them and, ultimately, influence the fisheries associated with them. From the information available, it seems likely that reductions in mangrove habitat complexity would reduce the biodiversity and abundance of the associated fauna, and these changes have the potential to cause cascading effects at higher trophic levels with possible consequences for fisheries. 
Finally, there is a discussion of the data that are currently available on mangrove distribution and fisheries catch, the limitations of these data and how best to use the data to understand mangrove-fisheries links and, ultimately, to optimise habitat and fisheries management. Examples are drawn from two relatively data-rich regions, Moreton Bay (Australia) and Western Peninsular Malaysia, to illustrate the data needs and research requirements for investigating the mangrove-fisheries paradigm. Having reliable and accurate data at appropriate spatial and temporal scales is crucial for mangrove-fisheries investigations. Recommendations are made for improvements to data collection methods that would meet these important criteria. This review provides a framework on which to base future investigations of mangrove-fisheries links, based on an understanding of the underlying processes and the need for rigorous data collection. Without this information, the understanding of the relationship between mangroves and fisheries will remain limited. Future investigations of mangrove-fisheries links must take this into account in order to have a good ecological basis and to provide better information and understanding to both fisheries and conservation managers.
Abstract:
The field of protein crystallography inspires and enthrals, whether it be for the beauty and symmetry of a perfectly formed protein crystal, the unlocked secrets of a novel protein fold, or the precise atomic-level detail yielded from a protein-ligand complex. Since 1958, when the first protein structure was solved, there have been tremendous advances in all aspects of protein crystallography, from protein preparation and crystallisation through to diffraction data measurement and structure refinement. These advances have significantly reduced the time required to solve protein crystal structures, while at the same time substantially improving the quality and resolution of the resulting structures. Moreover, the technological developments have induced researchers to tackle ever more complex systems, including ribosomes and intact membrane-bound proteins, with a reasonable expectation of success. In this review, the steps involved in determining a protein crystal structure are described and the impact of recent methodological advances identified. Protein crystal structures have proved to be extraordinarily useful in medicinal chemistry research, particularly with respect to inhibitor design. The precise interaction between a drug and its receptor can be visualised at the molecular level using protein crystal structures, and this information then used to improve the complementarity and thus increase the potency and selectivity of an inhibitor. The use of protein crystal structures in receptor-based drug design is highlighted by (i) HIV protease, (ii) influenza virus neuraminidase and (iii) prostaglandin H-2-synthetase. 
These represent, respectively, examples of protein crystal structures that (i) influenced the design of drugs currently approved for use in the treatment of HIV infection, (ii) led to the design of compounds currently in clinical trials for the treatment of influenza infection and (iii) could enable the design of highly specific non-steroidal anti-inflammatory drugs that lack the common side-effects of this drug class.
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of a Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for following vital signs and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can cause dangerous desensitization of users. The objective of this study was to describe the development of specific software to integrate intraoperative monitoring parameters, generating "smart alerts" that can help decision making, besides indicating possible diagnoses and treatment. Methods: A system was designed that allowed flexibility in the definition of alerts, combining individual alarms of the monitored parameters to generate a more elaborate alert system. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system underwent testing with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of monitored parameters, verifying the occurrence of smart alerts defined in the user interface. Results: With this system a potential 92% reduction in alarms was observed. We observed that in most situations that did not generate alerts, individual alarms did not represent risk to the patient. Conclusions: Implementation of software can allow integration of the monitored data and generate information, such as possible diagnoses or interventions. An expressive potential reduction in the number of alarms during surgery was observed. Information displayed by the system can often be more useful than analysis of isolated parameters.
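The principle of combining individual alarms into a single higher-level alert can be sketched as a small rule engine. The rules, thresholds, and alert texts below are invented for illustration and are not the study's clinical rules.

```python
# Hypothetical sketch of a "smart alert": rather than firing on each parameter
# in isolation, a rule fires only when a combination of monitored values occurs.

def smart_alerts(sample):
    alerts = []
    # Combined rule: hypotension together with tachycardia (thresholds invented)
    if sample["sys_bp"] < 90 and sample["heart_rate"] > 100:
        alerts.append("possible hypovolemia: check volume status")
    # Combined rule: low SpO2 together with high end-tidal CO2 (invented)
    if sample["spo2"] < 90 and sample["etco2"] > 50:
        alerts.append("possible hypoventilation: check airway/ventilation")
    return alerts

sample = {"sys_bp": 85, "heart_rate": 110, "spo2": 97, "etco2": 38}
print(smart_alerts(sample))
```

Because each rule requires a conjunction of abnormal values, isolated threshold crossings that pose no risk produce no alert, which is the mechanism behind the reported reduction in alarm volume.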
Abstract:
Systems approaches can help to evaluate and improve the agronomic and economic viability of nitrogen application in frequently water-limited environments. This requires a sound understanding of crop physiological processes and well tested simulation models. Thus, this experiment on spring wheat aimed to better quantify water × nitrogen effects on wheat by deriving some key crop physiological parameters that have proven useful in simulating crop growth. For spring wheat grown in Northern Australia under four levels of nitrogen (0 to 360 kg N ha(-1)) and either entirely on stored soil moisture or under full irrigation, kernel yields ranged from 343 to 719 g m(-2). Yield increases were strongly associated with increases in kernel number (9150-19950 kernels m(-2)), indicating the sensitivity of this parameter to water and N availability. Total water extraction under a rain shelter was 240 mm with a maximum extraction depth of 1.5 m. A substantial amount of mineral nitrogen available deep in the profile (below 0.9 m) was taken up by the crop. This was the source of nitrogen uptake observed after anthesis. Under dry conditions this late uptake accounted for approximately 50% of total nitrogen uptake and resulted in high (>2%) kernel nitrogen percentages even when no nitrogen was applied. Anthesis LAI values were reduced by 63% under sub-optimal water supply and by 50% under sub-optimal nitrogen supply. Radiation use efficiency (RUE) based on total incident short-wave radiation was 1.34 g MJ(-1) and did not differ among treatments. The conservative nature of RUE was the result of the crop reducing leaf area rather than leaf nitrogen content (which would have affected photosynthetic activity) under these moderate levels of nitrogen limitation. The transpiration efficiency coefficient was also conservative and averaged 4.7 Pa in the dry treatments. Kernel nitrogen percentage varied from 2.08 to 2.42%. 
The study provides a data set and a basis to consider ways to improve simulation capabilities of water and nitrogen effects on spring wheat. (C) 1997 Elsevier Science B.V.
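The two conservative parameters reported above can be shown in a short worked example. RUE is biomass produced per unit of incident radiation, and the transpiration efficiency (TE) coefficient relates biomass to transpiration scaled by vapour pressure deficit (biomass = k · T / VPD). The input numbers below are synthetic, chosen only to match the reported RUE of 1.34 g MJ(-1); they are not the experiment's measurements.

```python
# Worked sketch of the two crop parameters; inputs are illustrative.

def rue(biomass_g_m2, radiation_mj_m2):
    # Radiation use efficiency: biomass per unit incident short-wave radiation.
    return biomass_g_m2 / radiation_mj_m2

def biomass_from_te(k_pa, transpiration_mm, vpd_kpa):
    # biomass = k * T / VPD, with 1 mm of water = 1000 g m-2 and k in Pa
    # converted to kPa to match the VPD units.
    return (k_pa / 1000.0) * (transpiration_mm * 1000.0) / vpd_kpa

print(rue(670.0, 500.0))                 # 1.34 g MJ-1 for these inputs
print(biomass_from_te(4.7, 240.0, 1.5))  # biomass (g m-2) implied by k = 4.7 Pa
```

Because both coefficients were stable across treatments, a simulation model can treat them as constants and let water and nitrogen act through leaf area instead.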
Abstract:
Background: Studying stroke rates in a whole community is a rational way to assess the quality of patient care and primary prevention. However, there are few studies of trends in stroke rates worldwide and none in Brazil. Objective: Established study methods were used to define the rates for first ever stroke in a defined population in Brazil compared with similar data obtained and published in 1995. Methods: All stroke cases occurring in the city of Joinville during 2005-2006 were prospectively ascertained. Crude incidence and mortality rates were determined, and age adjusted rates and 30 day case fatality were calculated and compared with the 1995 data. Results: Of the 1323 stroke cases registered, 759 were first ever strokes. The incidence rate per 100 000 was 105.4 (95% CI 98.0 to 113.2), mortality rate was 23.9 (95% CI 20.4 to 27.8) and the 30 day case fatality was 19.1%. Compared with the 1995 data, we found that the incidence had decreased by 27%, mortality decreased by 37% and the 30 day case fatality decreased by 28%. Conclusions: Using defined criteria we showed that in an industrial southern Brazilian city, stroke rates are similar to those from developed countries. A significant decrease in stroke rates over the past decade was also found, suggesting an improvement in primary prevention and inpatient care of stroke patients in Joinville.
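The crude rate and interval arithmetic behind figures like these is standard: a Poisson rate with a log-scale 95% confidence interval. The case count is taken from the abstract; the person-years denominator below is a back-calculated approximation for illustration, not a figure from the study.

```python
import math

# Sketch of a crude incidence rate per 100 000 with a Poisson 95% CI.
cases = 759
person_years = 720_000   # assumed; roughly population x 2-year registry period

rate = cases / person_years * 100_000
half_width = 1.96 / math.sqrt(cases)            # 95% bound on the log scale
lo, hi = rate * math.exp(-half_width), rate * math.exp(half_width)

print(round(rate, 1), round(lo, 1), round(hi, 1))
```

With these inputs the computed interval closely reproduces the reported 105.4 (95% CI 98.0 to 113.2) per 100 000, which is the expected behaviour of the log-scale approximation at this case count.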
Abstract:
This study evaluated the use of Raman spectroscopy to identify the spectral differences between normal (N), benign hyperplasia (BPH) and adenocarcinoma (CaP) in fragments of prostate biopsies in vitro with the aim of developing a spectral diagnostic model for tissue classification. A dispersive Raman spectrometer was used with 830 nm wavelength and 80 mW excitation. Following Raman data collection and tissue histopathology (48 fragments diagnosed as N, 43 as BPH and 14 as CaP), two diagnostic models were developed in order to extract diagnostic information: the first using PCA and Mahalanobis analysis techniques and the second a simplified biochemical model based on spectral features of cholesterol, collagen, smooth muscle cells and adipocytes. Spectral differences between N, BPH and CaP tissues were observed mainly in the Raman bands associated with proteins, lipids, nucleic acids and amino acids. The PCA diagnostic model showed a sensitivity and specificity of 100%, which indicates the ability of PCA and Mahalanobis distance techniques to classify tissue changes in vitro. Also, it was found that the relative amount of collagen decreased while the amounts of cholesterol and adipocytes increased with the severity of the disease. Smooth muscle content increased in BPH tissue. These characteristics were used for diagnostic purposes.
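The classification step can be sketched by scoring a feature vector against each tissue class with a Mahalanobis distance. To keep the example short this sketch uses two invented features with a diagonal covariance, whereas the study computed distances in a PCA-reduced spectral space; all class statistics below are made up.

```python
import math

# Simplified sketch: nearest-class assignment by Mahalanobis distance with a
# diagonal covariance. Class means and variances are invented for illustration.
CLASSES = {
    "N":   {"mean": (1.0, 0.5), "var": (0.04, 0.01)},
    "BPH": {"mean": (1.4, 0.7), "var": (0.04, 0.01)},
    "CaP": {"mean": (2.0, 1.1), "var": (0.09, 0.04)},
}

def mahalanobis(x, mean, var):
    # With a diagonal covariance this reduces to a variance-scaled distance.
    return math.sqrt(sum((xi - mi) ** 2 / vi
                         for xi, mi, vi in zip(x, mean, var)))

def classify(features):
    return min(CLASSES,
               key=lambda c: mahalanobis(features,
                                         CLASSES[c]["mean"], CLASSES[c]["var"]))

print(classify((1.9, 1.0)))
```

Scaling each axis by its class variance is what distinguishes Mahalanobis distance from plain Euclidean distance and makes the classifier respect within-class spectral variability.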
Abstract:
Study Design. In vitro biomechanical investigation of screw-holding capacity. Objective. To evaluate the effect of repetitive screw-hole use on the insertional torque and retentive strength of vertebral system screws. Summary of Background Data. Placement and removal of vertebral system screws is sometimes necessary during surgical procedures in order to assess the walls of the pilot hole. This procedure may compromise the holding capacity of the implant. Methods. Screws with outer diameters measuring 5, 6, and 7 mm were inserted into wood, polyurethane, polyethylene, and cancellous bone cylindrical blocks. The pilot holes were made with drills of a smaller, equal, or wider diameter than the inner screw diameter. Three experimental groups were established based on the number of insertions and reinsertions of the screws, and subgroups were created according to the outer diameter of the screw and the diameter of the pilot hole used. Results. A reduction of screw-holding capacity was observed between the first and the following insertions regardless of the anchorage material. The pattern of reduction of retentive strength was not similar to the pattern of torque reduction. The reduction in pullout strength was more pronounced between the first and the last insertions, while the torque decreased more evenly from the first to the last insertion. Conclusion. Insertion and reinsertion of the screws of the vertebral fixation system used in the present study reduced the insertion torque and screw purchase.
Abstract:
Studies on children with cancer have suggested that energy expenditure may indeed be greater than predicted for healthy children. Nutritional assessment is important for intervention and for the prevention of complications associated with malnutrition. The present study aimed to describe the nutritional status, energy expenditure, and substrate utilization of children and adolescents with cancer compared to healthy children matched for age, sex, and body mass index. Subjects were evaluated by anthropometry, food intake pattern, and body composition analysis. Energy expenditure and substrate oxidation were measured by indirect calorimetry. Indirect calorimetry data, energy, and macronutrient intake, anthropometry, and body composition parameters showed no significant differences between groups. There was no evidence of increased energy expenditure or of a change in substrate utilization in children with cancer compared to the healthy group. The data regarding usual food consumption showed no significant differences between groups.
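Indirect calorimetry readings are converted to energy expenditure with the abbreviated Weir equation, which combines oxygen consumption (VO2) and carbon dioxide production (VCO2); the ratio VCO2/VO2 (the respiratory quotient) indicates the substrate mix being oxidized. The gas-exchange values below are illustrative, not measurements from this study.

```python
# Abbreviated Weir equation: energy expenditure (kcal/day) from VO2 and VCO2
# in L/min; 1440 converts kcal/min to kcal/day. Inputs are illustrative.

def weir_kcal_per_day(vo2_l_min, vco2_l_min):
    return (3.941 * vo2_l_min + 1.106 * vco2_l_min) * 1440

def respiratory_quotient(vo2_l_min, vco2_l_min):
    # ~0.7 suggests mainly fat oxidation, ~1.0 mainly carbohydrate.
    return vco2_l_min / vo2_l_min

ee = weir_kcal_per_day(0.25, 0.20)   # kcal/day
rq = respiratory_quotient(0.25, 0.20)
```

Comparing `ee` against a prediction equation for healthy children of the same age, sex, and body mass index is how a hypermetabolic state would be detected; here no such difference was found.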
Abstract:
Epidemiological studies report confidence or uncertainty intervals around their estimates. Estimates of the burden of diseases and risk factors are subject to a broader range of uncertainty because of the combination of multiple data sources and value choices. Sensitivity analysis can be used to examine the effects of social values that have been incorporated into the design of the disability-adjusted life year (DALY). Age weighting, where a year of healthy life lived at one age is valued differently from one lived at another age, is the most controversial value built into the DALY. The discount rate, which addresses the difference in value of current versus future health benefits, has also been criticized. The distribution of the global disease burden and rankings of various conditions are largely insensitive to alternate assumptions about the discount rate and age weighting. The major effects of discounting and age weighting are to enhance the importance of neuropsychiatric conditions and sexually transmitted infections. The Global Burden of Disease study has also been criticized for estimating mortality and disease burden for regions using incomplete and uncertain data. Including uncertain results, with uncertainty quantified to the extent possible, is preferable, however, to leaving blank cells in tables intended to provide policy makers with an overall assessment of burden of disease. No estimate is generally interpreted as no problem. Greater investment in getting the descriptive epidemiology of diseases and injuries correct in poor countries will do vastly more to reduce uncertainty in disease burden assessments than a philosophical debate about the appropriateness of social value choices.
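The two contested value choices can be made concrete in the years-of-life-lost (YLL) component of the DALY: future years are discounted at rate r, and each year is scaled by the standard GBD age weight w(a) = C·a·e^(−βa) with C = 0.1658 and β = 0.04. The sketch below integrates numerically so the formula stays visible; the ages and life expectancy used in the example are illustrative.

```python
import math

# Sketch of YLL for one death with continuous discounting and the standard
# GBD age weight; numerical (left-Riemann) integration with step da.

def yll(age_at_death, life_expectancy, r=0.03, age_weighting=True, da=0.001):
    C, beta = 0.1658, 0.04          # standard GBD age-weight constants
    total, a = 0.0, age_at_death
    while a < age_at_death + life_expectancy:
        w = C * a * math.exp(-beta * a) if age_weighting else 1.0
        total += w * math.exp(-r * (a - age_at_death)) * da
        a += da
    return total

# Discounting alone shrinks 40 remaining years to well under 40 "valued" years:
print(yll(40.0, 40.0, r=0.03, age_weighting=False))
```

Re-running the calculation with `r=0` and `age_weighting=False` recovers plain remaining life expectancy, which is exactly the kind of sensitivity analysis the abstract describes.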