498 results for Eclipse, SODA


Relevance:

10.00%

Publisher:

Abstract:

Post-deployment maintenance and evolution can account for up to 75% of the cost of developing a software system. Software refactoring can reduce the costs associated with evolution by improving system quality. Although refactoring can yield benefits, the process includes potentially complex, error-prone, tedious and time-consuming tasks. It is these tasks that automated refactoring tools seek to address. However, although the refactoring process is well-defined, current refactoring tools do not support the full process. To develop better automated refactoring support, we have completed a usability study of software refactoring tools. In the study, we analysed the task of software refactoring using the ISO 9241-11 usability standard and Fitts' List of task allocation. Expanding on this analysis, we reviewed 11 collections of usability guidelines and combined these into a single list of 38 guidelines. From this list, we developed 81 usability requirements for refactoring tools. Using these requirements, the software refactoring tools Eclipse 3.2, Condenser 1.05, RefactorIT 2.5.1, and Eclipse 3.2 with the Simian UI 2.2.12 plugin were studied. Based on the analysis, we have selected a subset of the requirements that can be incorporated into a prototype refactoring tool intended to address the full refactoring process.
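As a hypothetical illustration of the kind of task such tools automate (not an example drawn from the study itself), the Extract Method refactoring replaces duplicated logic, of the sort a clone detector such as Simian flags, with a single named routine:

```python
# Before: duplicated total-with-tax logic (the kind of clone a
# duplicate detector such as Simian would flag).
def invoice_total_before(prices):
    total = 0.0
    for p in prices:
        total += p * 1.10  # 10% tax, duplicated elsewhere
    return total

# After: the duplicated computation is extracted into one named method.
def _total_with_tax(prices, tax_rate=0.10):
    """Extracted method: the single definition of the tax calculation."""
    return sum(p * (1.0 + tax_rate) for p in prices)

def invoice_total(prices):
    return _total_with_tax(prices)

def quote_total(prices):
    return _total_with_tax(prices)
```

Applying the extraction by hand is exactly the tedious, error-prone step that automated refactoring support aims to make safe.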

Relevance:

10.00%

Publisher:

Abstract:

Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that could be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission on a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms.
Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
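The monitor-unit weighting step described above can be sketched as follows (an illustrative Python fragment, not MCDTK's actual Java code; the grid sizes and dose values are invented):

```python
def combine_beam_doses(beam_doses_per_mu, monitor_units):
    """Sum per-beam Monte Carlo dose grids (given in dose per MU),
    weighting each beam by its planned monitor units, to obtain the
    total plan dose distribution on a flattened voxel grid."""
    assert len(beam_doses_per_mu) == len(monitor_units)
    n_voxels = len(beam_doses_per_mu[0])
    total = [0.0] * n_voxels
    for dose_grid, mu in zip(beam_doses_per_mu, monitor_units):
        for i in range(n_voxels):
            total[i] += dose_grid[i] * mu
    return total

# Two beams on a two-voxel grid, weighted by 100 MU and 50 MU:
plan_dose = combine_beam_doses([[0.01, 0.02], [0.02, 0.01]], [100.0, 50.0])
```

In the actual toolkit the per-MU calibration factors come from the accelerator commissioning process and the MU values from the exported DICOM plan.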

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are spatial-resolution independent and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
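The gamma evaluation used in the dose-comparison step can be illustrated with a much-simplified, brute-force 1D sketch (the implementations described above are 3D, resolution independent and interpolating; the 3 mm / 3% criteria here are conventional defaults, not necessarily those used in this work):

```python
import math

def gamma_index(ref_dose, ref_x, eval_dose, eval_x, dta_mm=3.0, dd_frac=0.03):
    """Simplified 1D gamma evaluation: for each reference point, minimise
    the combined dose-difference / distance-to-agreement metric over all
    evaluated points. A point passes the criteria when gamma <= 1."""
    d_max = max(ref_dose)  # dose difference normalised to the global maximum
    gammas = []
    for xr, dr in zip(ref_x, ref_dose):
        best = float("inf")
        for xe, de in zip(eval_x, eval_dose):
            dist_term = ((xe - xr) / dta_mm) ** 2
            dose_term = ((de - dr) / (dd_frac * d_max)) ** 2
            best = min(best, math.sqrt(dist_term + dose_term))
        gammas.append(best)
    return gammas

# Identical distributions agree perfectly, so every gamma value is zero:
g = gamma_index([1.0, 2.0], [0.0, 1.0], [1.0, 2.0], [0.0, 1.0])
```

Production implementations interpolate the evaluated distribution rather than minimising over its sample points only, which is what makes them resolution independent.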

Relevance:

10.00%

Publisher:

Abstract:

Designing the smart grid requires combining varied models. As their number increases, so does the complexity of the software. Having a well-thought-out architecture for the software then becomes crucial. This paper presents MODAM, a framework designed to combine agent-based models in a flexible and extensible manner, using well-known software engineering design solutions (the OSGi specification [1] and Eclipse plugins [2]). Details on how to build a modular agent-based model for the smart grid are given in this paper, illustrated by an example for a small network.
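The plugin-based composition idea can be sketched as follows (a minimal Python analogue; MODAM itself is built on Java, OSGi and Eclipse plugins, and all names below are invented for illustration):

```python
# Sketch of plugin-style composition: independently developed modules
# register agent factories with a central manager, so a simulation can
# be assembled without changing the framework's core code.
class ModuleManager:
    def __init__(self):
        self._factories = {}

    def register(self, kind, factory):
        """Called by a plugin to contribute a new agent type."""
        self._factories[kind] = factory

    def create(self, kind, **params):
        """Instantiate an agent of a registered type."""
        return self._factories[kind](**params)

class HouseholdAgent:
    """Example agent type contributed by a hypothetical plugin."""
    def __init__(self, demand_kw):
        self.demand_kw = demand_kw

manager = ModuleManager()
manager.register("household", HouseholdAgent)
agent = manager.create("household", demand_kw=1.5)
```

In OSGi the registration step is played by the service registry and extension points rather than an in-process dictionary, but the decoupling between framework and contributed models is the same.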

Relevance:

10.00%

Publisher:

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that would otherwise be expensive or impractical to investigate. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as it uses explicit data at a fine level of detail, and computer-intensive, as it requires many interactions between agents, which can learn and pursue goals. With the growing availability of data and the increase in computer power, these concerns are, however, fading. Nonetheless, being able to update or extend the model as more information becomes available can become problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is the electricity distribution network, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that this ABM has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goal and previous learning experiences. This approach diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same — this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on what simulation is to be run. For example, data can be used to describe the environment to which the agents respond — e.g. weather for solar panels — or to describe the assets and their relation to one another — e.g. the network assets. Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours, as well as new types of assets.
Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport, which is part of future work, with the addition of electric vehicles.
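The battery example above can be sketched in code (an illustrative Python fragment, not MODAM's implementation; the class and parameter names are hypothetical):

```python
# Asset/agent separation: the asset holds only physical characteristics,
# while the agent supplies the behaviour that drives it.
class BatteryAsset:
    """Physical characteristics of a battery, independent of its usage."""
    def __init__(self, capacity_kwh, max_depth_of_discharge=0.8):
        self.capacity_kwh = capacity_kwh
        self.max_depth_of_discharge = max_depth_of_discharge
        self.stored_kwh = 0.0

class PeakShavingAgent:
    """One possible behaviour: discharge the battery to cap demand peaks.
    A different agent could drive an identical asset for, say, backup power."""
    def __init__(self, asset, threshold_kw):
        self.asset = asset
        self.threshold_kw = threshold_kw

    def step(self, demand_kw):
        """Return the net demand seen by the grid for this time step."""
        if demand_kw > self.threshold_kw and self.asset.stored_kwh > 0:
            discharge = min(demand_kw - self.threshold_kw, self.asset.stored_kwh)
            self.asset.stored_kwh -= discharge
            return demand_kw - discharge
        return demand_kw

battery = BatteryAsset(capacity_kwh=10.0)
battery.stored_kwh = 5.0
agent = PeakShavingAgent(battery, threshold_kw=3.0)
net_demand = agent.step(demand_kw=4.0)  # 1 kW of the peak is shaved
```

Swapping `PeakShavingAgent` for another behaviour class reuses the same asset unchanged, which is the composability benefit the abstract describes.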

Relevance:

10.00%

Publisher:

Abstract:

In a total solar eclipse, the Moon completely covers the Sun, casting a shadow several hundred km wide across the face of the Earth. This paper describes observations of the 14 November 2012 total eclipse of the Sun visible from north Queensland, Australia. The edge of the umbra was captured on video during totality, and this video is provided for teaching purposes. A series of simple 'kitchen' experiments are described which demonstrate the 'sunset' effect seen on the horizon during a total solar eclipse and also the curved umbra seen in the sky when the eclipsed Sun is relatively close to the horizon.

Relevance:

10.00%

Publisher:

Abstract:

Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations are required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analysis of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint heritability estimates, removing a different cohort each time to understand the variability of the estimates. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability.
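The two meta-analytic weightings named above take the following schematic form (a minimal sketch; the heritability values and cohort sizes in the usage example are invented, not ENIGMA-DTI estimates):

```python
def sample_size_weighted(h2_estimates, sample_sizes):
    """Sample-size weighted mean of per-cohort heritability estimates."""
    total_n = sum(sample_sizes)
    return sum(h * n for h, n in zip(h2_estimates, sample_sizes)) / total_n

def se_weighted(h2_estimates, standard_errors):
    """Standard-error (inverse-variance) weighted mean: cohorts with
    smaller standard errors contribute more to the pooled estimate."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    return sum(h * w for h, w in zip(h2_estimates, weights)) / sum(weights)

# Two hypothetical cohorts: h2 = 0.5 (n = 100) and h2 = 0.7 (n = 300).
pooled_n = sample_size_weighted([0.5, 0.7], [100, 300])
pooled_se = se_weighted([0.5, 0.7], [0.10, 0.10])
```

A leave-one-out analysis simply reruns either function with one cohort removed to gauge how much any single cohort drives the pooled estimate.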

Relevance:

10.00%

Publisher:

Abstract:

Sugarcane bagasse pretreatment processes using acidified aqueous ethylene glycol (EG) and ionic liquids (ILs) have been reported recently. In this study, the lignins from these processes were recovered and their physico-chemical properties determined. The amount of lignin recovered was ∼42% from 1-butyl-3-methylimidazolium chloride ([bmim]Cl) with HCl as a catalyst and from [bmim][CH3SO3], and ∼35%–36% from EG with HCl or H2SO4 as a catalyst, respectively. The isolated lignins were characterised using wet chemistry, spectroscopy and thermogravimetric analysis (TGA), and the results compared to soda lignin from NaOH pretreatment of bagasse. The IL and EG lignins contained no or trace amounts of carbohydrates, and had slightly lower hydrogen but slightly higher oxygen contents than soda lignin. The IL and EG lignins contained more C-3 and C-5 reactive sites for the Mannich reaction and had more p-hydroxyphenyl propane unit structures than soda lignin. Two-dimensional heteronuclear single quantum coherence (2D HSQC) nuclear magnetic resonance (NMR) identified the major substructural units in the lignins, and allowed differences among them to be studied. As the EG lignins were extracted in a very reactive environment, intermediate enol ethers were formed and led to cleavage reactions which were not apparent in the other lignins. 31P NMR and infra-red spectroscopy results showed that the IL and EG lignins had lower total hydroxyl content than soda lignin, probably indicating that a higher degree of self-polymerisation occurred during bagasse pretreatment, despite the use of lower temperatures and shorter reaction times. On the basis of the salient features of these lignins, potential applications were proposed.

Relevance:

10.00%

Publisher:

Abstract:

A technique is described for calculating the brightness of the atmosphere of the Earth that shines into the Earth's umbra during a total lunar eclipse, making the Moon red. This ‘Rim of Fire’ is due to refracted, unscattered light from all the sunrises and sunsets rimming the Earth. In this article, a photograph of the totally eclipsed Moon was compared with the Full Moon and the difference in brightness calculated, taking into account the exposure time and ISO setting. The results show that the Full Moon is over 14 000 times brighter than the totally eclipsed Moon. The relative brightness of the eclipsed Moon can be used to estimate that the luminous power of the Rim of Fire is over 12 trillion watts. The experiment described in this paper would be suitable as a high school or university exercise.
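The brightness comparison can be sketched as follows (a minimal sketch assuming brightness is proportional to recorded signal divided by exposure time × ISO; the camera settings in the example are hypothetical, not those used in the article):

```python
def brightness_ratio(t1_s, iso1, t2_s, iso2, signal1=1.0, signal2=1.0):
    """Ratio of scene brightness B1/B2 for two photographs, assuming
    B is proportional to recorded signal / (exposure time * ISO)."""
    return (signal1 / (t1_s * iso1)) / (signal2 / (t2_s * iso2))

# Hypothetical settings yielding comparable exposures: Full Moon at
# 1/250 s, ISO 100 versus eclipsed Moon at 4 s, ISO 1600.
ratio = brightness_ratio(1.0 / 250, 100, 4.0, 1600)
```

With these illustrative settings the Full Moon comes out 16 000 times brighter, the same order as the paper's measured factor of over 14 000.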

Relevance:

10.00%

Publisher:

Abstract:

Curcumin has gained immense importance for its vast therapeutic and prophylactic applications. Contrary to this, our study reveals that it regulates the defense pathways of Salmonella enterica serovar Typhimurium (S. Typhimurium) to enhance its pathogenicity. In a murine model of typhoid fever, we observed a higher bacterial load in Peyer's patches, mesenteric lymph nodes, spleen and liver when mice were infected with curcumin-treated Salmonella. Curcumin increased the resistance of S. Typhimurium against antimicrobial agents such as antimicrobial peptides and reactive oxygen and nitrogen species. This increased tolerance might be attributed to the up-regulation of genes involved in resistance against antimicrobial peptides (pmrD and pmrHFIJKLM) and genes with antioxidant function (mntH, sodA and sitA). We propose that the iron-chelating property of curcumin has a role in regulating mntH and sitA. Interestingly, we see that the curcumin-mediated modulation of the pmr genes occurs through the PhoPQ regulatory system. Curcumin downregulates SPI1 genes, required for entry into epithelial cells, and upregulates SPI2 genes, required for intracellular survival. Since it is known that the SPI1 and SPI2 systems can be regulated by the PhoPQ system, this common regulator could explain curcumin's mode of action. These data urge us to rethink the indiscriminate use of curcumin, especially during Salmonella outbreaks.

Relevance:

10.00%

Publisher:

Abstract:

Faraday rotation data obtained at Delhi, Kurukshetra, Hyderabad, Bangalore, Waltair, Nagpur and Calcutta during the total solar eclipse of 16 February 1980 and at Delhi during the total solar eclipse of 31 July 1981 have been analysed to detect the gravity waves generated by a total solar eclipse as hypothesized by Chimonas and Hines (1970, J. geophys. Res. 75, 875). It has been found that gravity waves can be generated by a total solar eclipse but their detection at ionospheric heights is critically dependent on the location of the observing station in relation to the eclipse path geometry. The distance of the observing station from the eclipse path should be more than 500 km in order to detect such gravity waves.

Relevance:

10.00%

Publisher:

Abstract:

The outer atmosphere of the Sun, called the corona, has been observed during total solar eclipses for short periods (typically <6 min) from as early as the eighteenth century. In the recent past, space-based instruments have permitted us to study the corona uninterruptedly. In spite of these developments, the dynamic corona and its high temperature (1-2 million K) are yet to be fully understood. It is conjectured that their dynamic nature and associated energetic events are possible reasons behind the high temperature. In order to study these in detail, a visible emission line space solar coronagraph is being proposed as a payload under the small-satellite programme of the Indian Space Research Organisation. The satellite is named Aditya-1 and the scientific objectives of this payload are to study: (i) the existence of intensity oscillations, for the study of wave-driven coronal heating; (ii) the dynamics and formation of coronal loops and the temperature structure of coronal features; (iii) the origin, cause and acceleration of coronal mass ejections (CMEs) and other solar active features; and (iv) coronal magnetic field topology and three-dimensional structures of CMEs using polarization information. The uniqueness of this payload compared to previously flown space instruments is as follows: (a) observations in the visible wavelength closer to the disk (down to 1.05 solar radii); (b) high time-cadence capability (better than two images per second); and (c) simultaneous observations of at least two spectral windows at all times and three spectral windows for short durations.

Relevance:

10.00%

Publisher:

Abstract:

Cu2ZnSnS4 (CZTS) is a kesterite semiconductor consisting of abundantly available elements. It has a band gap of 1.5 eV and a large absorption coefficient. Hence, thin films made of this material can be used as the absorber layers of a solar cell. CZTS films were deposited on soda lime and Na-free borosilicate glass substrates by ultrasonic spray pyrolysis. The diffusion of sodium from the soda lime glass was found to have a profound effect on characteristics such as the grain size, crystal texture and conductivity of the CZTS thin films. The copper ion concentration was also varied during deposition, and it was observed that the carrier concentration was enhanced when there was a deficiency of copper in the films. The effects of sodium diffusion and copper deficiency in enhancing the structural and electrical properties of CZTS films are presented in this paper.

Relevance:

10.00%

Publisher:

Abstract:

CZTS (copper zinc tin sulphide) is a wide-band-gap quaternary chalcopyrite with a band gap of about 1.45 eV and an absorption coefficient of 10⁴ cm⁻¹, making it an ideal material for use as an absorber layer in solar cells. Ultrasonic spray pyrolysis is a deposition technique in which the solution is atomized ultrasonically, giving a fine mist with a narrow droplet-size distribution that can be used for uniform coatings on substrates. An ultrasonic spray pyrolysis system was developed and CZTS absorber layers were successfully grown with this technique on soda lime glass substrates using aqueous solutions. Substrate temperatures ranging from 523 K to 723 K were used to deposit the CZTS layers, and the films were characterized using SEM, EDAX and XRD. It was observed that the films crystallized in the kesterite structure, that the best crystallites were obtained at 613 K, and that the grain size progressively increased with temperature. The optical band gap of the material was obtained as 1.54 eV.

Relevance:

10.00%

Publisher:

Abstract:

In computational molecular biology, the aim of restriction mapping is to locate the restriction sites of a given enzyme on a DNA molecule. Double digest and partial digest are two well-studied techniques for restriction mapping. While double digest is NP-complete, there is no known polynomial-time algorithm for partial digest. Another disadvantage of these techniques is that there can be multiple solutions for the reconstruction. In this paper, we study a simple technique called labeled partial digest for restriction mapping. We give a fast polynomial-time (O(n² log n) worst-case) algorithm for finding all n sites of a DNA molecule using this technique. An important advantage of the algorithm is the unique reconstruction of the DNA molecule from the digest. The technique is also robust in handling the errors in fragment lengths which arise in the laboratory. We give a robust O(n⁴) worst-case algorithm that can provably tolerate an absolute error of O(Δ/n) (where Δ is the minimum inter-site distance), while giving a unique reconstruction. We test our theoretical results by simulating the performance of the algorithm on a real DNA molecule. Motivated by the similarity to the labeled partial digest problem, we address a related problem of interest, the de novo peptide sequencing problem (ACM-SIAM Symposium on Discrete Algorithms (SODA), 2000, pp. 389-398), which arises in the reconstruction of the peptide sequence of a protein molecule. We give a simple and efficient algorithm for the problem without using dynamic programming. The algorithm runs in O(k log k) time, where k is the number of ions, and is an improvement over the algorithm of Chen et al.
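A much-simplified sketch of the labeled partial digest idea (ignoring measurement error and the machinery behind the O(n² log n) bound): fragments still attached to the labeled end give the site positions directly, and the remaining pairwise fragment lengths can then be checked for consistency, which is what makes the reconstruction unique:

```python
from collections import Counter

def reconstruct_sites(labeled_lengths, all_fragments):
    """Error-free labeled-partial-digest sketch. `labeled_lengths` are
    the lengths of fragments containing the labeled end (i.e. distances
    from position 0 to each site); `all_fragments` is the full multiset
    of partial-digest fragment lengths. Returns the sorted site positions
    after verifying that every pairwise distance is accounted for."""
    sites = sorted(labeled_lengths)
    points = [0] + sites
    expected = Counter(points[j] - points[i]
                       for i in range(len(points))
                       for j in range(i + 1, len(points)))
    if expected != Counter(all_fragments):
        raise ValueError("digest data inconsistent with labeled fragments")
    return sites

# Sites at 2 and 5 from the labeled end: fragments 2 and 5 are labeled,
# and the unlabeled inter-site fragment has length 3.
sites = reconstruct_sites([2, 5], [2, 3, 5])
```

The paper's actual algorithms additionally sort and match fragments to tolerate bounded length errors, which this sketch does not attempt.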