944 results for Free Software
Abstract:
This report describes the available functionality and use of the ClusterEval evaluation software. It implements novel and standard measures for the evaluation of cluster quality. This software has been used at the INEX XML Mining track and in the MediaEval Social Event Detection task.
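The report itself defines ClusterEval's measures; purely as an illustration of the kind of standard cluster-quality measure such software implements, here is a minimal purity computation (this is not ClusterEval's API; all names are illustrative):

```python
from collections import Counter

def purity(clusters, labels):
    """Purity: each cluster is credited with its majority ground-truth
    label; the score is the fraction of correctly credited items."""
    total = len(labels)
    correct = 0
    for c in set(clusters):
        members = [labels[i] for i in range(total) if clusters[i] == c]
        correct += Counter(members).most_common(1)[0][1]
    return correct / total

clusters = [0, 0, 0, 1, 1, 1]
labels = ['a', 'a', 'b', 'b', 'b', 'a']
print(purity(clusters, labels))  # 4 of 6 items match their cluster's majority label
```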
Abstract:
This paper describes the content and delivery of a software internationalisation subject (ITN677) that was developed for Master of Information Technology (MIT) students in the Faculty of Information Technology at Queensland University of Technology. This elective subject introduces students to the strategies, technologies, techniques and current developments associated with this growing 'software development for the world' specialty area. Students learn what is involved in planning and managing a software internationalisation project, as well as designing, building and using a software internationalisation application. Students also learn how a software internationalisation project must fit into an overall product localisation and globalisation strategy that may include culturalisation, tailored system architectures, and reliance upon industry standards. In addition, students are exposed to the different software development techniques used by organisations in this arena and to the perils and pitfalls of managing software internationalisation projects.
Abstract:
The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well-suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems, which use a combination of computational hardware such as CPUs and GPUs, are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function.
This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally-intensive aspects of the implicit time stepping scheme are implemented on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes are required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
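The abstract above notes that inexact Newton-Krylov solvers only need the action of the Jacobian on a vector, which can be approximated from residual evaluations alone. A minimal sketch of that matrix-free Jacobian-vector product (the toy residual function and step size are illustrative, not the thesis code):

```python
import numpy as np

def jacobian_vector_product(F, u, v, eps=1e-7):
    """Approximate J(u) @ v with two residual evaluations, as used by
    matrix-free Newton-Krylov methods: J v ~ (F(u + eps*v) - F(u)) / eps."""
    return (F(u + eps * v) - F(u)) / eps

# Toy residual F(u) = [u0^2 - u1, u0 + u1]; exact J = [[2*u0, -1], [1, 1]]
F = lambda u: np.array([u[0]**2 - u[1], u[0] + u[1]])
u = np.array([2.0, 1.0])
v = np.array([1.0, 0.0])
print(jacobian_vector_product(F, u, v))  # ~ [4, 1], the exact J @ v
```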
Abstract:
In this paper we analyse the effects of highway traffic flow parameters, such as vehicle arrival rate and density, on the performance of Amplify and Forward (AF) cooperative vehicular networks along a multi-lane highway under the free flow state. We derive analytical expressions for connectivity performance and verify them with Monte-Carlo simulations. When AF cooperative relaying is employed together with Maximum Ratio Combining (MRC) at the receivers, the average route error rate shows a 10-20 fold improvement compared to direct communication. A 4-8 fold increase in the maximum number of traversable hops can also be observed at different vehicle densities when AF cooperative communication is used to strengthen communication routes. However, the theoretical upper bound on the maximum number of hops promises even higher performance gains.
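As a toy illustration of why MRC improves error rates (ignoring the noise amplification of AF relaying, which the paper's analysis accounts for): the post-combining SNR under MRC is the sum of the per-branch SNRs, so the error rate of, for example, BPSK on an AWGN channel drops sharply when a relayed copy is combined with the direct signal:

```python
import math

def mrc_snr(branch_snrs):
    """Post-combining SNR under maximum ratio combining is the
    sum of the per-branch SNRs."""
    return sum(branch_snrs)

def bpsk_ber(snr):
    """BPSK bit error rate Q(sqrt(2*snr)) = 0.5*erfc(sqrt(snr)),
    for linear (not dB) SNR on an AWGN channel."""
    return 0.5 * math.erfc(math.sqrt(snr))

direct = bpsk_ber(4.0)                    # single direct link
combined = bpsk_ber(mrc_snr([4.0, 4.0]))  # direct + one relayed copy
print(direct / combined)                  # multi-fold error-rate improvement
```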
Abstract:
The papers in this issue focus our attention on packaged software as an increasingly important, but still relatively poorly understood, phenomenon in the information systems research community. The topic is not new: Lucas et al. (1988) wrote a provocative piece focused on the issues with implementing packaged software. A decade later, Carmel (1997) argued that packaged software was both ideally suited to American entrepreneurial activity and rapidly growing. The information systems research community, however, has moved more slowly to engage this change (e.g., Sawyer, 2001). The papers in this special issue represent a significant step in better engaging the issues of packaged software relative to information systems research, and in highlighting opportunities for additional relevant research.
Abstract:
Enterprise resource planning (ERP) software is a dominant approach for dealing with legacy information system problems. In order to avoid invalidating maintenance and development support from the ERP vendor, most organizations reengineer their business processes in line with those implicit within the software. Regardless, some customization is typically required. This paper presents two case studies of ERP projects where customizations have been performed. The case analysis suggests that while customizations can give true organizational benefits, careful consideration is required to determine whether a customization is viable given its potential impact upon future maintenance. Copyright © 2001 John Wiley & Sons, Ltd.
Abstract:
Invasion waves of cells play an important role in development, disease and repair. Standard discrete models of such processes typically involve simulating cell motility, cell proliferation and cell-to-cell crowding effects in a lattice-based framework. The continuum-limit description is often given by a reaction–diffusion equation that is related to the Fisher–Kolmogorov equation. One of the limitations of a standard lattice-based approach is that real cells move and proliferate in continuous space and are not restricted to a predefined lattice structure. We present a lattice-free model of cell motility and proliferation, with cell-to-cell crowding effects, and we use the model to replicate invasion wave-type behaviour. The continuum-limit description of the discrete model is a reaction–diffusion equation with a proliferation term that is different from lattice-based models. Comparing lattice-based and lattice-free simulations indicates that both models lead to invasion fronts that are similar at the leading edge, where the cell density is low. Conversely, the two models make different predictions in the high density region of the domain, well behind the leading edge. We analyse the continuum-limit descriptions of the lattice-based and lattice-free models to show that both give rise to invasion wave-type solutions that move with the same speed but have very different shapes. We explore the significance of these differences by calibrating the parameters in the standard Fisher–Kolmogorov equation using data from the lattice-free model. We conclude that estimating parameters using this kind of standard procedure can produce misleading results.
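The Fisher–Kolmogorov equation referred to above, du/dt = D u_xx + λu(1 − u), admits invasion fronts with minimum wave speed 2√(Dλ). A minimal explicit finite-difference sketch of such a front (parameter values are illustrative, not the paper's):

```python
import numpy as np

# Fisher-Kolmogorov: du/dt = D*u_xx + lam*u*(1 - u)
D, lam = 1.0, 1.0
dx, dt = 0.5, 0.05                # dt < dx^2 / (2*D) for stability
x = np.arange(0.0, 200.0, dx)
u = np.where(x < 10.0, 1.0, 0.0)  # initially invaded region on the left

def step(u):
    """One explicit Euler step of the reaction-diffusion equation."""
    uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    uxx[0] = uxx[-1] = 0.0        # crude fixed ends; the front stays interior
    return u + dt * (D * uxx + lam * u * (1.0 - u))

for _ in range(1000):             # integrate to t = 50
    u = step(u)

# The point where the density drops through 0.5 tracks the front;
# it advances at close to the minimum wave speed 2*sqrt(D*lam) = 2.
front = x[np.argmax(u < 0.5)]
print(front)
```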
Abstract:
Modernized GPS and GLONASS, together with the new GNSS systems BeiDou and Galileo, offer code and phase ranging signals on three or more carriers. Traditionally, dual-frequency code and/or phase GPS measurements are linearly combined to eliminate the effects of ionospheric delays in various positioning and analysis tasks. This typical treatment has limitations in processing signals at three or more frequencies from more than one system, and can hardly be adapted to cope with the growing variety of receivers offering a broad range of signals. In this contribution, a generalized positioning model that is independent of the navigation system and of the number of carriers is proposed, suitable for both single- and multi-site data processing. For the synchronization of different signals, uncalibrated signal delays (USD) are defined in a general way to compensate for the signal-specific offsets in code and phase signals, respectively. In addition, ionospheric delays are included in the parameterization with careful consideration. Based on an analysis of the algebraic structure, this generalized positioning model is further refined with a set of proper constraints to regularize the datum deficiency of the observation equation system. With this new model, uncalibrated signal delays and ionospheric delays are derived for both GPS and BeiDou with a large data set. Numerical results demonstrate that, with a limited number of stations, the uncalibrated code delays (UCD) are determined to a precision of about 0.1 ns for GPS and 0.4 ns for BeiDou signals, while the uncalibrated phase delays (UPD) for L1 and L2 are generated with 37 stations evenly distributed in China for GPS with a consistency of about 0.3 cycles. Additional experiments concerning the performance of this novel model in point positioning with mixed frequencies from mixed constellations are analyzed, in which the USD parameters are fixed to our generated values.
The results are evaluated in terms of both positioning accuracy and convergence time.
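For reference, the traditional dual-frequency ionosphere-free combination that the abstract contrasts with the generalized model can be sketched as follows. The first-order ionospheric delay scales as 1/f², so the combination cancels it exactly (toy numbers; GPS L1/L2 frequencies):

```python
# Ionosphere-free combination of dual-frequency code pseudoranges.
f1, f2 = 1575.42e6, 1227.60e6    # GPS L1 and L2 carrier frequencies, Hz

def iono_free(P1, P2):
    """Cancel the first-order (1/f^2) ionospheric delay by linearly
    combining the L1 and L2 pseudoranges."""
    return (f1**2 * P1 - f2**2 * P2) / (f1**2 - f2**2)

# Toy example: geometric range plus a 1/f^2 ionospheric delay
rho, I1 = 20_000_000.0, 5.0      # metres; I1 = ionospheric delay on L1
P1 = rho + I1
P2 = rho + I1 * (f1 / f2)**2     # L2 delay is larger by (f1/f2)^2
print(iono_free(P1, P2))         # recovers rho, the iono-free range
```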
Abstract:
The Hepatitis C virus (HCV) affects some 150 million people worldwide. However, unlike hepatitis A and B there is no vaccination for HCV and approximately 75% of people exposed to HCV develop chronic hepatitis. In Australia, around 226,700 people live with chronic HCV infection costing the government approximately $252 million per year. Historically, the standard approved/licenced treatment for HCV is pegylated interferon with ribavirin. There are major drawbacks with interferon-based therapy including side effects, long duration of therapy, limited access and affordability. Our previous survey of an at-risk population reported HCV treatment coverage of only 5%. Since April 2013, a new class of interferon-free treatments for chronic HCV is subsidised under the Pharmaceutical Benefits Scheme: boceprevir and telaprevir - estimated to cost the Australian Government in excess of $220 million over five years. Other biologic interferon-free therapeutic agents are scheduled to enter the Australian market. Use of small molecule generic pharmaceuticals has been advocated as a means of public cost savings. However, with the new biologic agents, generics (biosimilars) may not be feasible or straightforward, due to long patent life; marketing exclusivity; and regulatory complexity for these newer products.
Abstract:
Packaged software is pre-built with the intention of licensing it to users in domestic settings and work organisations. This thesis focuses upon the work organisation, where packaged software has been characterised as one of the latest 'solutions' to the problems of information systems. The study investigates the packaged software selection process that has, to date, been largely viewed as objective and rational. In contrast, this interpretive study is based on a 2½-year field study of organisational experiences with packaged software selection at T.Co, a consultancy organisation based in the United Kingdom. Emerging from the iterative process of case study and action research is an alternative theory of packaged software selection. The research argues that packaged software selection is far from the rationalistic and linear process that previous studies suggest. Instead, the study finds that aspects of the traditional process of selection incorporating the activities of gathering requirements, evaluation and selection based on 'best fit' may or may not take place. Furthermore, even where these aspects occur they may not have equal weight or impact upon implementation and usage as may be expected. This is due to the influence of the multiple realities which originate from the organisational and market environments within which packages are created, selected and used, the lack of homogeneity in organisational contexts, and the variously interpreted characteristics of the package in question.
Abstract:
A numerical investigation of free convection heat transfer in a differentially heated trapezoidal cavity filled with a non-Newtonian power-law fluid has been performed in this study. The left inclined surface is uniformly heated whereas the right inclined surface is uniformly cooled. The top and bottom surfaces are kept adiabatic, with initially quiescent fluid inside the enclosure. The finite volume based commercial software FLUENT 14.5 is used to solve the governing equations. The dependence of the fluid flow and heat transfer on various parameters is analyzed, including the Rayleigh number, Ra, ranging from 10^5 to 10^7, the Prandtl number, Pr, from 100 to 10,000, and the power-law index, n, from 0.6 to 1.4. Outcomes are reported in terms of isotherms, streamlines, and local Nusselt number for various Ra, Pr, n and inclination angles. A grid sensitivity analysis is performed, and the numerically obtained results have been compared with results available in the literature, with good agreement.
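The power-law (Ostwald–de Waele) constitutive relation underlying such a study, μ_app = K·γ̇^(n−1), makes the fluid shear-thinning for n < 1, Newtonian for n = 1 and shear-thickening for n > 1; a minimal sketch over the abstract's range of n (values illustrative):

```python
def apparent_viscosity(K, n, shear_rate):
    """Ostwald-de Waele power-law fluid: mu_app = K * gamma_dot**(n - 1),
    with consistency index K and power-law index n."""
    return K * shear_rate ** (n - 1)

K = 1.0
for n in (0.6, 1.0, 1.4):   # shear-thinning, Newtonian, shear-thickening
    low = apparent_viscosity(K, n, 0.1)    # low shear rate
    high = apparent_viscosity(K, n, 10.0)  # high shear rate
    print(n, low, high)  # viscosity falls, stays flat, or rises with shear
```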
Abstract:
In this paper, a framework for isolating unprecedented faults for an EGR valve system is presented. Using normal behavior data generated by a high fidelity engine simulation, the recently introduced Growing Structure Multiple Model System (GSMMS) is used to construct models of normal behavior for an EGR valve system and its various subsystems. Using the GSMMS models as a foundation, anomalous behavior of the entire system is then detected as statistically significant departures of the most recent modeling residuals from the modeling residuals during normal behavior. By reconnecting anomaly detectors to the constituent subsystems, the anomaly can be isolated without the need for prior training using faulty data. Furthermore, faults that were previously encountered (and modeled) are recognized using the same approach as the anomaly detectors.
Abstract:
In this paper, a recently introduced model-based method for precedent-free fault detection and isolation (FDI) is modified to deal with multiple input, multiple output (MIMO) systems and is applied to an automotive engine with exhaust gas recirculation (EGR) system. Using normal behavior data generated by a high fidelity engine simulation, the growing structure multiple model system (GSMMS) approach is used to construct dynamic models of normal behavior for the EGR system and its constituent subsystems. Using the GSMMS models as a foundation, anomalous behavior is detected whenever statistically significant departures of the most recent modeling residuals away from the modeling residuals displayed during normal behavior are observed. By reconnecting the anomaly detectors (ADs) to the constituent subsystems, EGR valve, cooler, and valve controller faults are isolated without the need for prior training using data corresponding to particular faulty system behaviors.
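The detection principle described above, flagging statistically significant departures of recent modeling residuals from those observed during normal behavior, can be sketched with a simple threshold test on the residual mean (a stand-in illustration, not the GSMMS statistic):

```python
import statistics

def anomaly_detector(normal_residuals, k=3.0):
    """Build a detector from normal-behaviour residuals: flag a window of
    recent residuals whose mean departs more than k standard errors from
    the normal-behaviour mean."""
    mu = statistics.mean(normal_residuals)
    sigma = statistics.stdev(normal_residuals)
    def detect(recent):
        se = sigma / len(recent) ** 0.5
        return abs(statistics.mean(recent) - mu) > k * se
    return detect

# Residuals recorded while the (toy) subsystem behaves normally
normal = [0.1, -0.2, 0.05, 0.0, -0.1, 0.15, -0.05, 0.08, -0.12, 0.02]
detect = anomaly_detector(normal)
print(detect([0.0, 0.1, -0.1, 0.05]))   # in-family residuals -> False
print(detect([0.9, 1.1, 1.0, 0.95]))    # large residual bias -> True
```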
Abstract:
Organ motion as a result of respiration is an important field of research for medical physics. Knowledge of the magnitude and direction of this motion is necessary to allow for more accurate radiotherapy treatment planning. This will result in higher doses to the tumour whilst sparing healthy tissue. This project involved human trials, where the radiation therapy patient's kidneys were CT scanned under three different conditions: whilst free breathing (FB), breath-hold at normal tidal inspiration (BHIN), and breath-hold at normal tidal expiration (BHEX). The magnitude of motion was measured by recording the outline of the kidney from a Beam's Eye View (BEV). The centre of mass of this 2D shape was calculated for each set using "ImageJ" software, and the magnitude of movement was determined from the change in the centroid's coordinates between the BHIN and BHEX scans. For the left and right kidneys respectively, the movement ranged from 4-46mm and 2-44mm in the superior/inferior (axial) plane, 1-21mm and 2-16mm in the anterior/posterior (coronal) plane, and 0-6mm and 0-8mm in the lateral/medial (sagittal) plane. From exhale to inhale, the kidneys tended to move inferiorly, anteriorly and laterally. A standard radiotherapy plan, designed to treat the para-aortics with opposed lateral fields, was performed on the free breathing (planning) CT set. The field size and arrangement were set up using the same parameters for each subject. The prescription was to deliver 45 Gray in 25 fractions. This field arrangement and prescription were then copied over to the breath-hold CT sets, and the dosimetric differences were compared using Dose Volume Histograms (DVH). The point of comparison for the three sets was recorded as the percentage volume of kidney receiving less than or equal to 10 Gray. The QUASAR respiratory motion phantom was used with the range of motion determined from the human study. The phantom was imaged, planned and treated with a linear accelerator, with dose determined by film.
The effect of the motion was measured by the change in the penumbra of the film and compared to the penumbra from the treatment planning system.
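The centroid-shift measurement described above can be sketched independently of ImageJ using the shoelace formula for the centroid of a polygon outline (toy coordinates, not patient data):

```python
def polygon_centroid(pts):
    """Centroid of a simple polygon via the shoelace formula."""
    a = cx = cy = 0.0
    n = len(pts)
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6.0 * a), cy / (6.0 * a)

# Toy kidney outlines (mm) from two breath-hold scans
inhale = [(0, 0), (40, 0), (40, 60), (0, 60)]
exhale = [(3, -12), (43, -12), (43, 48), (3, 48)]   # shifted outline
dx = polygon_centroid(exhale)[0] - polygon_centroid(inhale)[0]
dy = polygon_centroid(exhale)[1] - polygon_centroid(inhale)[1]
print(round(dx, 1), round(dy, 1))  # 3.0 -12.0
```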
Abstract:
Conservation of free-ranging cheetah (Acinonyx jubatus) populations is multifaceted and needs to be addressed from an ecological, biological and management perspective. There is a wealth of published research, each study focusing on a particular aspect of cheetah conservation. Identifying the most important factors, making sense of various (and sometimes contrasting) findings, and taking decisions when little or no empirical data are available, are everyday challenges facing conservationists. Bayesian networks (BNs) provide a statistical modeling framework that enables analysis and integration of information addressing different aspects of conservation. There has been increased interest in the use of BNs to model conservation issues; however, the development of more sophisticated BNs, utilizing object-oriented (OO) features, is still at the frontier of ecological research. We describe an integrated, parallel modeling process followed during a BN modeling workshop held in Namibia to combine expert knowledge and data about free-ranging cheetahs. The aim of the workshop was to obtain a more comprehensive view of the current viability of the free-ranging cheetah population in Namibia, and to predict the effect different scenarios may have on the future viability of this population. A complementary aim was to identify influential parameters of the model in order to more effectively target those parameters having the greatest impact on population viability. The BN was developed by aggregating diverse perspectives from local and independent scientists, agents from the national ministry, conservation agency members and local fieldworkers. This integrated BN approach facilitates OO modeling in a multi-expert context which lends itself to a series of integrated, yet independent, subnetworks describing different scientific and management components.
We created three subnetworks in parallel: a biological, ecological and human factors network, which were then combined to create a complete representation of free-ranging cheetah population viability. Such OOBNs have widespread relevance to the effective and targeted conservation management of vulnerable and endangered species.