800 results for software development
Abstract:
The irradiance profile around the receiver tube (RT) of a parabolic trough collector (PTC) is a key outcome of optical performance that affects the overall energy performance of the collector. Thermal performance evaluation of the RT relies on the appropriate determination of the irradiance profile. This article presents a technique in which empirical equations were developed to calculate the local irradiance as a function of angular position on the RT of a standard PTC, using a rigorously verified Monte Carlo ray tracing model. A wide range of test conditions, including direct normal insolation, spectral selective coatings and glass envelope conditions, was selected for this purpose from the data published by Dudley et al. [1]. The R² values of the equations are excellent, varying between 0.9857 and 0.9999. These equations can therefore be used confidently to produce a realistic non-uniform boundary heat flux profile around the RT at normal incidence for conjugate heat transfer analyses of the collector. The only inputs the equations require are the direct normal insolation and the spectral selective properties of the collector components. Since the equations are polynomial functions, data processing software can evaluate the flux profile easily and quickly. The ultimate goal of this research is to help make concentrating solar power technology cost competitive with conventional energy technology by facilitating its ongoing research.
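Because the fitted equations are simple polynomials, producing a flux profile reduces to one polynomial evaluation per angular position. The Python sketch below illustrates that workflow; the coefficients are hypothetical placeholders, not the paper's fitted values.

```python
# Minimal sketch (not the paper's actual fit): evaluating a polynomial
# irradiance profile around the receiver tube. The coefficients below are
# hypothetical placeholders; the paper derives its coefficients from direct
# normal insolation and the spectral selective properties of the components.
import numpy as np

# Hypothetical polynomial coefficients, highest order first (np.polyval order)
coeffs = [1.2e-8, -4.1e-6, 3.9e-4, 0.021, 5.3]

theta = np.linspace(0.0, 360.0, 361)      # angular position on the RT, degrees
local_flux = np.polyval(coeffs, theta)    # local irradiance, e.g. kW/m^2

# Export as a boundary-condition table for a conjugate heat transfer model
np.savetxt("rt_flux_profile.csv",
           np.column_stack([theta, local_flux]),
           delimiter=",", header="theta_deg,flux", comments="")
```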
Abstract:
Floods are among the most devastating events affecting tropical, archipelagic countries such as the Philippines. With current climate change predictions including rising sea levels, intensification of typhoon strength and a general increase in mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the increased flood risk does not translate into greater economic and human loss. Field work and data gathering were done within the framework of an internship at the former German Technical Cooperation (GTZ) in cooperation with the Local Government Unit of Ormoc City, Leyte, The Philippines, in order to develop a dynamic computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and to use the development process to determine the required data, their availability and their impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies and early warning systems. The project was developed using a 1D-2D coupled model in SOBEK (a Deltares hydrological modelling software package) and was also used as a case study to analyze and understand the influence of factors such as land use, schematization, time step size and tidal variation on the flood characteristics. Several sources of relevant satellite data were compared, including Digital Elevation Models (DEMs) from ASTER and SRTM data, satellite rainfall data from the GIOVANNI server (NASA) and field gauge data. Different methods were used to partially calibrate and validate the model, which was then used to simulate and study two climate change scenarios based on scenario A1B predictions. It was observed that large areas currently considered not prone to floods will become low flood risk areas (0.1-1 m water depth). Furthermore, larger sections of the floodplains upstream of Liloan's Bridge will become moderate flood risk areas (1-2 m water depth). The flood hazard maps created during this project will be presented to the LGU, and the model will be used by GTZ's Local Disaster Risk Management Department to create a larger set of possible flood-prone areas related to rainfall intensity and to study possible improvements to the current early warning system and monitoring of the basin section belonging to Ormoc City. Recommendations on further enhancement of the geo-hydro-meteorological data to improve the model's accuracy, mainly in areas of interest, will also be presented to the LGU.
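As a small illustration of how model output of this kind can be post-processed, the Python sketch below classifies simulated water depths into the hazard classes quoted above (0.1-1 m low risk, 1-2 m moderate risk). It is not part of the project's SOBEK workflow, and the class labels outside those two stated ranges are assumptions.

```python
# Minimal sketch, not part of the project's SOBEK workflow: classifying
# simulated water depths into the hazard classes cited in the abstract
# (0.1-1 m = low risk, 1-2 m = moderate risk). Labels outside those two
# ranges are assumptions added for illustration.
import numpy as np

def classify_flood_hazard(depth_m: np.ndarray) -> np.ndarray:
    """Map a grid of water depths (m) to integer hazard classes."""
    classes = np.zeros_like(depth_m, dtype=int)       # 0 = not flooded (< 0.1 m)
    classes[(depth_m >= 0.1) & (depth_m < 1.0)] = 1   # low risk
    classes[(depth_m >= 1.0) & (depth_m <= 2.0)] = 2  # moderate risk
    classes[depth_m > 2.0] = 3                        # high risk (assumed label)
    return classes

depths = np.array([[0.05, 0.4], [1.5, 2.8]])
print(classify_flood_hazard(depths))   # [[0 1] [2 3]]
```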
Abstract:
The parabolic trough concentrating collector is the most mature, proven and widespread technology for large-scale exploitation of solar energy in medium-temperature applications. Assessment of the opportunities and possibilities of the collector system relies on its optical performance. A reliable Monte Carlo ray tracing model of a parabolic trough collector was developed using Zemax software. The optical performance of an ideal collector depends on the solar spectral distribution and sunshape, and on the spectral selectivity of the associated components. Therefore, each step of the model, including the spectral distribution of the solar energy, trough reflectance, glazing anti-reflection coating and the absorber selective coating, is explained and verified. Two basic outputs of the optical simulation, the radiation flux distribution around the receiver and the optical efficiency, are calculated using the model and verified against a widely accepted analytical profile and measured values, respectively; very good agreement is obtained. Further investigations analyse the characteristics of the radiation distribution around the receiver tube at different insolation levels, envelope conditions and receiver selective coatings, and the impact of light scattered from the receiver surface on the efficiency. The model also has the capability to analyse optical performance under variable sunshape, tracking error, collector imperfections including absorber misalignment with the focal line and absorber de-focus, different rim angles, and geometric concentrations. The current optical model can play a significant role in understanding the optical aspects of a trough collector and can be employed to extract useful information on optical performance. In the long run, this optical model will pave the way for the construction of a low-cost standalone photovoltaic and thermal hybrid collector in Australia for small-scale domestic hot water and electricity production.
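For readers unfamiliar with the approach, the sketch below conveys the Monte Carlo idea in miniature, not the Zemax model itself: each sampled ray survives mirror reflection, interception, envelope transmission and absorption with probabilities set by component properties, and the surviving fraction estimates the optical efficiency. All property values shown are hypothetical.

```python
# Minimal sketch of the Monte Carlo idea behind such a model (not the
# paper's Zemax implementation). Each ray survives mirror reflection,
# interception by the receiver, glass envelope transmission and absorber
# absorption with probabilities given by spectral-average component
# properties; the surviving fraction estimates the optical efficiency.
# All property values here are hypothetical.
import random

def mc_optical_efficiency(n_rays=100_000,
                          rho_mirror=0.93,   # trough reflectance (assumed)
                          gamma=0.98,        # intercept factor (assumed)
                          tau_glass=0.96,    # envelope transmittance (assumed)
                          alpha_abs=0.95):   # absorber absorptance (assumed)
    absorbed = 0
    for _ in range(n_rays):
        if (random.random() < rho_mirror and
                random.random() < gamma and
                random.random() < tau_glass and
                random.random() < alpha_abs):
            absorbed += 1
    return absorbed / n_rays

print(f"Estimated optical efficiency: {mc_optical_efficiency():.3f}")
```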
Abstract:
A Remote Sensing Core Curriculum (RSCC) development project is currently underway, conducted under the auspices of the National Center for Geographic Information and Analysis (NCGIA). The RSCC is an outgrowth of the NCGIA GIS Core Curriculum project and grew out of discussions begun at NCGIA Initiative 12 (I-12): 'Integration of Remote Sensing and Geographic Information Systems'. This curriculum development project focuses on providing professors, teachers and instructors in undergraduate and graduate institutions with course materials prepared by experts in specific subject-matter areas for use in the classroom.
Abstract:
The reporting and auditing of patient dose is an important component of radiotherapy quality assurance. The manual extraction of dose-volume metrics is time-consuming and undesirable when auditing the dosimetric quality of a large cohort of patient plans. A dose assessment application was written to overcome this, allowing the calculation of various dose-volume metrics for large numbers of plans exported from treatment planning systems. This application expanded on the DICOM-handling functionality of the MCDTK software suite. The software extracts dose values in the volume of interest using a ray casting point-in-polygon algorithm, where the polygons are defined by the contours in the RTSTRUCT file...
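The ray casting point-in-polygon test mentioned above is a standard algorithm; a minimal Python sketch (not the MCDTK implementation) is shown below.

```python
# Minimal sketch of the ray casting point-in-polygon test described above
# (not the MCDTK implementation). A horizontal ray is cast from the query
# point; an odd number of edge crossings means the point lies inside the
# contour polygon.
def point_in_polygon(x, y, polygon):
    """polygon: list of (x, y) vertices, e.g. one RTSTRUCT contour slice."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the ray's y level, and does the crossing
        # lie to the right of the query point?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(point_in_polygon(5, 5, square))   # True
print(point_in_polygon(12, 5, square))  # False
```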
Abstract:
It is argued that, for sustainable STEM education and knowledge investment, a human-centered learning design approach is critical and important. Sustainability in this context means the enduring maintenance of technological trajectories for productive economic and social interactions, demonstrated through life-critical scenarios, life-critical system development and life experiences. Technology influences the way of life and the learning and teaching process. Social software application development is more than learning how to program a software application and extracting information from the Internet. Hence, our research challenge is: how do we attract learners to STEM social software application development? Our investigation began by comparing Science and Technology education in developed (e.g., Australia) and developing (e.g., Sri Lanka) countries, with particular attention to final-year undergraduates' industry-ready training programmes. Principal components analysis was performed to identify patterns of important factors. To measure behavioural intention in terms of perceived usefulness of, and attitudes towards, the training, the measurement model was tested for validity and reliability using partial least squares (PLS) analysis within structural equation modelling (SEM). Our observation is that the relationship is more complex than we argued. Our initial conclusion is that life-critical system development and life-experience trajectories are determinant factors, while technological influences are unavoidable. Further investigation should examine correlations between a human-centered learning design approach and economic development in the long run.
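As a small illustration of the first analysis step named above, the Python sketch below runs a principal components analysis over hypothetical Likert-scale survey responses; it is not the study's actual analysis, and the data are random placeholders.

```python
# Minimal sketch, not the study's actual analysis: principal components
# analysis over hypothetical Likert-scale survey responses to identify
# patterns of important factors. Uses scikit-learn; data are placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(120, 10)).astype(float)  # 120 students x 10 items

pca = PCA(n_components=3)
scores = pca.fit_transform(responses)

# Variance explained indicates how many components are worth retaining
print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
# Loadings show which survey items drive each component
print("Component 1 loadings:", np.round(pca.components_[0], 2))
```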
Abstract:
As organisations increasingly engage in the selection, purchase, and adoption of packaged software products, how these activities are carried out in practice becomes increasingly relevant for researchers and practitioners. Our focus in this paper is to propose a framework for understanding the packaged software selection process. The functionalist literature in this area suggests a number of generic recommendations, which are based on rational assumptions about the process and view the decision making that takes place as producing the "best technology solution". To explore this, we conducted a longitudinal, in-depth study of packaged software selection in a small organisation. For interpretation of the case, we draw upon the Social Construction of Technology, a theoretical framework arguing that technology is socially constituted and regarding the process of development as contradictory and uncertain. We offer a number of contributions. First, we further the understanding of packaged software selection with our critique of the functionalist literature, drawing insights from the emerging critical/constructivist literature and expanding the domain of interest to encompass the wider environment. Second, we weave this together with our experiences in the field, drawing on social constructivism for theoretical support, to develop a framework of packaged software selection that shows how various actors shape the process.
Abstract:
Introduction A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practice radiographic techniques independently outside the usual radiography laboratory. Methods A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy. Results Students reported high levels of satisfaction and perceived value and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared to the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results of this work indicated superior performance with the equipment for the VR trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills. Conclusions Students practising projection techniques virtually performed better at role play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment.
Abstract:
Students explored variation and expectation in a probability activity at the end of the first year of a three-year longitudinal study across grades 4-6. The activity involved experiments in tossing coins, both manually and in simulation using the graphing software TinkerPlots. Initial responses indicated that the students were aware of uncertainty, although their understanding of chance concepts appeared limited. Predictions of the outcomes of 10 tosses reflected an intuitive notion of equiprobability, with little awareness of variation. Understanding of the relationship between experimental and theoretical probability did not emerge until multiple outcomes and representations were generated with the software.
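The essence of the activity can be reproduced in a few lines of code; the Python sketch below (standing in for TinkerPlots) shows the variation between sets of 10 tosses alongside the convergence of the experimental proportion of heads towards the theoretical 0.5.

```python
# Minimal sketch (in Python rather than TinkerPlots) of the activity's idea:
# repeated sets of 10 coin tosses show variation around the theoretical
# expectation of 5 heads, while the overall experimental proportion of
# heads approaches the theoretical 0.5 as trials accumulate.
import random

def toss_sets(n_sets=20, tosses_per_set=10):
    return [sum(random.random() < 0.5 for _ in range(tosses_per_set))
            for _ in range(n_sets)]

counts = toss_sets()
print("Heads per set of 10:", counts)                           # variation between sets
print("Overall proportion:", sum(counts) / (len(counts) * 10))  # near 0.5
```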
Abstract:
Partial evaluation of infrastructure investments has resulted in expensive mistakes, unsatisfactory outcomes and increased uncertainty for too many stakeholders, communities and economies in both developing and developed nations. Complex Stakeholder Perception Mapping (CSPM) is a novel approach that can address these limitations by inclusively framing, capturing and mapping the spectrum of stakeholder insights and perceptions using extended Geographic Information Systems. Maps generated in CSPM present flexibly combined, complex stakeholder perceptions of multiple aspects of development. CSPM extends the application of GIS software to non-spatial mapping, extends Multi-Criteria Analysis with a multidimensional evaluation platform, and augments the capability of decision science to address complexity. Application of CSPM can improve local and regional economic gains from infrastructure projects and aid any multi-objective, multi-stakeholder decision situation.
Abstract:
If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. In order to be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast six agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.
Abstract:
This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software "bugs", the failure history of the software system in the various phases of its lifecycle, reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process are all considered in varying degrees of detail. We also discuss the notion of software fault tolerance, methods of achieving it, and the status of other measures of software dependability such as maintainability, availability and safety.
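To make the class of reliability growth models surveyed here concrete, the sketch below evaluates one well-known example, the Goel-Okumoto NHPP model, in which the expected number of failures observed by time t is m(t) = a(1 - e^(-bt)); the parameter values used are hypothetical.

```python
# Minimal illustration of one well-known reliability growth model of the
# class surveyed here: the Goel-Okumoto NHPP, where the expected number of
# failures observed by test time t is m(t) = a * (1 - exp(-b * t)), with
# a = total expected failures and b = per-fault detection rate. The
# parameter values below are hypothetical.
import math

def goel_okumoto_mean(t, a=120.0, b=0.05):
    """Expected cumulative failures observed by test time t."""
    return a * (1.0 - math.exp(-b * t))

a = 120.0
for t in (10, 50, 100):
    observed = goel_okumoto_mean(t, a=a)
    # Remaining-error estimate for the operational phase: a - m(t)
    print(f"t={t:4d}h  observed={observed:6.1f}  remaining={a - observed:6.1f}")
```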
Abstract:
The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies. It is driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in that context and took advantage of considerable recent achievements in genomics that are shaping our comprehension of human and animal DNA. The basic idea behind the project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the number of successful spawners, an important parameter in fisheries stock assessment. The broad objectives of this study were to: (1) critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne), by (a) conducting comprehensive computer simulations and (b) analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus); (2) lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and (3) produce software for the calculation of Ne and make it widely available. The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some had recently been implemented with the latest statistical methods (e.g., a Bayesian framework; Berthier, Beaumont et al. 2002), while others had lower profiles (e.g., Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as the basis for simulation experiments evaluating the performance of the methods with an individual-based model of a prawn population. Following the guidelines suggested by the computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis at eight microsatellite loci in three successive spring spawning seasons in 2001, 2002 and 2003. As predicted by the simulations, the estimates had non-infinite upper confidence limits, which is a major achievement for the application of the method to a naturally occurring, short-generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1,000 individuals in two consecutive years. This contrasts with about 500,000 prawns participating in spawning. Because it is not possible to distinguish successful from non-successful spawners, we suggest a high level of protection for the entire spawning population. We interpret the difference between the numbers of successful and non-successful spawners as reflecting large variation in the number of surviving offspring per family: a large number of families have no surviving offspring, while a few have many. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, assuming the ratio between Ne and spawning population size has previously been calculated for the species. Alternatively, it can be a surrogate for recruitment, again assuming that the ratio between Ne and recruitment has previously been determined. The number of species that can be analysed in this way, however, is likely to be small because of the species-specific life history requirements that must be satisfied for accuracy.
The most universal approach would be to integrate Ne with spawning stock-recruitment models, so that these models are more accurate when applied to fisheries populations. A pathway to achieve this was established in this project, which we predict will significantly improve fisheries sustainability in the future. Regardless of the success of integrating Ne into spawning stock-recruitment models, Ne could be used as a fisheries monitoring tool: declines in spawning stock size, or increases in natural or harvest mortality, would be reflected by a decline in Ne. This would be valuable for data-poor fisheries and provides fishery-independent information; however, we suggest a species-by-species approach, as some species may be too numerous, or experience too much migration, for the method to work. During the project, two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with the collection of preliminary genetic data from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in genomics are rapid, and a cheaper, more reliable substitute for microsatellite loci in this technology is already available: digital data from single nucleotide polymorphisms (SNPs) are likely to supersede 'analogue' microsatellite data, making it cheaper and easier to apply the method to species with large population sizes.
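For illustration, the sketch below implements the standard temporal (allele-frequency change) estimator of Ne in the Nei-Tajima / Waples formulation, one of the family of estimators that projects like this evaluate; it is not necessarily NeEstimator's exact implementation, and the input data are hypothetical.

```python
# Minimal sketch of the temporal method for estimating effective population
# size from measured genetic drift (standard Nei-Tajima / Waples formulation;
# not necessarily the exact NeEstimator implementation). Allele frequencies
# and sample sizes below are hypothetical.
def temporal_ne(p0, pt, t, s0, st):
    """p0, pt: allele frequency lists at generations 0 and t;
    s0, st: numbers of individuals sampled at each time."""
    # Nei-Tajima standardised variance of allele frequency change
    fc = sum((x - y) ** 2 / ((x + y) / 2 - x * y)
             for x, y in zip(p0, pt)) / len(p0)
    # Subtract sampling noise, then convert the drift signal to Ne
    f_drift = fc - 1 / (2 * s0) - 1 / (2 * st)
    return t / (2 * f_drift) if f_drift > 0 else float("inf")

p0 = [0.60, 0.25, 0.15]   # hypothetical frequencies, season 1
pt = [0.55, 0.30, 0.15]   # hypothetical frequencies, season 2
print(f"Ne estimate: {temporal_ne(p0, pt, t=1, s0=200, st=200):.0f}")
```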
Abstract:
Forage budgeting, land condition monitoring and maintaining ground cover residuals are critical management practices for the long-term sustainability of the northern grazing industry. The aim of this project is to conduct a preliminary investigation into industry need, feasibility and willingness to adopt a simple-to-use hand-held hardware device, with compatible, integrated software applications, that producers can use in the paddock to assist with land condition monitoring and forage budgeting for better Grazing Land Management and to assist with compliance.
Abstract:
Free and Open Source Software (FOSS) has gained increased interest in the computer software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed development teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using the public data. This thesis presents a theoretical background for software quality and metrics and their application in a FOSS environment. The information available from FOSS projects in three information spaces is described, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics, and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available; in particular, the projects vary in their growth rate, complexity, modularity and team structure.
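As a flavour of what automatic process-metric collection can look like, the sketch below counts commits and distinct authors from a project's public version control history using the standard git command line; the repository path is a placeholder.

```python
# Minimal sketch of automatic process-metric collection of the kind the
# thesis describes: counting commits and distinct committers from a
# project's public version control history. Assumes a local git clone;
# the repository path is a placeholder.
import subprocess
from collections import Counter

def commit_authors(repo_path):
    """Return one author name per commit in the repository's history."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:%an"],
        capture_output=True, text=True, check=True)
    return out.stdout.splitlines()

authors = commit_authors("/path/to/foss-project")   # placeholder path
counts = Counter(authors)
print(f"{len(authors)} commits by {len(counts)} distinct authors")
print("Top contributors:", counts.most_common(5))   # team-structure signal
```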