876 results for Computer software - Development
Abstract:
APSIM-ORYZA is a new functionality developed in the APSIM framework to simulate rice production while addressing management issues such as fertilisation and transplanting, which are particularly important in Korean agriculture. To validate the model for Korean rice varieties and field conditions, the measured yields and flowering times from three field experiments conducted by the Gyeonggi Agricultural Research and Extension Services (GARES) in Korea were compared against the simulated outputs for different management practices and rice varieties. Simulated yields of early-, mid- and mid-to-late-maturing varieties of rice grown in a continuous rice cropping system from 1997 to 2004 showed close agreement with the measured data. Similar results were also found for yields simulated under seven levels of nitrogen application. When different transplanting times were modelled, simulated flowering times ranged from within 3 days of the measured values for the early-maturing varieties, to up to 9 days after the measured dates for the mid- and especially mid-to-late-maturing varieties. This was associated with highly variable simulated yields which correlated poorly with the measured data. This suggests the need to accurately calibrate the photoperiod sensitivity parameters of the model for the photoperiod-sensitive rice varieties in Korea.
Abstract:
Wilmot Senaratne, Bill Palmer and Bob Sutherst recently published their paper 'Applications of CLIMEX modelling leading to improved biological control' in Proceedings of the 16th Australian Weeds Conference. They looked at three examples where modern climate matching techniques using computer software produce better decisions and results than might be obtained using earlier techniques such as climadiagrams. Assessment of climatic suitability is important at various stages of a biological control project: from initial foreign exploration, to risk assessment in preparation for the release of a particular agent, through to selection of release sites that maximise the agent's chances of initial establishment. It is now also necessary to predict potential future distributions of both target weeds and agents under climate change.
Abstract:
Introduction: Extreme heat events (both heat waves and extremely hot days) are increasing in frequency and duration globally and cause more deaths in Australia than any other extreme weather event. Numerous studies have demonstrated a link between extreme heat events and an increased risk of morbidity and death. In this study, the researchers sought to identify whether extreme heat events in the Tasmanian population were associated with any changes in emergency department admissions to the Royal Hobart Hospital (RHH) for the period 2003-2010. Methods: Non-identifiable RHH emergency department data and climate data from the Australian Bureau of Meteorology were obtained for the period 2003-2010. Statistical analyses were conducted using the statistical software 'R', with the distributed lag non-linear model (DLNM) package used to fit a quasi-Poisson generalised linear regression model. Results: The study showed that the relative risk (RR) of admission to RHH during 2003-2010 was significantly elevated at temperatures above 24 °C, with a lag effect lasting 12 days and the main effect noted one day after the extreme heat event. Discussion: This study demonstrated that extreme heat events have a significant impact on public hospital admissions. Two limitations were identified: admissions data rather than presentations data were used, and further analysis could be done to compare types of admissions and presentations between heat and non-heat events. Conclusion: With the impacts of climate change already being felt in Australia, public health organisations in Tasmania and the rest of Australia need to implement adaptation strategies that enhance resilience and protect the public from the adverse health effects of heat events and climate change.
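The study fitted its model with the dlnm package in R; purely as an illustration of the modelling idea, here is a minimal Python sketch of a quasi-Poisson regression with simple unconstrained lag terms standing in for the DLNM cross-basis. The file and column names are hypothetical, not from the study.

```python
# Minimal sketch: quasi-Poisson regression of daily admission counts on
# same-day and lagged maximum temperature. Hypothetical columns
# 'admissions' and 'tmax' in a daily time series.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("rhh_daily.csv")  # hypothetical file name

# Unconstrained lag terms for lags 1..12 days (the study reports a lag
# effect lasting 12 days); a real DLNM would use a smooth cross-basis.
for lag in range(1, 13):
    df[f"tmax_lag{lag}"] = df["tmax"].shift(lag)
df = df.dropna()

lag_cols = ["tmax"] + [f"tmax_lag{lag}" for lag in range(1, 13)]
X = sm.add_constant(df[lag_cols])

# Quasi-Poisson: a Poisson GLM with the dispersion estimated from the
# Pearson chi-square statistic (scale="X2") rather than fixed at 1.
model = sm.GLM(df["admissions"], X, family=sm.families.Poisson())
result = model.fit(scale="X2")
print(result.summary())
```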
Abstract:
Flood extent mapping is a basic tool for flood damage assessment, and it can be performed with digital classification techniques using satellite imagery, including data recorded by radar and optical sensors. However, converting the data into the information we need is not a straightforward task. One of the great challenges in interpreting the data is to separate permanent water bodies from flooded regions, including both fully inundated areas and wet areas where trees and houses are partly covered with water. This paper adopts a decision fusion technique to combine the mapping results from radar data with NDVI data derived from optical data. An improved capacity to distinguish permanent or semi-permanent water bodies from flood-inundated areas has been achieved. The software tools Multispec and Matlab were used.
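The NDVI used here is the standard ratio of near-infrared and red reflectance. As a rough sketch of the kind of fusion described, assuming co-registered arrays and a purely hypothetical labelling rule and threshold (the paper's actual fusion logic is not reproduced):

```python
# Sketch: fuse a radar-derived water mask with an optical NDVI mask.
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), per pixel."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

def fuse(radar_water, nir, red, ndvi_threshold=0.1):
    """Label pixels: 0 = dry; 1 = open water confirmed by both sensors;
    2 = radar-detected water with a residual vegetation signal, e.g.
    partly submerged trees or houses. Threshold is a hypothetical value."""
    optical_water = ndvi(nir, red) < ndvi_threshold
    labels = np.zeros(radar_water.shape, dtype=np.uint8)
    labels[radar_water & optical_water] = 1
    labels[radar_water & ~optical_water] = 2
    return labels
```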
Abstract:
This article draws on the design and implementation of three mobile learning projects introduced by Flanagan in 2011, 2012 and 2014, engaging a total of 206 participants. The latest of these projects is highlighted in this article; the two other projects provide additional examples of innovative strategies engaging mobile and cloud systems, describing how electronic and mobile technology can help facilitate teaching and learning, assessment for learning and assessment as learning, and support communities of practice. The second section explains the theoretical premise supporting the implementation of technology and promulgates a hermeneutic phenomenological approach. The third section discusses mobility, both in terms of the exploration of wearable technology in the prototypes developed as a result of the projects, and the affordances of mobility within pedagogy. Finally, the quantitative and qualitative methods in place to evaluate m-learning are explained.
Abstract:
The open access (OA) model for journals is compared to the open source principle for computer software. Since the early 1990s, nearly 1,000 OA scientific journals have emerged, mostly as voluntary community efforts, although recently some professionally operating publishers have adopted author charges or institutional membership. This study of OA journals without author charges shows that their impact is still relatively small, but awareness of them is increasing. The average number of research articles per year is lower than for major scientific journals, but publication times are shorter.
Abstract:
Although previous research has recognised adaptation as a central aspect of relationships, the adaptation of the sales process to the buying process has not been studied. Furthermore, the linking of relationship orientation as the mindset with adaptation as the strategy and means has not been elaborated upon in previous research. Adaptation in the context of relationships has mostly been studied in relationship marketing. In sales and sales management research, adaptation has been studied with reference to personal selling. This study focuses on adaptation of the sales process to strategically match it to the buyer's mindset and buying process. The purpose of this study is to develop a framework for strategic adaptation of the seller's sales process to match the buyer's buying process in a business-to-business context, in order to make sales processes more relationship oriented. To arrive at a holistic view of adaptation of the sales process during relationship initiation, both the seller and the buyer are included in the extensive case analysed in the study. However, the selected perspective is primarily that of the seller, and the level focused on is that of the sales process. The epistemological perspective adopted is constructivism. The study is qualitative, applying a retrospective case study in which the main sources of information are in-depth semi-structured interviews with key informants representing the counterparts at the seller and the buyer in the software development and telecommunications industries. The main theoretical contributions of this research involve targeting a new area at the crossroads of relationship marketing, sales and sales management, and buying and purchasing by studying adaptation in a business-to-business context from a new perspective. Primarily, this study contributes to research in sales and sales management with reference to relationship orientation and strategic sales process adaptation. The research fills three gaps: firstly, it links the relationship orientation mindset with adaptation as strategy; secondly, it extends adaptation in sales from adaptation in selling to strategic adaptation of the sales process; thirdly, it extends adaptation to include the facilitation of adaptation. The approach applied in the study, systematic combining, is characterised by continuously moving back and forth between theory and empirical data. The framework that emerges, in which linking mindset with strategy and means forms a central aspect, includes three layers: the purchasing portfolio, seller-buyer relationship orientation, and strategic sales process adaptation. Linking the three layers enables an analysis of where sales process adaptation can make a contribution. Furthermore, implications for managerial use are demonstrated, for example how sellers can avoid the 'trap' of ad hoc adaptation. This includes involving the company, embracing the buyer's purchasing portfolio, understanding the current position that the seller holds in this portfolio, and possibly educating the buyer about the advantages of adopting a relationship-oriented approach.
Abstract:
Heat exchanger design is a complex task involving the selection of a large number of interdependent design parameters. There are no established general techniques for optimizing the design, though a few earlier attempts provide computer software based on gradient methods, case study methods, etc. The authors felt that it would be useful to determine the nature of the optimal and near-optimal feasible designs before devising an optimization technique. Therefore, in this article they obtain a large number of feasible designs of shell and tube heat exchangers, all intended to perform a given heat duty, by an exhaustive search method, and study how the capital and operating costs of these designs vary. The study reveals several interesting aspects of the dependence of capital and total costs on the various design parameters. The authors consider a typical shell and tube heat exchanger used in an oil refinery; its heat duty, inlet temperature and other details are given.
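A minimal sketch of the kind of exhaustive search described: enumerate a grid of design parameters, discard infeasible combinations, and rank the rest by cost. The parameter ranges, feasibility check and cost model below are hypothetical stand-ins, not the authors' correlations.

```python
# Exhaustive search over shell-and-tube design parameters
# (hypothetical ranges and cost model for illustration only).
from itertools import product
from math import pi

tube_diameters  = [0.016, 0.020, 0.025]      # m
tube_lengths    = [2.44, 3.66, 4.88, 6.10]   # m
tube_counts     = range(100, 1001, 50)
baffle_spacings = [0.2, 0.3, 0.4, 0.5]       # fraction of shell diameter

def feasible(d, length, n, b):
    """Stand-in feasibility check: require enough heat-transfer area
    for the given duty (real designs also check pressure drop, etc.)."""
    area = pi * d * length * n               # outside tube area, m^2
    return 150.0 <= area <= 500.0            # hypothetical duty window

def total_cost(d, length, n, b):
    """Stand-in cost model: capital cost grows with area, operating
    (pumping) cost falls as baffle spacing widens."""
    area = pi * d * length * n
    return 500.0 * area ** 0.8 + 2000.0 / b

designs = [
    (total_cost(d, length, n, b), d, length, n, b)
    for d, length, n, b in product(tube_diameters, tube_lengths,
                                   tube_counts, baffle_spacings)
    if feasible(d, length, n, b)
]
designs.sort()          # cheapest feasible designs first
print(designs[:5])      # inspect the optimal and near-optimal designs
```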
Abstract:
A fatigue crack growth rate study has been carried out on L-72 aluminium alloy plate specimens with and without cold worked holes. The cold worked specimens showed significantly increased fatigue life compared to the unworked specimens. Computer software was developed to evaluate the stress intensity factor for non-uniform stress distributions using a Green's function approach. The exponents of the Paris equation in the stable crack growth region for the cold worked and unworked specimens are 1.26 and 3.15, respectively. The reduction in exponent value indicates retardation of the crack growth rate. An SEM study indicated more plastic deformation at the edge of the hole for the unworked samples than for the worked samples during the crack initiation period.
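For reference, the Paris equation relates the crack growth rate per load cycle to the range of the stress intensity factor; with the exponents reported above, the lower cold worked exponent means the growth rate rises far more slowly as the stress intensity range increases:

```latex
\frac{da}{dN} = C \, (\Delta K)^{m},
\qquad m_{\text{cold worked}} = 1.26,
\qquad m_{\text{unworked}} = 3.15
```

Here a is the crack length, N the number of load cycles, ΔK the stress intensity factor range, and C, m the material constants fitted in the stable (region II) growth regime.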
Abstract:
Today's SoCs are complex designs with multiple embedded processors, memory subsystems, and application-specific peripherals. The memory architecture of an embedded SoC strongly influences the power and performance of the entire system; further, the memory subsystem constitutes a major part (typically up to 70%) of the silicon area of a current-day SoC. In this article, we address on-chip memory architecture exploration for DSP processors whose memory is organized as multiple banks, where banks can be single- or dual-ported with non-uniform bank sizes. We propose two different methods for physical memory architecture exploration and identify the strengths and applicability of these methods in a systematic way. Both methods address memory architecture exploration for a given target application by considering the application's data access characteristics, and both generate a set of Pareto-optimal design points that are interesting from a power, performance and VLSI area perspective. To the best of our knowledge, this is the first comprehensive work on memory space exploration at the physical memory level that integrates data layout and memory exploration to address the system objectives from both the hardware design and the application software development perspectives. Further, we propose an automatic framework that explores the design space, identifying hundreds of Pareto-optimal design points within a few hours of running on a standard desktop configuration.
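A minimal sketch of how Pareto-optimal points are extracted from an explored design space, assuming each candidate architecture has already been scored on power, performance (cycle count) and area, all to be minimized; the scoring itself is the exploration framework's job and is not reproduced here:

```python
# Pareto filtering of explored memory architectures. Each design point
# is a tuple of objectives to minimize: (power, cycles, area).

def dominates(a, b):
    """a dominates b if a is no worse in every objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset of the design points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical explored design points: (power mW, cycles, area mm^2)
explored = [(120, 9.0e6, 1.8), (95, 1.1e7, 1.5),
            (140, 8.5e6, 2.1), (95, 1.2e7, 1.9)]
print(pareto_front(explored))   # drops the dominated (95, 1.2e7, 1.9)
```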
Abstract:
In March 2012, the authors met at the National Evolutionary Synthesis Center (NESCent) in Durham, North Carolina, USA, to discuss approaches and cooperative ventures in Indo-Pacific phylogeography. The group emerged with a series of findings: (1) Marine population structure is complex, but single-locus mtDNA studies continue to provide a powerful first assessment of phylogeographic patterns. (2) These patterns gain greater significance and power when resolved in a diversity of taxa, and new analytical tools are emerging to address these analyses with multi-taxon approaches. (3) Genome-wide analyses are warranted if selection is indicated by surveys of standard markers. Such indicators can include discordance between genetic loci, or between genetic loci and morphology. Phylogeographic information provides a valuable context for studies of selection and adaptation. (4) Phylogeographic inferences are greatly enhanced by an understanding of the biology and ecology of the study organisms. (5) Thorough, range-wide sampling of taxa is the foundation for robust phylogeographic inference. (6) Congruent geographic and taxonomic sampling by the Indo-Pacific community of scientists would facilitate better comparative analyses. The group concluded that at this stage of technology and software development, judicious rather than wholesale application of genomics appears to be the most robust course for marine phylogeographic studies. Therefore, our group intends to affirm the value of traditional ('unplugged') approaches, such as those based on mtDNA sequencing and microsatellites, along with essential field studies, in an era of increasing emphasis on genomic approaches.
Abstract:
Reuse is at the heart of major improvements in productivity and quality in Software Engineering. Both Model Driven Engineering (MDE) and Software Product Line Engineering (SPLE) are software development paradigms that promote reuse. Specifically, they promote systematic reuse and a departure from craftsmanship towards an industrialization of the software development process. MDE and SPLE have established their benefits separately; their combination, here called Model Driven Product Line Engineering (MDPLE), gathers together the advantages of both. Nevertheless, this blending requires MDE to be recast in SPLE terms, which has implications for both the core assets and the software development process. The challenges are twofold: (i) models become central core assets from which products are obtained, and (ii) the software development process needs to cater for the changes that SPLE and MDE introduce. This dissertation proposes a solution to the first challenge following a feature-oriented approach, with an emphasis on reuse and early detection of inconsistencies. The second part is dedicated to assembly processes, a clear example of the complexity MDPLE introduces into software development processes. This work advocates a new discipline inside the general software development process, i.e., Assembly Plan Management, which raises the abstraction level and increases reuse in such processes. Different case studies illustrate the presented ideas.
Abstract:
Traditional software development captures the user needs during requirements analysis. The Web makes this endeavour even harder due to the difficulty of determining who these users are. In an attempt to tackle the heterogeneity of the user base, Web Personalization techniques have been proposed to guide the users' experience. In addition, Open Innovation allows organisations to look beyond their internal resources to develop new products or improve existing processes. This thesis sits in between, introducing Open Personalization as a means to incorporate actors other than webmasters in the personalization of web applications. The aim is to provide the technological basis that builds up a trustworthy environment in which webmasters and companion actors can collaborate, i.e. an "architecture of participation". Such an architecture very much depends on these actors' profiles. This work tackles three profiles (software partners, hobby programmers and end users) and proposes an "architecture of participation" tuned for each profile. Each architecture rests on different technologies: a .NET annotation library based on Inversion of Control for software partners, a Modding Interface in JavaScript for hobby programmers, and finally, a domain-specific language for end users. Proof-of-concept implementations are available for the three cases, while a quantitative evaluation is conducted for the domain-specific language.
Abstract:
The Alliance for Coastal Technology (ACT) convened a workshop on the in situ measurement of dissolved inorganic carbon species in natural waters in Honolulu, Hawaii, on February 16-18, 2005. The workshop was designed to summarize existing technologies for measuring the abundance and speciation of dissolved inorganic carbon and to make strategic recommendations for future development and application of these technologies to coastal research and management. The workshop was not focused on any specific technology; however, most of its attention was on in situ pCO2 sensors, given their recent development and use on moorings for the measurement of global carbon fluxes. In addition, the problems and limitations arising from the long-term deployment of systems designed for the measurement of pH, total dissolved inorganic carbon (DIC), and total alkalinity (TA) were discussed. Participants included researchers involved in carbon biogeochemistry, industry representatives, and coastal resource managers. The primary questions asked during the workshop were:
1. What are the major impediments to transforming presently used shipboard pCO2 measurement systems for use on cost-efficient moorings?
2. What are the major technical hurdles for the in situ measurement of TA and DIC?
3. What specific information do we need to coordinate efforts for 'proof of concept' testing of existing and new technologies, inter-calibration of those technologies, better software development, and more precise knowledge quantifying the geochemistry of dissolved inorganic carbon species, in order to develop an observing system for dissolved inorganic carbon?
Based on the discussion resulting from these three questions, the following statements were made:
Statement No. 1: Cost-effective, self-contained technologies for making long-term, accurate measurements of the partial pressure of CO2 gas in water already exist and at present are ready for deployment on moorings in coastal observing systems.
Statement No. 2: Cost-effective, self-contained systems for the measurement of pH, TA, and DIC are still needed to fully define both the carbonate chemistry of coastal waters and the fluxes of carbon between major biogeochemical compartments (e.g., air-sea, shelf-slope, water column-sediment, etc.).
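For reference, the carbonate-system quantities discussed above have standard definitions (TA shown in simplified form, neglecting minor acid-base species); measuring any two of pCO2, pH, DIC and TA, together with temperature, salinity and pressure, is sufficient to compute the remaining parameters, which is why self-contained TA and DIC systems would complete the picture that pCO2 sensors alone cannot:

```latex
\mathrm{DIC} = [\mathrm{CO_2^*}] + [\mathrm{HCO_3^-}] + [\mathrm{CO_3^{2-}}]
\qquad
\mathrm{TA} \approx [\mathrm{HCO_3^-}] + 2[\mathrm{CO_3^{2-}}]
          + [\mathrm{B(OH)_4^-}] + [\mathrm{OH^-}] - [\mathrm{H^+}]
```

Here [CO2*] denotes dissolved CO2 plus carbonic acid.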