915 results for software creation methodology


Relevance: 20.00%

Abstract:

Competency standards document the knowledge, skills, and attitudes required for competent performance. This study develops competency standards for dietitians in order to substantiate an approach to competency standard development. Focus groups explored the current and emerging purpose, role, and function of the profession, and the findings were used to draft competency standards. Consensus was then sought using two rounds of a Delphi survey. Seven focus groups were conducted with 28 participants (15 employers/practitioners, 5 academics, 8 new graduates). Eighty-two of 110 invited experts participated in round one, and 67 experts completed round two. Four major functions of dietitians were identified, including being a professional, influencing the health of individuals, groups, communities, and populations through evidence-based nutrition practice, and working collaboratively in teams. Overall, there was a high level of consensus on the standards: 93% of the standards achieved agreement among participants in round one, and all revised standards achieved consensus in round two. The methodology provides a framework for other professions wishing to embark on competency standard review or development.

Relevance: 20.00%

Abstract:

DAQ00131 project activities aimed to: conserve tropical grains germplasm under long-term storage conditions; acquire new germplasm with unique traits of interest to clients (particularly breeding programs); maintain germplasm through viability testing and regeneration; and increase awareness among clients of the availability of tropical grains germplasm. New project goals were to facilitate the creation of the national grains Genetic Resources Centre (GRC), including training GRC staff in the use of GRIN-Global (GG), a germplasm management software system, so that grains data can be nationalised across Australia, and to contribute to developing the action plan for the relocation of tropical grains germplasm to both Tamworth and Horsham.

Relevance: 20.00%

Abstract:

The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis, and a neural network model for lameness detection were developed.

Automatic milking has become common practice in dairy husbandry; in 2006, about 4,000 farms worldwide used over 6,000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production, and growing labour costs. As the level of automation increases, the time that the cattle keeper spends monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily.

Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health, and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour intensive as an on-farm method, and the results are subjective.

A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia. The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated.

To develop an expert system that automatically detects lameness cases, a model was needed; a probabilistic neural network (PNN) classifier was chosen for the task. The data were divided into two parts: 5,074 measurements from 37 cows were used to train the model, and its ability to detect lameness was evaluated on a validation dataset of 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and in a real-time lameness monitoring system.
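
As an illustration of the modelling approach, the sketch below implements a minimal probabilistic neural network (Parzen-window) classifier in Python. The feature layout (four leg loads plus a kick count) and the smoothing parameter are hypothetical assumptions for the example, not the values used in the thesis.

# Minimal probabilistic neural network (Parzen-window) classifier,
# sketching the kind of model described above. The feature layout and
# smoothing parameter are illustrative assumptions, not the thesis's values.
import numpy as np

class PNN:
    def __init__(self, sigma=0.5):
        self.sigma = sigma  # Gaussian kernel width (smoothing parameter)

    def fit(self, X, y):
        # Store training patterns per class; a PNN "trains" in one pass.
        self.classes = np.unique(y)
        self.patterns = {c: X[y == c] for c in self.classes}
        return self

    def predict(self, X):
        scores = np.stack([self._density(X, c) for c in self.classes], axis=1)
        return self.classes[np.argmax(scores, axis=1)]

    def _density(self, X, c):
        # Mean Gaussian kernel response of X against all class-c patterns.
        P = self.patterns[c]
        d2 = ((X[:, None, :] - P[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2 * self.sigma ** 2)).mean(axis=1)

# Illustrative usage: rows = milkings; columns = e.g. mean load on each
# of the four legs plus the number of kicks (hypothetical feature layout).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 5))
y_train = rng.integers(0, 2, size=100)   # 0 = sound, 1 = lame
model = PNN(sigma=0.5).fit(X_train, y_train)
print(model.predict(rng.normal(size=(3, 5))))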

Relevance: 20.00%

Abstract:

The past decade has brought a proliferation of statistical genetic (linkage) analysis techniques, incorporating new methodology and/or improvement of existing methodology in gene mapping, specifically targeted towards the localization of genes underlying complex disorders. Most of these techniques have been implemented in user-friendly programs and made freely available to the genetics community. Although certain packages may be more 'popular' than others, a common question asked by genetic researchers is 'which program is best for me?'. To help researchers answer this question, the following software review aims to summarize the main advantages and disadvantages of the popular GENEHUNTER package.

Relevance: 20.00%

Abstract:

The rapid change of technological trends in the marketplace has made it more difficult for social marketers to reach consumers through traditional marketing channels (Della et al., 2008). Social marketing interventions have typically relied on conventional supporting products and services, such as water counters for water conservation or condoms for safe sex. Recently, however, social marketers have witnessed the diminishing effectiveness of these traditional social products and services in encouraging the uptake and maintenance of behaviour. In light of the technological trends in the marketplace and the diminishing effectiveness of previous social products and services (Lefebvre, 2009), social marketers have been encouraged to look to alternative means of delivering valuable offerings...

Relevance: 20.00%

Abstract:

A very concise and diversity-oriented approach to rapidly access frondosin-related frameworks from commercially available building blocks is outlined.

Relevance: 20.00%

Abstract:

Background: Excessive speed is a primary contributing factor to young novice road trauma, including intentional and unintentional speeds above posted limits or too fast for conditions. The objective of this research was to conduct a systematic review of recent investigations into novice drivers' speed selection, with particular attention to applications and limitations of theory and methodology.

Method: Systematic searches of peer-reviewed and grey literature were conducted during September 2014. Abstract reviews identified 71 references potentially meeting the selection criteria: investigations since the year 2000 into factors that influence (directly or indirectly) the actual speed (i.e., behaviour or performance) of young (age <25 years) and/or novice (recently licensed) drivers.

Results: Full-paper reviews resulted in 30 final references: 15 focused on intentional speeding and 15 on broader speed selection. Both sets identified a range of individual (e.g., beliefs, personality) and social (e.g., peer, adult) influences, were predominantly theory-driven, and applied cross-sectional designs. Intentional speeding investigations largely utilised self-reports, while the other investigations more often included actual driving (simulated or 'real world'). The latter also identified cognitive workload and external environment influences, as well as targeted interventions.

Discussion and implications: Applications of theory have shifted the novice speed-related literature beyond a simplistic focus on intentional speeding as human error. The potential emerged to develop a 'grand theory' of intentional speeding and to fill gaps in understanding broader speed selection influences. This includes the need for future investigations of vehicle-related and physical-environment influences, and for methodologies that move beyond cross-sectional designs and rely less on self-reports.

Relevance: 20.00%

Abstract:

NeEstimator v2 is a completely revised and updated implementation of software that produces estimates of contemporary effective population size, using several different methods and a single input file. NeEstimator v2 includes three single-sample estimators (updated versions of the linkage disequilibrium and heterozygote-excess methods, and a new method based on molecular coancestry), as well as the two-sample (moment-based temporal) method. New features include: (i) an improved method for accounting for missing data; (ii) options for screening out rare alleles; (iii) confidence intervals for all methods; (iv) the ability to analyse data sets with large numbers of genetic markers (10,000 or more); (v) options for batch processing large numbers of different data sets, which will facilitate cross-method comparisons using simulated data; and (vi) correction of temporal estimates when sampled individuals are not removed from the population (Plan I sampling). The user is given considerable control over input data and over the composition and format of output files. The freely available software has a new Java interface and runs under MacOS, Linux, and Windows.
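
To give a sense of the single-sample linkage-disequilibrium method, the sketch below inverts the standard random-mating expectation E(r^2) ~ 1/(3Ne) + 1/S, where S is the number of individuals sampled. This is a toy illustration of the underlying idea only, not NeEstimator v2's actual algorithm, which adds bias corrections, allele screening, and confidence intervals.

# Toy illustration of the single-sample linkage-disequilibrium method:
# under random mating, E(r^2) ~= 1/(3*Ne) + 1/S, so a rough point
# estimate inverts the sample-size-corrected mean r^2 across locus pairs.
# Not NeEstimator v2's actual algorithm.

def ld_ne_estimate(mean_r2: float, sample_size: int) -> float:
    """Rough LD-based Ne from mean squared allelic correlation."""
    r2_signal = mean_r2 - 1.0 / sample_size  # subtract sampling-noise component
    if r2_signal <= 0:
        return float("inf")  # LD indistinguishable from sampling noise
    return 1.0 / (3.0 * r2_signal)

# Example: mean r^2 of 0.025 over all locus pairs, 50 individuals sampled.
print(f"Ne ~ {ld_ne_estimate(0.025, 50):.0f}")  # -> Ne ~ 67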

Relevance: 20.00%

Abstract:

Information sharing in distance collaboration: a software engineering perspective. Factors in software engineering workgroups such as geographical dispersion and background discipline can be conceptually characterized as "distances", and they are obstructive to team collaboration and information sharing. This thesis focuses on information sharing across multidimensional distances and develops an information sharing distance model with six core dimensions: geography, time zone, organization, multi-discipline, heterogeneous roles, and varying project tenure. The research suggests that the effectiveness of workgroups may be improved through mindful conduct of information sharing, especially proactive consideration of, and explicit adjustment for, the distances of the recipient when sharing information.
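
A hypothetical sketch of the six-dimension model as a data structure is given below; the field names follow the abstract, but the 0..1 normalization and the naive aggregate score are illustrative assumptions, not the thesis's formalization.

# Hypothetical sketch of the six-dimension information sharing distance
# model as a data structure. Field names follow the abstract; the 0..1
# normalization and the naive mean aggregate are illustrative assumptions.
from dataclasses import dataclass, fields

@dataclass
class SharingDistance:
    geography: float            # physical separation from the recipient
    time_zone: float            # gap in working-hours overlap
    organization: float         # organizational boundaries crossed
    multi_discipline: float     # difference in background discipline
    heterogeneous_roles: float  # difference in project roles
    project_tenure: float       # difference in time spent on the project

    def overall(self) -> float:
        """Naive aggregate: mean of the normalized (0..1) dimensions."""
        values = [getattr(self, f.name) for f in fields(self)]
        return sum(values) / len(values)

# Example: a recipient in another time zone and discipline, same site.
d = SharingDistance(0.0, 0.8, 0.2, 0.9, 0.5, 0.3)
print(f"overall distance: {d.overall():.2f}")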

Relevance: 20.00%

Abstract:

The article describes a new method for obtaining a holographic image of desired magnification, consistent with the stipulated criteria for its resolution and aberrations.

Relevance: 20.00%

Abstract:

Reuse of existing, carefully designed and tested software improves the quality of new software systems and reduces their development costs. Object-oriented frameworks provide an established means for software reuse on the levels of both architectural design and concrete implementation. Unfortunately, due to framework complexity, which typically results from frameworks' flexibility and overall abstract nature, there are severe problems in using frameworks. Patterns are generally accepted as a convenient way of documenting frameworks and their reuse interfaces.

In this thesis it is argued, however, that mere static documentation is not enough to solve the problems related to framework usage. Instead, proper interactive assistance tools are needed in order to enable systematic framework-based software production. This thesis shows how patterns that document a framework's reuse interface can be represented as dependency graphs, and how dynamic lists of programming tasks can be generated from those graphs to assist the process of using a framework to build an application. This approach to framework specialization combines the ideas of framework cookbooks and task-oriented user interfaces. Tasks provide assistance in (1) creating new code that complies with the framework reuse interface specification, (2) assuring the consistency between existing code and the specification, and (3) adjusting existing code to meet the terms of the specification.

Besides illustrating how task-orientation can be applied in the context of using frameworks, this thesis describes a systematic methodology for modeling any framework reuse interface in terms of software patterns based on dependency graphs. The methodology shows how framework-specific reuse interface specifications can be derived from a library of existing reusable pattern hierarchies. Since the methodology focuses on reusing patterns, it also alleviates the recognized problem of framework reuse interface specifications becoming complicated and unmanageable for frameworks of realistic size.

The ideas and methods proposed in this thesis have been tested through implementing a framework specialization tool called JavaFrames. JavaFrames uses role-based patterns that specify a framework's reuse interface to guide framework specialization in a task-oriented manner. This thesis reports the results of case studies in which JavaFrames and the hierarchical framework reuse interface modeling methodology were applied to the Struts web application framework and the JHotDraw drawing editor framework.
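
The idea of generating a task list from a dependency graph can be sketched with a topological sort, as below. The graph contents are hypothetical specialization tasks invented for the example; JavaFrames' own role-based pattern representation is considerably richer.

# Illustrative sketch: a framework reuse interface documented as a
# dependency graph of specialization tasks, from which an ordered task
# list is generated. The tasks here are hypothetical examples.
from graphlib import TopologicalSorter

# task -> set of tasks it depends on (hypothetical example graph)
reuse_interface = {
    "subclass AbstractDocument": set(),
    "implement createView()": {"subclass AbstractDocument"},
    "register the document type": {"subclass AbstractDocument"},
    "bind the view to a toolbar action": {"implement createView()",
                                          "register the document type"},
}

# A topological order is a valid sequence of programming tasks; as tasks
# are completed they can be removed and the list regenerated dynamically.
for task in TopologicalSorter(reuse_interface).static_order():
    print("TODO:", task)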

Relevance: 20.00%

Abstract:

Free and Open Source Software (FOSS) has gained increased interest in the computer software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed development teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using the public data. This thesis presents a theoretical background for software quality and metrics and their application in a FOSS environment. The information available from FOSS projects in three information spaces is presented, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics, and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available. In particular, projects vary in their growth rate, complexity, modularity, and team structure.
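
As a minimal sketch of what automatic process-metric collection can look like, the snippet below counts commits per author from a project's version control history as a crude team-structure indicator. It assumes a local git checkout and is only an illustration, not the metric suite used in the thesis.

# Minimal sketch of automatic process-metric collection: commit counts
# per author from a project's git history, a crude team-structure
# indicator. Assumes git is installed and repo_path is a local checkout.
import subprocess
from collections import Counter

def commits_per_author(repo_path: str) -> Counter:
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--format=%ae"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(line for line in log.splitlines() if line)

# Example: the five most active committers of a checked-out repository.
for author, count in commits_per_author(".").most_common(5):
    print(f"{count:6d}  {author}")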

Relevance: 20.00%

Abstract:

Free and open source software development is an alternative to traditional software engineering as an approach to the development of complex software systems. It is a way of developing software based on geographically distributed teams of volunteers, without an apparent central plan or traditional mechanisms of coordination. The purpose of this thesis is to summarize the current knowledge about free and open source software development and to explore ways in which further understanding of it could be gained. The results of research in the field, as well as the research methods, are introduced and discussed. The adaptation of software process metrics to the context of free and open source software development is also illustrated, and the possibilities of utilizing them as tools to validate other research are discussed.

Relevance: 20.00%

Abstract:

Certain software products employing digital techniques for the encryption of data are subject to export controls in the EU Member States pursuant to Community law and the relevant laws of the Member States. These controls are agreed globally in the framework of the so-called Wassenaar Arrangement. Wassenaar is an informal non-proliferation regime aimed at promoting international stability and responsibility in transfers of strategic (dual-use) products and technology. This thesis covers the provisions of Wassenaar, Community export control law, and the export control laws of Finland, Sweden, Germany, France, and the United Kingdom. It consists of five chapters.

The first chapter discusses the rationale of export control laws and the impact they have on global trade. The rationale is originally defence-related: in general, to prevent potential adversaries of the participating States from having the same tools, and in the particular case of cryptographic software, to enable signals intelligence efforts. As the use of cryptography in civilian contexts has mushroomed, export restrictions increasingly have negative effects on civilian trade. Information security solutions may also be too weak because of export restrictions on cryptography.

The second chapter covers the OECD's Cryptography Policy, which had a significant effect on its member nations' national cryptography policies and legislation. The OECD is a significant organization because it acts as a meeting forum for the most important industrialized nations.

The third chapter covers the Wassenaar Arrangement, viewed from the standpoint of international law and politics. The Wassenaar control list provisions affecting cryptographic software transfers are also covered in detail. Control lists in the EU and in the Member States are usually copied directly from the Wassenaar control lists. Controls agreed in its framework set only a minimum level for participating States; Wassenaar countries can adopt stricter controls.

The fourth chapter covers Community export control law. Export controls are viewed in Community law as falling within the domain of the Common Commercial Policy pursuant to Article 133 of the EC Treaty. The Community therefore has exclusive competence in export matters, save where a national measure is authorized by the Community or falls under the foreign or security policy derogations established in Community law. The Member States still have a considerable amount of power in the domain of the Common Foreign and Security Policy, and they are able to maintain national export controls because export control laws are not fully harmonized. This can have detrimental effects on the functioning of the internal market and on common export policies. In 1995 the EU adopted Dual-Use Regulation 3381/94/EC, which sets common rules for exports in the Member States; the provisions of this regulation receive detailed coverage in this chapter.

The fifth chapter covers national legislation and export authorization practices in five Member States: Finland, Sweden, Germany, France, and the United Kingdom. The export control laws of these Member States are covered where they differ from the uniform approach of the Community's acquis communautaire.

Keywords: export control, encryption, software, dual-use, license, foreign trade, e-commerce, Internet

Relevance: 20.00%

Abstract:

Modern Christian theology has long been troubled by the schism between the Bible and theology, and between biblical studies and systematic theology. Brevard Springs Childs is one of the biblical scholars who attempt to dismantle this "iron curtain" separating the two disciplines. The present thesis aims at analyzing Childs' concept of theological exegesis in the canonical context, employing the method of systematic analysis.

The thesis consists of seven chapters. The first chapter is the introduction. The second chapter identifies the most important elements influencing Childs' methodology of biblical theology by sketching his academic development over his career. The third chapter addresses the crucial question of why and how the concept of the canon is so important for Childs' methodology of biblical theology. In chapter four I analyze why and how Childs is dissatisfied with historical-critical scholarship, and I point out the differences and similarities between his canonical approach and historical criticism. The fifth chapter discusses Childs' central concepts of theological exegesis by investigating whether a Christocentric approach is an appropriate way of creating a unified biblical theology. In the sixth chapter I present a critical evaluation and methodological reflection on Childs' theological exegesis in the canonical context. The final chapter sums up the key points of Childs' methodology of biblical theology.

The basic results of this thesis are as follows. First, the fundamental elements of Childs' theological thinking are rooted in the Reformed theological tradition and in modern theological neo-orthodoxy, above all in its most prominent theologian, Karl Barth. The American Biblical Theology Movement and the controversy between Protestant liberalism and conservatism in the modern American context cultivated his theological sensitivity and position.

Second, Childs attempts to dispel the negative influences of the historical-critical method by establishing canon-based theological exegesis leading to a confessional biblical theology. He employs terminology such as canonical intentionality, the wholeness of the canon, the canon as the most appropriate context for doing biblical theology, and the continuity of the two Testaments in order to put his canonical program into effect. Childs demonstrates forcefully the inadequacies of the historical-critical method for creating biblical theology in biblical hermeneutics, doctrinal theology, and pastoral practice. His canonical approach endeavors to establish a post-critical Christian biblical theology, and it works within the traditional framework of faith seeking understanding.

Third, Childs' biblical theology has a double task, descriptive and constructive: the former connects biblical theology with exegesis, the latter with dogmatic theology. He attempts to use a comprehensive model that combines a thematic investigation of the essential theological contents of the Bible with a systematic analysis of the contents of the Christian faith. Childs also attempts to unite Old Testament theology and New Testament theology into one unified biblical theology.

Fourth, some problematic points of Childs' thinking need to be mentioned. For instance, his emphasis on the final form of the text of the biblical canon is highly controversial, yet Childs firmly believes in it and even regards it as the cornerstone of his biblical theology. The relationship between the canon and the doctrine of biblical inspiration is weak: he does not clearly define whether Scripture is God's word or whether it only "witnesses" to it. Childs' concepts of "the word of God" and "divine revelation" remain unclear, and their ontological status is ambiguous.

Childs' theological exegesis in the canonical context is a new attempt in the modern history of Christian theology. It expresses his sincere effort to create a path for doing biblical theology. Certainly, it was only a modest beginning of a long process.