10 results for Code-centric development

in Aston University Research Archive


Relevance: 30.00%

Abstract:

We present the prototype tool CADS* for the computer-aided development of an important class of self-* systems, namely systems whose components can be modelled as Markov chains. Given a Markov chain representation of the IT components to be included in a self-* system, CADS* automates or aids (a) the development of the artifacts necessary to build the self-* system; and (b) their integration into a fully operational self-* solution. This is achieved through a combination of formal software development techniques, including model transformation, model-driven code generation and dynamic software reconfiguration.
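
To make the input concrete, here is a minimal sketch of how an IT component might be modelled as a discrete-time Markov chain of the kind such a tool takes as input. This is not CADS* code; the states, transition probabilities, and function names are invented for illustration.

```python
import numpy as np

# Illustrative only: a three-state discrete-time Markov chain for a
# single IT component. State names and probabilities are invented.
states = ["OK", "DEGRADED", "FAILED"]
P = np.array([
    [0.95, 0.04, 0.01],  # transitions out of OK
    [0.30, 0.60, 0.10],  # transitions out of DEGRADED
    [0.50, 0.00, 0.50],  # transitions out of FAILED (repair)
])
assert np.allclose(P.sum(axis=1), 1.0), "each row must sum to 1"

def distribution_after(p0: np.ndarray, steps: int) -> np.ndarray:
    """Probability distribution over states after `steps` transitions."""
    return p0 @ np.linalg.matrix_power(P, steps)

p0 = np.array([1.0, 0.0, 0.0])  # component starts in OK
print(dict(zip(states, distribution_after(p0, 100).round(4))))
```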

Relevance: 30.00%

Abstract:

Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, they are not universally applicable and cannot provide adequate estimates of effort, and hence cost, for such projects. To address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing Mk II function point method. The other, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from its process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. It uses counts of various attributes of a JSD specification to develop a metric which indicates the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and using this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
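
As a sketch of the two-stage JSD-COCOMO idea (size the project from structure-chart level counts, then feed that size into COCOMO), the snippet below uses the standard Basic COCOMO organic-mode formula for the second stage. The sizing coefficients and level counts are invented placeholders, not the thesis's empirically derived model.

```python
# Stage 1 (hypothetical): size in delivered source instructions (DSI)
# from process-structure-chart level counts. The per-node factor is an
# invented placeholder standing in for the thesis's empirical model.
def estimated_dsi(level_counts: list[int], dsi_per_node: float = 120.0) -> float:
    """Total nodes across all chart levels, scaled by an assumed factor."""
    return sum(level_counts) * dsi_per_node

# Stage 2 (standard): Basic COCOMO, organic mode.
def cocomo_effort_person_months(dsi: float) -> float:
    """effort = 2.4 * KDSI^1.05 (Basic COCOMO, organic mode)."""
    kdsi = dsi / 1000.0
    return 2.4 * kdsi ** 1.05

dsi = estimated_dsi([3, 11, 27])  # invented level counts
print(f"{dsi:.0f} DSI -> {cocomo_effort_person_months(dsi):.1f} person-months")
```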

Relevance: 30.00%

Abstract:

It has been recognised for some time that a full code of amino acid-based recognition of DNA sequences would be useful. Several approaches, which utilise small DNA-binding motifs called zinc fingers, are presently employed. None of the current approaches successfully combines a combinatorial approach to the elucidation of a code with a single-stage high-throughput screening assay. The work outlined here describes the development of a model system for the study of DNA-protein interactions and of a high-throughput assay for the detection of such interactions. A zinc finger protein was designed to bind with high affinity and specificity to a known DNA sequence. In future work, the region of the zinc finger responsible for binding specificity can be mutated in order to observe the effect on the DNA-protein interactions. The zinc finger protein was initially synthesised as a His-tagged product. It was not possible, however, to develop a high-throughput assay using the His-tagged zinc finger protein. The gene encoding the zinc finger protein was therefore altered and the protein synthesised as a glutathione S-transferase (GST) fusion product. A successful assay was developed using the GST protein and Scintillation Proximity Assay technology (Amersham Pharmacia Biotech). The scintillation proximity assay is a dynamic assay that allows DNA-protein interactions to be studied in "real time". This assay not only provides a high-throughput method of screening zinc finger proteins for potential ligands but also allows the effect of adding reagents or competitor ligands to be monitored.
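
Not from the thesis: as a reminder of what such a binding assay quantifies, below is the textbook one-site equilibrium binding model relating protein concentration, dissociation constant Kd, and the fraction of labelled DNA bound. The Kd and concentrations are invented.

```python
# Standard one-site (Langmuir) equilibrium binding model, included only
# to illustrate what the assay measures. Values are invented.
def fraction_bound(protein_nM: float, kd_nM: float) -> float:
    """Fraction of labelled DNA bound at equilibrium (one-site model)."""
    return protein_nM / (kd_nM + protein_nM)

for p in (1, 10, 100, 1000):  # nM zinc finger protein
    print(f"{p:>5} nM -> {fraction_bound(p, kd_nM=50.0):.2f} bound")
```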

Relevance: 30.00%

Abstract:

High velocity oxyfuel (HVOF) thermal spraying is one of the most significant developments in the thermal spray industry since the original plasma spray technique. The first investigation employs the combustion and discrete-particle models within the general-purpose commercial CFD code FLUENT to solve the combustion of kerosene and to couple the motion of fuel droplets with the gas flow dynamics in a Lagrangian fashion. The effects of liquid fuel droplets on the thermodynamics of the combusting gas flow are examined thoroughly, showing that the combustion of kerosene is independent of the initial fuel droplet size. The second analysis concerns a full numerical model of the water cooling system, which can assist in optimising thermal performance or in determining the best method of heat removal without the cost of building physical prototypes. The numerical results indicate that the water flow rate and direction have a noticeable influence on the cooling efficiency but no noticeable effect on the gas flow dynamics within the thermal spraying gun. The third investigation deals with the development and implementation of discrete-phase particle models. The results indicate that most powder particles are not melted upon hitting the substrate to be coated. The oxidation model confirms that HVOF guns can produce metallic coatings with low oxidation within the typical stand-off distance of about 30 cm. Physical properties such as porosity, microstructure, surface roughness and adhesion strength of coatings produced by droplet deposition in a thermal spray process are determined to a large extent by the dynamics of deformation and solidification of the particles impinging on the substrate. One of the objectives of this study is therefore to present a complete numerical model of droplet impact and solidification. The modelling results show that solidification of droplets is significantly affected by the thermal contact resistance and the substrate surface roughness.
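
Not the FLUENT model itself: a minimal sketch of the Lagrangian particle-tracking idea the abstract refers to, using simple Stokes drag to relax a particle's velocity toward the gas velocity (real simulations use more general drag laws). All property values are invented placeholders.

```python
# Illustrative Lagrangian tracking of one particle under Stokes drag.
MU_GAS = 4e-5   # gas dynamic viscosity, Pa*s (assumed)
RHO_P = 2700.0  # particle density, kg/m^3 (assumed)
D_P = 30e-6     # particle diameter, m (assumed)

TAU = RHO_P * D_P**2 / (18.0 * MU_GAS)  # particle relaxation time, s

def step_velocity(v_p: float, u_gas: float, dt: float) -> float:
    """Explicit Euler update of dv/dt = (u_gas - v_p) / tau."""
    return v_p + dt * (u_gas - v_p) / TAU

v = 0.0
for _ in range(2000):  # particle accelerating in a 600 m/s gas jet
    v = step_velocity(v, u_gas=600.0, dt=1e-6)
print(f"particle velocity after 2 ms: {v:.0f} m/s")
```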

Relevance: 30.00%

Abstract:

The research reported here is an investigation into the problems of social and economic development of a multiethnic and multicultural country which has the added challenge of adopting a non-indigenous code to facilitate the development process. Malaysia's power to negotiate outcomes favourable to the interests of the country is critical for the successful attainment of the goals and objectives of VISION2020. The mechanisms of the human resource development programme therefore have to be efficacious. The three hypotheses of this study are as follows: 1. there is a fear that the problems and challenges posed by the development plans have been conceptually trivialised; 2. based on (1), there is a concern that the solutions proposed are inadequate and inappropriate; and 3. the outcome of both (1) and (2) can lead to the potential underachievement of national goals and objectives. The study proposes a complex model for conceptualising the problem, one which looks at the relationship between society and language and which any proposed solutions must take into proper consideration. The study examines the mechanisms available for the smooth absorption of new Malaysian members into new and international communities. A large-scale investigation was undertaken with the researcher functioning as a participant observer. An in-depth study of one particular educational ecology yielded approximately 38 hours of interviews and 100 questionnaires. These data were analysed both for explicit information and for implicit implications. By some criteria, national policies appear to be having the desired effect and can be given a clean bill of health. By others, it is clear that major adjustments would be necessary if the nation is to achieve its objectives in full. Based on the evidence gathered, the study proposes an apprenticeship approach to training programmes for the effective participation of new members in the new ecologies.

Relevance: 30.00%

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data flow fan-in/out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
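
As an illustration of what "redefining the metric counts for Prolog" can mean in practice, here is a toy sketch of one plausible re-definition of McCabe's cyclomatic complexity for Prolog. The thesis's actual counting rules are not reproduced; the rule below (count each extra clause of a predicate and each explicit disjunction or if-then as a decision point) is an assumption for illustration.

```python
import re

def prolog_cyclomatic(source: str) -> int:
    """Toy cyclomatic complexity for Prolog: 1 + extra clauses per
    predicate + explicit ';' disjunctions and '->' if-thens."""
    clause_heads = re.findall(r"^\s*([a-z]\w*)\s*(?:\(|:-|\.)", source, re.M)
    counts: dict[str, int] = {}
    for name in clause_heads:
        counts[name] = counts.get(name, 0) + 1
    extra_clauses = sum(n - 1 for n in counts.values())
    branches = source.count(";") + source.count("->")
    return 1 + extra_clauses + branches

example = """
max(X, Y, X) :- X >= Y.
max(_, Y, Y).
sign(X, S) :- ( X > 0 -> S = pos ; S = nonpos ).
"""
print(prolog_cyclomatic(example))  # 1 + 1 extra clause + 2 branches = 4
```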

Relevance: 30.00%

Abstract:

With recent expansions in technology, mobile computing continues to play a vital role in all aspects of our lives. Digital tools such as Web browsing, media tracking, social media, and email have made mobile technology more than just a means of communication: it is now in widespread use in business and social networks. Developments in Technologies for Human-Centric Mobile Computing and Applications is a comprehensive collection of knowledge and practice in the development of human-centric mobile technologies. This book focuses on the developmental aspects of mobile technology, bringing together researchers, educators, and practitioners to encourage readers to think outside the box.

Relevance: 30.00%

Abstract:

Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation, in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We assigned an alphanumeric code based on layers 1 and 2 to every finding in layer 3 to link the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings of the multiple methods from ENIGMA. We subsequently tested it successfully by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
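
A minimal sketch of the alphanumeric coding scheme described above, with invented effects, methods, and findings (the real ENIGMA content is not reproduced):

```python
from string import ascii_uppercase

# Layer 1: numbered effects; layer 2: lettered methods. All invented.
effects = {1: "Reduced waiting times", 2: "Improved patient experience"}
methods = dict(zip(ascii_uppercase, ["Routine data analysis",
                                     "Patient survey"]))

# Layer 3: each finding carries one or more alphanumeric codes linking
# it to an effect (number) and a method (letter); multiple codes mark
# analogous findings reported under more than one effect or method.
findings = [
    ("Median wait fell after redesign", ["1A"]),
    ("Patients reported shorter perceived waits", ["1B", "2B"]),
]

for text, codes in findings:
    links = [f"effect {c[:-1]} ({effects[int(c[:-1])]}) "
             f"via method {c[-1]} ({methods[c[-1]]})" for c in codes]
    print(f"{text}  <-  {'; '.join(links)}")
```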

Relevance: 30.00%

Abstract:

This paper presents an approach to the evaluation of novice programmers' solutions to code-writing problems. The first step was the development of a framework comprising the salient elements, or programming constructs, used in a set of student solutions to three typical code-writing assessment problems. This framework was then refined into a code quality factor framework that was compared with an analysis using the SOLO taxonomy. We found that combining our framework with the SOLO taxonomy helped to define the SOLO categories and provided an improved approach to applying the principles of SOLO to code-writing problems. © 2011, Australian Computer Society, Inc.
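
For readers unfamiliar with SOLO, here is a toy sketch of what classifying a student solution against the SOLO levels could look like. The rubric below is invented for illustration and is not the paper's code quality factor framework.

```python
# Invented rubric: map counts of the salient constructs a solution
# uses, and whether it integrates them, onto the five SOLO levels.
SOLO = ["prestructural", "unistructural", "multistructural",
        "relational", "extended abstract"]

def solo_level(constructs_used: int, constructs_required: int,
               integrated: bool, generalised: bool) -> str:
    if constructs_used == 0:
        return SOLO[0]
    if constructs_used < constructs_required:
        return SOLO[1] if constructs_used == 1 else SOLO[2]
    if not integrated:
        return SOLO[2]
    return SOLO[4] if generalised else SOLO[3]

print(solo_level(3, 3, integrated=True, generalised=False))  # relational
```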

Relevance: 30.00%

Abstract:

The extant literature on workplace coaching is characterised by a lack of theoretical and empirical understanding regarding the effectiveness of coaching as a learning and development tool; the types of outcomes one can expect from coaching; the tools that can be used to measure coaching outcomes; the underlying processes that explain why and how coaching works; and the factors that may impact on coaching effectiveness. This thesis sought to address these substantial gaps in the literature with three linked studies. First, a meta-analysis of workplace coaching effectiveness (k = 17), synthesising the existing research, was presented. A framework of coaching outcomes was developed and used to code the studies. Analysis indicated that coaching had positive effects on all outcomes. Next, the framework of outcomes was used as the deductive starting point for the development of a scale measuring perceived coaching effectiveness. Using a multi-stage approach (n = 201), the analysis indicated that perceived coaching effectiveness may be organised into a six-factor structure: career clarity; team performance; work well-being; performance; planning and organising; and personal effectiveness and adaptability. The final study was a longitudinal field experiment to test a theoretical model of individual differences and coaching effectiveness developed in this thesis. An organisational sample of 84 employees each participated in a coaching intervention, completed self-report surveys, and had their job performance rated by peers, direct reports and supervisors (a total of 352 employees provided data on participant performance). The results demonstrate that, compared to a control group, the coaching intervention generated a number of positive outcomes. The analysis indicated that coachees' enthusiasm, intellect and orderliness influenced the impact of coaching on outcomes. Mediation analysis suggested that mastery goal orientation, performance goal orientation and approach motivation in the form of behavioural activation system (BAS) drive were significant mediators between personality and outcomes. Overall, the findings of this thesis make an original contribution to the understanding of the types of outcomes that can be expected from coaching and the magnitude of coaching's impact on those outcomes. The thesis also provides a tool for reliably measuring coaching effectiveness and a theoretical model for understanding the influence of coachee individual differences on coaching outcomes.
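
Not the thesis's analysis: a minimal product-of-coefficients sketch of the kind of mediation test referenced above (trait X -> mediator M -> outcome Y), run on simulated data. Variable meanings and effect sizes are invented.

```python
import numpy as np

# Simulate a simple mediation chain: x (coachee trait) affects m
# (mediator, e.g. BAS drive), which affects y (coaching outcome).
rng = np.random.default_rng(0)
n = 84
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)

a = np.polyfit(x, m, 1)[0]                    # path a: slope of M on X
X2 = np.column_stack([np.ones(n), m, x])
b = np.linalg.lstsq(X2, y, rcond=None)[0][1]  # path b: slope of Y on M given X
print(f"indirect effect a*b = {a * b:.3f}")
```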