984 results for Code-centric development
Abstract:
The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these pressures, modern systems have moved towards on-chip multiprocessing technologies, and new on-chip multiprocessing architectures have emerged to exploit the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. Existing design methodologies for the platform-based design approach do not provide full automation at every level of the design process, and the co-design of platform-based systems sometimes leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge under existing design methodologies. This thesis addresses these challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. The research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, thus ensuring correct-by-design platforms. The solution is based on a model-based process: both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling, based on a corresponding UML profile, and Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to allow performance estimation of the solution, as accurate as possible, at high abstraction levels. VHDL code is automatically generated in the form of “snippets” to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied to building an actual design solution for an MP3 stereo audio decoder application.
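The abstract does not reproduce the OCL constraints themselves. As a rough illustration of the kind of structural well-formedness rule such constraints express, the sketch below checks two hypothetical rules over a toy platform model; all class, attribute and rule names are assumptions, not taken from the SegBus profile.

```python
# Illustrative sketch only: structural rules of the kind OCL constraints would
# express over a platform model. Names are hypothetical, not from SegBus.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Segment:
    name: str
    arbiters: List[str] = field(default_factory=list)
    functional_units: List[str] = field(default_factory=list)

def check_platform(segments: List[Segment]) -> List[str]:
    """Return the list of violated structural rules (empty = structurally correct)."""
    violations = []
    for seg in segments:
        # Rule 1: each bus segment must be controlled by exactly one arbiter.
        if len(seg.arbiters) != 1:
            violations.append(f"{seg.name}: expected exactly 1 arbiter, found {len(seg.arbiters)}")
        # Rule 2: a segment with no functional units carries no traffic.
        if not seg.functional_units:
            violations.append(f"{seg.name}: no functional units attached")
    return violations

if __name__ == "__main__":
    platform = [Segment("seg0", ["arb0"], ["dct", "huffman"]),
                Segment("seg1", [], ["idct"])]
    for v in check_platform(platform):
        print("violation:", v)
```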
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Innovative gas cooled reactors, such as the pebble bed reactor (PBR) and the gas cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. The Russian ASTRA criticality experiments were calculated: pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided. A novel method was developed and implemented as a MATLAB code to calculate porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between the discrete-based reactor physics and continuum-based thermal-hydraulics models, to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to this specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely underpredicted by the realisable k-ε turbulence model used.
An additional calculation with a v²-f turbulence model showed significant improvement in the heat transfer results, most likely due to the better performance of the model in separated flow problems. Further investigations are suggested before CFD is used to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries. It is suggested that the viewpoints of numerical modelling be included in the planning of experiments, to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physical aspects of experiments should also be considered and documented in reasonable detail.
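The thesis implements the cell porosity computation as a MATLAB code whose algorithm the abstract does not detail. Purely as an illustration of the underlying idea, the sketch below estimates the void fraction of one Cartesian mesh cell laid over a sphere packing by Monte Carlo sampling; all names and parameters are assumptions.

```python
# Sketch: Monte Carlo estimate of porosity (void fraction) in one cell of a
# Cartesian mesh laid over a bed of equal spheres. Illustration only; the
# thesis uses its own (MATLAB-based) method.
import numpy as np

def cell_porosity(cell_min, cell_max, centers, radius, n_samples=20000, rng=None):
    """Fraction of the cell volume NOT occupied by any sphere."""
    rng = rng or np.random.default_rng(0)
    cell_min, cell_max = np.asarray(cell_min), np.asarray(cell_max)
    pts = rng.uniform(cell_min, cell_max, size=(n_samples, 3))
    # A sample point is "solid" if it lies inside at least one pebble.
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    solid = (d2 <= radius ** 2).any(axis=1)
    return 1.0 - solid.mean()

if __name__ == "__main__":
    centers = np.array([[0.5, 0.5, 0.5], [1.0, 0.5, 0.5]])  # pebble centres
    print(cell_porosity([0, 0, 0], [1, 1, 1], centers, radius=0.3))
```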
Abstract:
The purpose of this study was to expand the applicability of supplier segmentation and development approaches to the project-driven construction industry. These practices are less exploited and not well documented in this operational environment compared to the process-centric manufacturing industry. First, portfolio models for supply base segmentation and various supplier development efforts were investigated in a literature review, and a step-wise framework was structured for the empirical research. The empirical study employed multiple research methods in three case studies in a large Finnish construction company. The first study categorized the construction item classes into the purchasing portfolio and positioned suppliers in the power matrix by investigating buyer-supplier relations. Using statistical tests, the study also identified factors that affect suppliers' performance. The final case study identified improvement areas in the interface between a main contractor and one of its largest suppliers. The final results indicate that only by assessing the supply base, and the power circumstances within it, in a holistic manner can buyers understand how best to establish appropriate supplier development strategies in the project environment.
Abstract:
Production of new systems of every kind is expanding dramatically, and new ideas are introduced continually; the logic behind this is the growth of Internet applications and of web-based systems. Before a system is produced and distributed to customers, various aspects should be studied that multiply the profit of the system. The process of productizing a new system, from raw idea until delivery to the final user, is made explicit. This thesis presents in detail how to systematize a service in a way that benefits both the customer and the provider, along with efforts to establish trust, diminish the customer's risk and increase service productivity. The characteristics of servitization and productization, as two faces of the same coin, are interpreted. In addition to the above issues, the state of the art, service-oriented architecture (SOA) and New Service Development (NSD) are covered in this report, addressing the problem of gradual decline in the value of companies.
Abstract:
The use of certain performance enhancing substances and methods has been defined as a major ethical breach by parties involved in the governance of high-performance sport. As a result, elite athletes worldwide are subject to rules and regulations set out in international and national anti-doping policies. Existing literature on the development of policies such as the World Anti-Doping Code and the Canadian Anti-Doping Program suggests a sport system in which athletes are rarely meaningfully involved in policy development (Houlihan, 2004a). Additionally, it is suggested that this lack of involvement is reflective of a similar lack of involvement in other areas of governance concerning athletes' lives. The purpose of this thesis is to examine the history and current state of athletes' involvement in the anti-doping policy process in Canada's high-performance sport system. It includes discussion and analysis of recently conducted interviews with those involved in the policy process as well as an analysis of relevant documents, including anti-doping policies. The findings demonstrate that Canadian athletes have not been significantly involved in the creation of recently developed anti-doping policies and that a re-evaluation of current policies is necessary to more fully recognize the reality of athletes' lives in Canada's high-performance sport system and their rights within that system.
Abstract:
The Hox gene family encodes transcription factors known for their essential contribution to establishing the body architecture throughout the animal kingdom. During vertebrate evolution, Hox genes were redeployed to generate a variety of new tissues and organs. Often, this diversification occurred through changes in the transcriptional control of Hox genes. In mammals, the function of Hoxa13 is not restricted to the embryo proper but is also essential for the development of the fetal vasculature within the placental labyrinth, suggesting that its function in this structure accompanied the emergence of placental species. In Chapter 2, we highlight the recruitment of two other Hoxa genes, Hoxa10 and Hoxa11, to the extra-embryonic compartment. We show that expression of Hoxa10, Hoxa11 and Hoxa13 is required in the allantois, the precursor of the umbilical cord and of the fetal vascular system within the placental labyrinth. Interestingly, we found that expression of the Hoxa10-13 genes in the allantois is not restricted to placental mammals but is also present in a non-placental vertebrate, indicating that the recruitment of these genes to the allantois most likely preceded the emergence of placental species. We generated genetic rearrangements and used transgenic assays to study the mechanisms regulating Hoxa gene expression in the allantois. We identified a 50 kb intergenic fragment capable of driving reporter gene expression in the allantois. However, we found that the regulatory mechanism controlling Hoxa gene expression in the extra-embryonic compartment is highly complex and relies on more than a single cis-regulatory element. In Chapter 3, we used genetic fate mapping to assess the global contribution of Hoxa13-expressing cells to the different embryonic structures. In particular, we examined in greater detail the Hoxa13 fate-mapping analysis in the developing forelimbs. We determined that, in the limb skeleton, all skeletal elements of the autopod (hand), except for a few cells in the most proximal carpal elements, derive from Hoxa13-expressing cells. In contrast, we found that, within the muscular compartment, Hoxa13-expressing cells and their descendants (Hoxa13lin+) extend to more proximal domains of the limb, where they contribute to most of the muscle masses of the forearm and, in part, of the triceps. Interestingly, we found that Hoxa13-expressing cells and their descendants are not uniformly distributed among the different muscles. Within a single muscle mass, fibers with different Hoxa13lin+ contributions can be identified, and fibers with similar contributions are often grouped together. This result raises the possibility that Hoxa13 is involved in establishing specific characteristics of muscle groups, or in establishing nerve-muscle connections. Taken together, the data presented here provide a better understanding of the role of Hoxa13 in the embryonic and extra-embryonic compartments.
Moreover, our results will be of prime importance in supporting future studies aimed at explaining the transcriptional mechanisms underlying Hoxa gene regulation in extra-embryonic tissues.
Abstract:
There is now considerable evidence that genetic and environmental factors interact during specific developmental periods to make a person vulnerable to psychological disorders through various physiological adaptations. This thesis examines the impact of prenatal adversity (represented by low birth weight, LBW) and early postnatal adversity (maternal depressive symptoms and negative maternal behaviors) on brain development during childhood and adolescence, particularly in the fronto-limbic regions involved in emotion processing. Monozygotic (MZ) twins are used, where possible, to control for genetic effects. Chapters 1 and 2 present the results of testing the hypothesis that prenatal and early postnatal adversity are associated with altered functioning of fronto-limbic regions such as the amygdala, hippocampus, insula, anterior cingulate cortex and prefrontal cortex in response to emotional stimuli in children and adolescents. We observe that maternal depressive symptoms are associated with higher activation of children's fronto-limbic regions in response to sadness. The results of the adolescent study suggest that LBW, depressive symptoms and negative maternal behaviors are associated with altered function of the fronto-limbic regions in response to emotional stimuli. In MZ twins, we also observe that within-pair discordance in LBW and in certain maternal behaviors is associated with within-pair discordance in brain function, and that these alterations differ by sex. Chapter 3 presents the results of testing the hypothesis that prenatal and early postnatal adversity are associated with reduced total brain volume, and the hypothesis that maternal behaviors may mediate or moderate the association between LBW and brain volume. In adolescent MZ twins we observe (a) that LBW is indeed associated with decreased total brain volume and (b) that within-pair discordance in LBW is associated with discordance in brain volume. In sum, this thesis presents a set of results supporting two hypotheses important for understanding the effects of the environment on brain development: that the prenatal and early postnatal environment have an impact on brain development independently of the genetic code, and that the mechanisms involved may differ between boys and girls. Finally, these results are discussed in light of other research in this field, and avenues for future research are proposed.
Abstract:
Code review is an essential process whatever the maturity of a project; it aims to evaluate the contribution made by the code submitted by developers. In principle, code review improves the quality of code changes (patches) before they are committed to the project's master repository. In practice, carrying out this process does not rule out the possibility that some bugs go unnoticed. In this document, we present an empirical study investigating code review in a large open source project. We investigate the relations between reviewers' inspections and the personal and temporal factors that could affect the quality of such inspections. First, we report a quantitative study in which we use the SZZ algorithm to detect bug-inducing changes, which we linked with the code review information extracted from the issue tracking system. We found that the reasons why reviewers miss certain bugs correlate both with their personal characteristics and with the technical properties of the patches under review. Next, we report a qualitative study inviting Mozilla developers to give us their opinion on the attributes of a well-formulated code review. The results of our survey suggest that developers consider technical aspects (patch size, number of chunks and of modules) as well as personal characteristics (experience and review queue) to be factors strongly influencing the quality of code reviews.
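The abstract names the SZZ algorithm without detailing it. A heavily simplified sketch of the SZZ idea, using plain git commands rather than the study's actual tooling, is below; real SZZ additionally handles renames, whitespace-only lines and non-semantic changes.

```python
# Simplified SZZ-style sketch: for a known bug-fixing commit, the commits that
# last touched the lines deleted by the fix are candidate bug-inducing changes.
import subprocess

def git(*args, cwd="."):
    return subprocess.check_output(["git", *args], cwd=cwd, text=True)

def bug_inducing_candidates(fix_commit, repo="."):
    """Return commits that last touched the lines removed by fix_commit."""
    candidates = set()
    diff = git("diff", "--unified=0", f"{fix_commit}^", fix_commit, cwd=repo)
    path, old_line = None, None
    for line in diff.splitlines():
        if line.startswith("--- "):
            # Track the pre-fix file path; skip files added by the fix.
            path = line[6:] if line.startswith("--- a/") else None
        elif line.startswith("@@"):
            # Hunk header, e.g. "@@ -12,3 +12,4 @@": take the old-side start.
            old_line = int(line.split()[1][1:].split(",")[0])
        elif line.startswith("-") and path and old_line is not None:
            # Blame the pre-fix revision: who introduced this deleted line?
            blame = git("blame", "-L", f"{old_line},{old_line}",
                        f"{fix_commit}^", "--", path, cwd=repo)
            candidates.add(blame.split()[0].lstrip("^"))
            old_line += 1
    return candidates

# Example usage (hypothetical commit id and repository path):
# print(bug_inducing_candidates("a1b2c3d", repo="path/to/repo"))
```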
Abstract:
The Internet today has become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, to the point that going back is beyond imagination. Transfer of critical information is also carried out through the Internet. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders. The whole development in this area can become null and void if fool-proof security of the data, leaving no chance of adulteration, is not ensured. It is hence a challenge for the professional community to develop systems that ensure the security of data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services like confidentiality, integrity and authentication of data sent through the Internet. There are several such popular and dependable techniques which have been in wide use for quite a long time, but this long-term exposure makes them vulnerable to successful or near-successful attacks. Hence it is the need of the hour to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms being used in this area, with a focus on identifying the properties that impart security. Using the perception derived from these studies, new algorithms were designed. The performance of these algorithms was then studied, followed by the modifications necessary to yield an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320. Detailed analysis and comparison with existing popular schemes were also carried out to establish the security levels. The Secure Socket Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services like confidentiality and authentication in SSL/TLS, but recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies on the performance of these new algorithms were carried out, and they have been observed to be dependable alternatives.
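The designs of MAJE4, JERIM-320 and MACJER-320 are not given in the abstract and are not reproduced here. For reference, the sketch below shows the standard HMAC construction that the thesis benchmarks against, using Python's built-in hmac module.

```python
# Reference point only: the standard HMAC construction the thesis compares
# its MACJER-320 proposal against. The thesis's own algorithms (MAJE4,
# JERIM-320, MACJER-320) are not reproduced here.
import hmac
import hashlib

def make_tag(key: bytes, message: bytes) -> bytes:
    # HMAC = H((K ^ opad) || H((K ^ ipad) || m)); the hmac module does the padding.
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(make_tag(key, message), tag)

if __name__ == "__main__":
    key, msg = b"shared-secret", b"order #1842: ship 5 units"
    tag = make_tag(key, msg)
    print(verify(key, msg, tag))          # True
    print(verify(key, msg + b"0", tag))   # False: integrity check fails
```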
Abstract:
Shrimp aquaculture has provided tremendous opportunity for the economic and social upliftment of rural communities in the coastal areas of our country. Over a hundred thousand farmers, of whom about 90% belong to the small and marginal category, are engaged in shrimp farming. Penaeus monodon is the most predominant cultured species in India, and it is mainly exported to highly sophisticated, quality- and safety-conscious world markets. Food safety has been of concern to humankind since the dawn of history, and this concern resulted in the evolution of a cost-effective food safety assurance method, the Hazard Analysis Critical Control Point (HACCP). Considering the major contribution of cultured Penaeus monodon to total shrimp production and the economic losses encountered due to disease outbreaks, and because traditional methods of quality control and end-point inspection cannot guarantee the safety of our cultured seafood products, it is essential that science-based preventive approaches like HACCP and Prerequisite Programmes (PRP) be implemented in our shrimp farming operations. PRP is considered a support system which provides a solid foundation for HACCP. The safety of postlarvae (PL) supplied for brackish water shrimp farming has also become an issue of concern over the past few years. The quality and safety of hatchery-produced seeds have been deteriorating, and disease outbreaks have become very common in hatcheries. It is in this context that strict quarantine measures, with standards and codes of practice, become necessary. Though there has been much hue and cry about the need for extending the focus of seafood safety assurance from processing and exporting to the pre-harvest and hatchery rearing phases, experimental moves in this direction have been rare or nil. Only an integrated management system can assure effective control of the quality, hygiene and safety related issues. This study therefore aims at designing a safety and quality management system model for implementation in shrimp farming and hatchery operations by linking the concepts of HACCP and PRP.
Abstract:
Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs.
An incorrect sequence of machine code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank switching code and deciding on optimum data allocation to banked memory, resulting in a minimum number of bank switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine code patterns, which drastically reduces the state space created, contributing to improved model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
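As a rough illustration of the bank-switching redundancy described above, the sketch below flags a bank select instruction as redundant when the bank it selects is already active. It handles straight-line code only, whereas the thesis works on the control flow graph with a relation matrix and state transition diagram; the instruction names follow PIC conventions but the model is a simplification, not the thesis's tool.

```python
# Sketch: detect redundant bank select instructions in straight-line code on a
# PIC16F87X-like model. A BANKSEL is redundant if its bank is already active.

def redundant_bank_switches(instructions):
    """instructions: list of (mnemonic, operand) tuples, straight-line code."""
    active_bank = None          # unknown at program entry
    redundant = []
    for idx, (mnemonic, operand) in enumerate(instructions):
        if mnemonic == "BANKSEL":
            if operand == active_bank:
                redundant.append(idx)   # switches to the bank already active
            active_bank = operand
        elif mnemonic in ("CALL", "GOTO"):
            active_bank = None          # control transfer: bank state unknown
    return redundant

if __name__ == "__main__":
    prog = [("BANKSEL", 1), ("MOVWF", "TRISB"),
            ("BANKSEL", 1),             # redundant: bank 1 is already active
            ("MOVWF", "TRISC"), ("CALL", "init"), ("BANKSEL", 1)]
    print(redundant_bank_switches(prog))  # [2]
```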
Abstract:
Biometrics deals with the physiological and behavioral characteristics of an individual to establish identity. Fingerprint based authentication is the most advanced biometric authentication technology. The minutiae based fingerprint identification method offers a reasonable identification rate. The minutiae feature map consists of about 70-100 minutia points, and matching accuracy drops as the database grows. Hence it is essential to make the fingerprint feature code as small as possible so that identification becomes much easier. In this research, a novel global singularity based fingerprint representation is proposed. The fingerprint baseline, the line corresponding to the joint between the distal and intermediate phalanges in the fingerprint, is taken as the reference line. A polygon is formed from the singularities and the fingerprint baseline. The feature vector comprises the polygon's angles, sides, area and type, and the ridge counts between the singularities. A 100% recognition rate is achieved with this method. The method is compared with the conventional minutiae based recognition method in terms of computation time, receiver operating characteristics (ROC) and feature vector length. Speech is a behavioural biometric modality and can be used for identification of a speaker. In this work, MFCCs of text-dependent speech are computed and clustered using the k-means algorithm. A backpropagation-based artificial neural network is trained to identify the clustered speech code, and the performance of the neural network classifier is compared with the VQ-based minimum-Euclidean-distance classifier. Biometric systems that use a single modality are usually affected by problems like noisy sensor data, non-universality and/or lack of distinctiveness of the biometric trait, unacceptable error rates, and spoof attacks. Multi-finger, feature-level fusion based fingerprint recognition is therefore developed, and its performance is measured in terms of the ROC curve. Score-level fusion of the fingerprint and speech based recognition systems is performed, and 100% accuracy is achieved over a considerable range of matching thresholds.
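The exact feature extraction is defined in the thesis; the sketch below only illustrates how a polygon over singularity points yields translation- and rotation-tolerant features (side lengths, interior angles, shoelace area). The ridge counts and polygon type from the abstract are omitted, and the vertex data is hypothetical.

```python
# Sketch: geometric features of a polygon formed over fingerprint
# singularities and baseline endpoints. Simplified illustration only.
import math

def polygon_features(vertices):
    """vertices: list of (x, y) in counter-clockwise order around the polygon."""
    n = len(vertices)
    sides, angles = [], []
    for i in range(n):
        ax, ay = vertices[i]
        bx, by = vertices[(i + 1) % n]
        cx, cy = vertices[(i - 1) % n]
        sides.append(math.hypot(bx - ax, by - ay))      # side length i -> i+1
        # Interior angle at vertex i (valid for counter-clockwise ordering).
        a1 = math.atan2(by - ay, bx - ax)
        a2 = math.atan2(cy - ay, cx - ax)
        angles.append((a2 - a1) % (2 * math.pi))
    # Shoelace formula for the polygon area.
    area = 0.5 * abs(sum(vertices[i][0] * vertices[(i + 1) % n][1]
                         - vertices[(i + 1) % n][0] * vertices[i][1]
                         for i in range(n)))
    return sides + angles + [area]

if __name__ == "__main__":
    # Hypothetical singularity/baseline points forming a triangle.
    print(polygon_features([(0, 0), (4, 0), (3, 3)]))
```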
Abstract:
Code clones are portions of source code that are similar to other portions of the original program code. The presence of code clones is considered a bad feature of software, as it makes maintenance difficult. Methods for code clone detection have gained immense significance in the last few years, as they play a significant role in engineering applications such as analysis of program code, program understanding, plagiarism detection, error detection, code compaction and many similar tasks. Despite all these facts, several features of code clones, if properly utilized, can make the software development process easier. In this work, we point out one such feature of code clones, which highlights the relevance of code clones in test sequence identification. Here, program slicing is used in code clone detection. In addition, a classification of code clones is presented, and the benefit of using program slicing in code clone detection is also discussed.
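As a generic illustration of clone detection (not the slicing-based method of this work), the sketch below normalizes identifiers and literals so that renamed copies, i.e. Type-2 clones, produce identical fingerprints; all thresholds and names are assumptions.

```python
# Sketch: token-normalized clone detection. Identifiers and literals are
# replaced by placeholders so renamed copies hash to the same fingerprint.
import re
from collections import defaultdict

KEYWORDS = {"if", "for", "while", "return", "in", "LIT"}

def normalize(line):
    line = re.sub(r'"[^"]*"|\d+', "LIT", line)  # mask string/number literals
    return re.sub(r"\b[A-Za-z_]\w*\b",
                  lambda m: m.group() if m.group() in KEYWORDS else "ID",
                  line).strip()

def find_clones(source, window=3):
    """Map each repeated normalized window of `window` lines to its positions."""
    lines = [normalize(l) for l in source.splitlines() if l.strip()]
    seen = defaultdict(list)
    for i in range(len(lines) - window + 1):
        seen["\n".join(lines[i:i + window])].append(i + 1)
    return {k: v for k, v in seen.items() if len(v) > 1}

if __name__ == "__main__":
    code = """
total = 0
for x in items:
    total = total + x
subtotal = 0
for y in prices:
    subtotal = subtotal + y
"""
    for fingerprint, positions in find_clones(code).items():
        print(positions, "share a normalized fragment")   # [1, 4] ...
```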
Abstract:
Genetic programming is known to provide good solutions for many problems, such as the evolution of network protocols and distributed algorithms. In such cases it is most likely a hardwired module of a design framework that assists the engineer in optimizing specific aspects of the system to be developed, providing its results in a fixed format through an internal interface. In this paper we show how the utility of genetic programming can be increased remarkably by isolating it as a component and integrating it into the model-driven software development process. Our genetic programming framework produces XMI-encoded UML models that can easily be loaded into widely available modeling tools, which in turn possess code generation as well as additional analysis and test capabilities. We use the evolution of a distributed election algorithm as an example to illustrate how genetic programming can be combined with model-driven development. This example clearly illustrates the advantages of our approach – the generation of source code in different programming languages.
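The paper's framework emits XMI-encoded UML models; the toy sketch below illustrates only the genetic-programming core (random trees, mutation, fitness-based selection) on arithmetic expressions, with the target function, population size and generation count chosen arbitrarily.

```python
# Toy genetic-programming sketch: evolve an expression tree toward a target
# function using mutation and fitness selection. Illustration only; the
# paper's framework evolves UML models, not arithmetic expressions.
import random

OPS = ("+", "-", "*")
TERMS = ("x", "1", "2", "3")

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if isinstance(tree, str):
        return x if tree == "x" else int(tree)
    op, lhs, rhs = tree
    a, b = evaluate(lhs, x), evaluate(rhs, x)
    return a + b if op == "+" else (a - b if op == "-" else a * b)

def fitness(tree):
    # Squared error against the target behaviour f(x) = x*x + 2; lower is better.
    return sum((evaluate(tree, x) - (x * x + 2)) ** 2 for x in range(-5, 6))

def mutate(tree):
    # Replace the whole subtree, or recurse into one child of an operator node.
    if isinstance(tree, str) or random.random() < 0.5:
        return random_tree()
    if random.random() < 0.5:
        return (tree[0], mutate(tree[1]), tree[2])
    return (tree[0], tree[1], mutate(tree[2]))

if __name__ == "__main__":
    random.seed(1)
    population = [random_tree() for _ in range(200)]
    for _ in range(40):
        population.sort(key=fitness)                      # select the fittest
        elites = population[:50]
        population = elites + [mutate(random.choice(elites)) for _ in range(150)]
    best = min(population, key=fitness)
    print("best fitness:", fitness(best), "tree:", best)
```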