961 results for Code-centric development
Abstract:
This paper presents innovative work in the development of policy-based autonomic computing. The core of the work is a powerful and flexible policy-expression language, AGILE, which facilitates run-time adaptable policy configuration of autonomic systems. AGILE also serves as an integrating platform for other self-management technologies, including signal processing, automated trend analysis and utility functions. Each of these technologies has specific advantages and applicability to different types of dynamic adaptation. The AGILE platform enables the different technologies to interoperate seamlessly, each performing different aspects of self-management within a single application. The various technologies are implemented as object components. Self-management behaviour is specified using the policy language semantics to bind the various components together as required. Since the policy semantics support run-time re-configuration, the self-management architecture is dynamically composable. Additional benefits include the standardisation of the application programmer interface, terminology and semantics, and the fact that only a single point of embedding is required.
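The binding idea described above can be sketched in a few lines. This is an illustrative sketch only, not AGILE's actual syntax or API: the class and component names below are hypothetical, and the decision rules are made up for the example. It shows a policy engine that binds self-management components through one interface (a single point of embedding) and re-binds them at run time.

```python
# Hypothetical sketch of policy-driven component binding (not AGILE itself).
class TrendAnalysis:
    def evaluate(self, metric: float) -> str:
        # Made-up rule: a sustained rise in the metric triggers scaling.
        return "scale_up" if metric > 0.8 else "steady"

class UtilityFunction:
    def evaluate(self, metric: float) -> str:
        # Made-up rule: scaling up is only worthwhile below a cost ceiling.
        return "scale_up" if metric < 0.85 else "steady"

class PolicyEngine:
    """Binds components behind one interface; bindings are replaceable at run time."""
    def __init__(self):
        self.components = {}

    def bind(self, name, component):
        # Single point of embedding: every technology plugs in here.
        self.components[name] = component

    def decide(self, name, metric):
        return self.components[name].evaluate(metric)

engine = PolicyEngine()
engine.bind("adaptation", TrendAnalysis())
print(engine.decide("adaptation", 0.9))      # -> scale_up
engine.bind("adaptation", UtilityFunction())  # run-time re-configuration
print(engine.decide("adaptation", 0.9))      # -> steady
```

The point of the sketch is that the two technologies reach different decisions for the same input, and the policy layer can swap between them without changing the calling code.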
Abstract:
This paper investigates the profile of teachers on the island of Ireland who declared themselves willing to undertake professional development activities in programming, in particular to master programming by taking on-line courses involving the design of computer games. Using the Technology Acceptance Model (TAM), it compares scores for teachers “willing” to undertake the courses with scores for those who declined, and examines other differences between the groups of respondents. Findings reflect the perceived difficulties of programming and the current low status accorded to the subject in Ireland. The paper also reviews the use of games-based learning as a “hook” to engage learners in programming and discusses the role of gamification as a tool for motivating learners in an on-line course. The on-line course focusing on games design was met with enthusiasm, and there was general consensus that gamification was appropriate for motivating learners in structured courses such as those provided.
Abstract:
During extreme sea states, so-called impact events can be observed on the wave energy converter Oyster. In small-scale experimental tests these impact events cause high-frequency signals in the measured load, which decrease confidence in the data obtained. These loads depend on the structural dynamics of the model. Amplification of the loads can occur as they are transferred through the structure from the point of impact to the load cell located in the foundation. Since design data and load cases for Wave Energy Converters are determined from scale experiments, this lack of confidence has a direct effect on development.
Numerical vibration analysis is a valuable tool in the research of the structural load response of Oyster to impact events, but it must take into account the effect of the surrounding water. This can be done efficiently by adding an added-mass distribution, computed with a linearised potential boundary element method. This paper presents the development and validation of a numerical procedure which couples the open-source boundary element code NEMOH with the finite element analysis tool Code_Aster. Numerical results for the natural frequencies and mode shapes of the structure under the influence of added mass due to specific structural modes are compared with experimental results.
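The mechanism behind the coupling can be illustrated with a toy calculation. In a finite-element vibration analysis, the surrounding water enters as an added-mass matrix M_a that augments the structural mass in the generalised eigenvalue problem K φ = ω² (M + M_a) φ, shifting every natural frequency downward. The matrices below are made up for a 2-DOF example; in the paper's procedure the added mass comes from the BEM code NEMOH, not from an assumed diagonal.

```python
import numpy as np

K = np.array([[4.0, -2.0], [-2.0, 4.0]])   # stiffness (toy 2-DOF model)
M = np.eye(2)                              # structural mass
M_a = np.diag([0.5, 0.8])                  # assumed added-mass distribution

def natural_frequencies(K, M):
    # Solve K phi = w^2 M phi via the standard form (M^-1 K) phi = w^2 phi.
    w2 = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.real(w2)))

dry = natural_frequencies(K, M)        # in-vacuo ("dry") frequencies
wet = natural_frequencies(K, M + M_a)  # with added mass ("wet")
print(dry, wet)  # every wet frequency is lower than its dry counterpart
```

This reproduces the qualitative effect the validation targets: added mass lowers the natural frequencies, which is why the wet modes must be compared against experiment rather than the in-vacuo ones.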
Laser-driven x-ray and neutron source development for industrial applications of plasma accelerators
Abstract:
Pulsed beams of energetic x-rays and neutrons from intense laser interactions with solid foils are promising for applications where bright sources with a small emission area, capable of multi-modal delivery, are ideal. Possible end users of laser-driven multi-modal sources are those requiring advanced non-destructive inspection techniques in high-value industry sectors such as aerospace, nuclear and advanced manufacturing. We report on experimental work that demonstrates multi-modal operation of high-power laser-solid interactions for neutron and x-ray beam generation. Measurements and Monte Carlo radiation transport simulations show that neutron yield is increased by a factor of ∼2 when a 1 mm copper foil is placed behind a 2 mm lithium foil, compared to using a 2 cm block of lithium only. We explore x-ray generation with a 10 picosecond drive pulse in order to tailor the spectral content for radiography with medium-density alloy metals. The impact of using >1 ps pulse duration on laser-accelerated electron beam generation and transport is discussed alongside the optimisation of subsequent bremsstrahlung emission in thin, high atomic number target foils. X-ray spectra are deconvolved from spectrometer measurements and simulation data generated using the GEANT4 Monte Carlo code. We also demonstrate the unique capability of laser-driven x-rays to deliver single-pulse, high spatial resolution projection imaging of thick metallic objects. Active-detector radiographic imaging of industrially relevant sample objects with a 10 ps drive pulse is presented for the first time, demonstrating that features of 200 μm size are resolved when projected at high magnification.
Abstract:
The artefact- and techno-centricity of research into the architecture process needs to be counterbalanced by other approaches. An increasing amount of information is collected and used in the process, resulting in challenges related to information and knowledge management, as this research evidences through interviews with practicing architects. However, emerging technologies are expected to resolve many of the traditional challenges, opening up new avenues for research. This research suggests that, among them, novel techniques addressing how architects interact with project information, especially information only indirectly related to the artefacts, and tools which better address the social nature of work, notably communication between participants, should become a higher priority. In the fields associated with Human-Computer Interaction, generic solutions still frequently prevail, whereas specific alternative approaches appear to be particularly in demand for the dynamic and context-dependent design process. This research identifies an opportunity for a process-centric and integrative approach for architectural practice and proposes an information management and communication software application, developed for needs discovered in close collaboration with architects. Departing from the architects’ challenges, an information management software application, Mneme, was designed and developed into a working prototype. It proposes the use of visualizations as an interface to provide an overview of the process, facilitate project information retrieval and access, and visualize relationships between the pieces of information. Challenges with communication about visual content, such as images and 3D files, led to the development of a communication feature allowing discussions to be attached to any file format and searched from a database.
Based on the architects testing the prototype and literature recognizing the subjective side of usability, this thesis argues that visualizations, even 3D visualizations, have potential as an interface for information management in the architecture process. The architects confirmed that Mneme allowed them to have a better project overview, to locate heterogeneous content more easily, and provided context for the project information. The communication feature in Mneme was seen to offer considerable potential in design projects where diverse file formats are typically used. Through an empirical understanding of the challenges in the architecture process, and through testing the resulting software proposal, this thesis suggests promising directions for future research into the architecture and design process.
Abstract:
Understanding a child's functioning is a persistent challenge in health and education settings. In an attempt to overcome this challenge, in 2007 the World Health Organization developed the International Classification of Functioning, Disability and Health for Children and Youth (ICF-CY) as the first universal classification system for documenting children's health and functioning. Although the ICF-CY is not an assessment or intervention instrument, it can nevertheless serve as a framework for developing tools tailored to the needs of its users. Considering that, in the school context, handwriting is among the activities most required for a child's full participation, it seems pertinent to define a set of codes to characterise a child's functioning profile with respect to handwriting. The aim of this study was therefore to develop a preliminary set of ICF-CY-based codes that could come to constitute a code set for handwriting. Given the complexity of the topic, and since the goal was to reach consensus among specialists on which ICF-CY categories should be considered, the Delphi technique was used. The choice of methodology followed the procedures adopted by the ICF Core Set project. Of eighteen professionals contacted, responses were obtained from seven occupational therapists with paediatric experience, who participated in all rounds. In total, three rounds of questionnaires were carried out to reach consensus, with a previously defined agreement level of 70%.
This study yielded a preliminary set of codes comprising 54 ICF-CY categories (16 second-level categories, 14 third-level categories and one fourth-level category), of which 31 are Body Functions categories, one is a Body Structures category, 12 are Activities and Participation categories and 10 are Environmental Factors categories. This study is a first step towards the development of an ICF-CY-based code set for handwriting; further research on the development and validation of this code set is clearly needed.
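The Delphi retention rule described above is simple enough to state as code. This is a toy illustration only: the ballots below are hypothetical, not the study's data; the category codes are real ICF-CY categories used here merely as plausible labels.

```python
# Toy Delphi consensus check: retain a candidate ICF-CY category when at
# least 70% of the expert panel (seven therapists here) votes to include it.
votes = {
    "b760 Control of voluntary movement": [1, 1, 1, 1, 1, 1, 0],  # 6/7 agree
    "d170 Writing":                       [1, 1, 1, 1, 1, 1, 1],  # 7/7 agree
    "e130 Products for education":        [1, 1, 0, 0, 1, 0, 1],  # 4/7 agree
}

THRESHOLD = 0.70  # previously defined agreement level

def reaches_consensus(ballot):
    return sum(ballot) / len(ballot) >= THRESHOLD

retained = [code for code, ballot in votes.items() if reaches_consensus(ballot)]
print(retained)  # -> ['b760 Control of voluntary movement', 'd170 Writing']
```

In successive Delphi rounds, categories that fall short of the threshold are re-circulated with feedback rather than discarded outright, which is why three rounds were needed.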
Abstract:
The design and development of the Swordfish autonomous surface vehicle (ASV) system is discussed. Swordfish is an ocean-capable, 4.5 m long catamaran designed for network-centric operations (with ocean- and air-going vehicles and human operators). In the basic configuration, Swordfish is both a survey vehicle and a communications node with gateways for broadband, Wi-Fi and GSM transports and underwater acoustic modems. In another configuration, Swordfish mounts a docking station for the autonomous underwater vehicle Isurus from Porto University. Swordfish has an advanced control architecture for multi-vehicle operations with mixed-initiative interactions (human operators are allowed to interact with the control loops).
Abstract:
The tumour necrosis factor (TNF) family members B cell activating factor (BAFF) and APRIL (a proliferation-inducing ligand) are crucial survival factors for peripheral B cells. An excess of BAFF leads to the development of autoimmune disorders in animal models, and high levels of BAFF have been detected in the serum of patients with various autoimmune conditions. In this Review, we consider the possibility that in mice autoimmunity induced by BAFF is linked to T cell-independent B cell activation rather than to a severe breakdown of B cell tolerance. We also outline the mechanisms of BAFF signalling, the impact of ligand oligomerization on receptor activation and the progress of BAFF-depleting agents in the clinical setting.
Abstract:
Raised blood pressure (BP) is a major risk factor for cardiovascular disease. Previous studies have identified 47 distinct genetic variants robustly associated with BP, but collectively these explain only a few percent of the heritability of BP phenotypes. To find additional BP loci, we used a bespoke gene-centric array to genotype an independent discovery sample of 25,118 individuals that combined hypertensive case-control and general population samples. We followed up four SNPs associated with BP at our p < 8.56 × 10⁻⁷ study-specific significance threshold and six suggestively associated SNPs in a further 59,349 individuals. We identified and replicated a SNP at LSP1/TNNT3, a SNP at MTHFR-NPPB independent (r² = 0.33) of previous reports, and replicated SNPs at AGT and ATP2B1 reported previously. An analysis of combined discovery and follow-up data identified SNPs significantly associated with BP at p < 8.56 × 10⁻⁷ at four further loci (NPR3, HFE, NOS3, and SOX6). The high number of discoveries made with modest genotyping effort can be attributed to using a large-scale yet targeted genotyping array and to the development of a weighting scheme that maximized power when meta-analyzing results from samples ascertained with extreme phenotypes, in combination with results from nonascertained or population samples. Chromatin immunoprecipitation and transcript expression data highlight potential gene regulatory mechanisms at the MTHFR and NOS3 loci. These results provide candidates for further study to help dissect mechanisms affecting BP and highlight the utility of studying SNPs and samples that are independent of those studied previously, even when the sample size is smaller than in previous studies.
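The general mechanics of a weighted meta-analysis can be sketched briefly. Note the caveat: the paper's weighting scheme is specifically tailored to samples ascertained on extreme phenotypes, whereas the sketch below uses the standard sample-size-weighted Z-score combination, and the per-sample Z-scores are hypothetical numbers chosen for illustration.

```python
import math

def combine_z(z_scores, sample_sizes):
    # Standard sample-size-weighted Z-score meta-analysis:
    # weight each study's Z by sqrt(N), normalise so the result is ~N(0,1).
    weights = [math.sqrt(n) for n in sample_sizes]
    numerator = sum(w * z for w, z in zip(weights, z_scores))
    denominator = math.sqrt(sum(w * w for w in weights))
    return numerator / denominator

# Hypothetical SNP tested in the discovery and follow-up samples
# (sample sizes taken from the abstract; Z-scores are invented):
z_meta = combine_z([3.1, 2.4], [25118, 59349])
print(round(z_meta, 2))  # -> 3.7
```

The combined statistic exceeds either individual Z-score, which is the power gain the abstract attributes to meta-analyzing the ascertained and population samples together; the paper's contribution is choosing weights that maximise that gain under extreme-phenotype ascertainment.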
Abstract:
The use of certain performance-enhancing substances and methods has been defined as a major ethical breach by parties involved in the governance of high-performance sport. As a result, elite athletes worldwide are subject to rules and regulations set out in international and national anti-doping policies. Existing literature on the development of policies such as the World Anti-Doping Code and the Canadian Anti-Doping Program suggests a sport system in which athletes are rarely meaningfully involved in policy development (Houlihan, 2004a). Additionally, it is suggested that this lack of involvement is reflective of a similar lack of involvement in other areas of governance concerning athletes' lives. The purpose of this thesis is to examine the history and current state of athletes' involvement in the anti-doping policy process in Canada's high-performance sport system. It includes discussion and analysis of recently conducted interviews with those involved in the policy process as well as an analysis of relevant documents, including anti-doping policies. The findings demonstrate that Canadian athletes have not been significantly involved in the creation of recently developed anti-doping policies and that a re-evaluation of current policies is necessary to more fully recognize the reality of athletes' lives in Canada's high-performance sport system and their rights within that system.
Abstract:
The Hox gene family codes for transcription factors known for their essential contribution to establishing the body architecture throughout the animal kingdom. During vertebrate evolution, Hox genes were redeployed to generate a variety of new tissues and organs. Often, this diversification took place via changes in the transcriptional control of Hox genes. In mammals, the function of Hoxa13 is not restricted to the embryo proper but is also essential for the development of the fetal vasculature within the placental labyrinth, suggesting that its function in this structure accompanied the emergence of placental species. In Chapter 2, we highlight the recruitment of two other Hoxa genes, Hoxa10 and Hoxa11, to the extra-embryonic compartment. We show that expression of Hoxa10, Hoxa11 and Hoxa13 is required in the allantois, the precursor of the umbilical cord and of the fetal vascular system within the placental labyrinth. Interestingly, we found that expression of the Hoxa10-13 genes in the allantois is not restricted to placental mammals but is also present in a non-placental vertebrate, indicating that the recruitment of these genes to the allantois most likely preceded the emergence of placental species. We generated genetic rearrangements and used transgenic assays to study the mechanisms regulating Hoxa gene expression in the allantois. We identified a 50 kb intergenic fragment capable of driving reporter gene expression in the allantois. However, we found that the regulatory mechanism controlling Hoxa gene expression in the extra-embryonic compartment is highly complex and relies on more than a single cis-regulatory element.
In Chapter 3, we used genetic fate mapping to assess the overall contribution of Hoxa13-expressing cells to the different embryonic structures. In particular, we examined in more detail the Hoxa13 fate-mapping analysis in the developing forelimbs. We determined that, in the limb skeleton, all skeletal elements of the autopod (hand), except for a few cells in the most proximal carpal elements, derive from Hoxa13-expressing cells. In contrast, we found that, in the muscular compartment, Hoxa13-expressing cells and their descendants (Hoxa13lin+) extend into more proximal domains of the limb, where they contribute to most of the forearm muscle masses and, in part, to the triceps. Interestingly, we found that Hoxa13-expressing cells and their descendants are not uniformly distributed among the different muscles. Within a single muscle mass, fibres with different Hoxa13lin+ contributions can be identified, and fibres with similar contributions are often grouped together. This raises the possibility that Hoxa13 is involved in establishing specific characteristics of muscle groups, or in establishing nerve-muscle connections. Taken together, the data presented here provide a better understanding of the role of Hoxa13 in the embryonic and extra-embryonic compartments. Moreover, our results will be of key importance in supporting future studies aimed at explaining the transcriptional mechanisms underlying Hoxa gene regulation in extra-embryonic tissues.
Abstract:
There is now considerable evidence that genetic and environmental factors interact during specific developmental periods to make a person vulnerable to psychological disorders via various physiological adaptations. This thesis focuses on the impact of prenatal adversity (represented by low birth weight, LBW) and early postnatal adversity (maternal depressive symptoms and negative maternal behaviours) on brain development, particularly the fronto-limbic regions involved in emotion processing, during childhood and adolescence. Monozygotic (MZ) twins are used, where possible, to control for genetic effects. Chapters 1 and 2 present results testing the hypothesis that prenatal and early postnatal adversity are associated with altered functioning of fronto-limbic regions such as the amygdala, hippocampus, insula, anterior cingulate cortex and prefrontal cortex in response to emotional stimuli in children and adolescents. We observe that maternal depressive symptoms are associated with higher activation of children's fronto-limbic regions in response to sadness. Results from the adolescent study suggest that LBW, maternal depressive symptoms and negative maternal behaviours are associated with altered fronto-limbic function in response to emotional stimuli. In MZ twins, we also observe that within-pair discordance in LBW and in certain maternal behaviours is associated with within-pair discordance in brain function, and that these alterations differ by sex.
Chapter 3 presents results testing the hypothesis that prenatal and early postnatal adversity are associated with reduced total brain volume, and the hypothesis that maternal behaviours may mediate or moderate the association between LBW and brain volume. In MZ twins in adolescence we observe a) that LBW is indeed associated with reduced total brain volume and b) that within-pair discordance in LBW is associated with discordance in brain volume. In sum, this thesis presents a set of results supporting two hypotheses important for understanding the effects of the environment on brain development: that the prenatal and early postnatal environment affect brain development independently of the genetic code, and that the mechanisms involved may differ between boys and girls. Finally, these results are discussed in light of other research in this field, and avenues for future research are proposed.
Abstract:
Code review is an essential process regardless of a project's maturity; it aims to evaluate the contribution made by the code developers submit. In principle, code review improves the quality of code changes (patches) before they are committed to the project's master repository. In practice, carrying out this process does not rule out the possibility that some bugs go unnoticed. In this document, we present an empirical study investigating code review in a large open-source project. We investigate the relationships between reviewers' inspections and the personal and temporal factors that could affect the quality of such inspections. First, we report a quantitative study in which we use the SZZ algorithm to detect bug-inducing changes, which we linked with the code review information extracted from the issue tracking system. We found that the reasons why reviewers miss certain bugs correlate both with their personal characteristics and with the technical properties of the patches under review. Next, we report a qualitative study inviting Mozilla developers to give their opinion on the attributes of a well-conducted code review. The results of our survey suggest that developers consider technical aspects (patch size, number of chunks and of modules) as well as personal characteristics (experience and review queue) to be factors strongly influencing the quality of code reviews.
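The core of the SZZ approach mentioned above can be sketched compactly. This is a simplified illustration, not the exact algorithm from the study: real SZZ runs `git blame` on the lines a bug-fix commit deletes or modifies and then filters out non-semantic changes; here the blame data and line numbers are hypothetical, hard-coded stand-ins.

```python
# Simplified SZZ sketch: trace lines touched by a bug fix back, via blame
# data, to the earlier commits that last modified them; flag those commits
# as candidate bug-inducing changes.

# Hypothetical blame data: line number -> commit that last modified it.
blame = {10: "c1", 11: "c1", 42: "c3", 57: "c4"}

# Lines the bug-fixing patch deleted or modified:
fix_touched_lines = [10, 11, 42]

def bug_inducing(blame, touched):
    # Deduplicate: several fixed lines may point at the same earlier commit.
    return sorted({blame[line] for line in touched if line in blame})

print(bug_inducing(blame, fix_touched_lines))  # -> ['c1', 'c3']
```

Linking these flagged commits to their review records is what lets the study ask which reviewer characteristics correlate with bugs slipping through.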
Abstract:
The Internet today has become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, to the point that going back is beyond imagination. Transfer of critical information is also carried out through the Internet. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders. The whole development in this area can become null and void if fool-proof security of the data, without any chance of adulteration, is not ensured. It is hence a challenge before the professional community to develop systems to ensure the security of data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services like confidentiality, integrity and authentication of the data sent through the Internet. There are several such popular and dependable techniques which have been in wide use for quite a long time. This long-term exposure makes them vulnerable to successful or near-successful attacks. Hence it is the need of the hour to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms being used in this area, with a focus on identifying the properties that impart security. Using the perception derived from these studies, new algorithms were designed. The performance of these algorithms was then studied, followed by the necessary modifications, to yield an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320.
Detailed analysis and comparison with the existing popular schemes were also carried out to establish the security levels. The Secure Socket Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services like confidentiality and authentication in SSL / TLS. But recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies on the performance of these new algorithms were carried out; it has been observed that they are dependable alternatives.
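The role a message authentication code plays in SSL/TLS can be shown with the standard HMAC construction from Python's standard library. MAJE4, JERIM-320 and MACJER-320 are the thesis's own algorithms and are not publicly available, so this sketch uses HMAC-SHA-256 purely to illustrate the generic tag-and-verify pattern any MAC, including MACJER-320, would follow.

```python
import hmac
import hashlib

key = b"shared-secret-key"
message = b"transfer 100 units to account 42"

# Sender computes an authentication tag over the message with the shared key.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key, message, tag):
    # Receiver recomputes the tag and compares in constant time;
    # any tampering with the message (or a wrong key) fails verification.
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                # -> True
print(verify(key, b"transfer 999 units", tag))  # -> False
```

The constant-time comparison (`hmac.compare_digest`) matters: a naive string comparison leaks timing information an attacker can exploit to forge tags byte by byte.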
Abstract:
Shrimp aquaculture has provided tremendous opportunity for the economic and social upliftment of rural communities in the coastal areas of our country. Over a hundred thousand farmers, of whom about 90% belong to the small and marginal category, are engaged in shrimp farming. Penaeus monodon is the most predominant cultured species in India and is mainly exported to highly sophisticated, quality- and safety-conscious world markets. Food safety has been of concern to humankind since the dawn of history, and this concern resulted in the evolution of a cost-effective food safety assurance method, the Hazard Analysis Critical Control Point (HACCP). Considering the major contribution of cultured Penaeus monodon to total shrimp production and the economic losses encountered due to disease outbreaks, and because traditional methods of quality control and end-point inspection cannot guarantee the safety of our cultured seafood products, it is essential that science-based preventive approaches like HACCP and Prerequisite Programmes (PRP) be implemented in our shrimp farming operations. PRP is considered a support system which provides a solid foundation for HACCP. The safety of postlarvae (PL) supplied for brackish-water shrimp farming has also become an issue of concern over the past few years. The quality and safety of hatchery-produced seeds have been deteriorating, and disease outbreaks have become very common in hatcheries. It is in this context that the necessity of following strict quarantine measures with standards and codes of practice becomes significant. Though there has been much hue and cry about the need to extend the focus of seafood safety assurance from processing and exporting to the pre-harvest and hatchery rearing phases, experimental moves in this direction have been rare or nil. Only an integrated management system can assure effective control of the quality, hygiene and safety related issues.
This study therefore aims at designing a safety and quality management system model for implementation in shrimp farming and hatchery operations by linking the concepts of HACCP and PRP.