905 results for "optimising compiler"


Relevance: 10.00%

Abstract:

When drafting the Portuguese compilation of Ius proprium, in the first half of the fifteenth century, the compiler of King D. Afonso V reserved an exclusive title for the donation made by husband to wife and by wife to husband. This was a medieval theme prominent in most legal systems of the old continent, inevitably conditioned by the legal renaissance of Roman law (12th century). Although the Portuguese Ordinances set aside the primacy achieved by the Justinian principle prohibiting donations between spouses, they still take it into account, opting for a heavily conditioned validity rooted in customary and Castilian fragments and in the law of succession then in force in the kingdom.

Relevance: 10.00%

Abstract:

In this work we explore optimising parameters of a physical circuit model relative to input/output measurements, using the Dallas Rangemaster Treble Booster as a case study. A hybrid metaheuristic/gradient descent algorithm is implemented, where the initial parameter sets for the optimisation are informed by nominal values from schematics and datasheets. Sensitivity analysis is used to screen parameters, which informs a study of the optimisation algorithm against model complexity by fixing parameters. The results of the optimisation show a significant increase in the accuracy of model behaviour, but also highlight several key issues regarding the recovery of parameters.
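
A minimal sketch of such a hybrid scheme, under stated assumptions: random restarts scattered around nominal (schematic/datasheet) values act as the metaheuristic stage, and finite-difference gradient descent refines each candidate against the measured data. The circuit model here is a toy stand-in (`model_output`, the parameter values and all tuning constants are hypothetical, not the paper's implementation):

```python
import random

def model_output(params, x):
    # Hypothetical stand-in for the circuit model: the real case study
    # simulates the Rangemaster's transistor stage; a simple parametric
    # curve keeps the sketch self-contained.
    a, b = params
    return a * x + b * x * x

def loss(params, xs, targets):
    # Sum of squared errors between model output and measurements.
    return sum((model_output(params, x) - t) ** 2 for x, t in zip(xs, targets))

def grad_descent(params, xs, targets, steps=500, lr=0.05, h=1e-6):
    # Local refinement stage: forward-difference gradient descent.
    p = list(params)
    for _ in range(steps):
        base = loss(p, xs, targets)
        g = []
        for i in range(len(p)):
            q = list(p)
            q[i] += h
            g.append((loss(q, xs, targets) - base) / h)
        p = [pi - lr * gi for pi, gi in zip(p, g)]
    return p

def hybrid_optimise(nominal, xs, targets, restarts=20, spread=0.5):
    # Metaheuristic stage: scatter initial parameter sets around the
    # nominal values, refine each, and keep the best result.
    best, best_loss = None, float("inf")
    for _ in range(restarts):
        start = [v * (1 + random.uniform(-spread, spread)) for v in nominal]
        cand = grad_descent(start, xs, targets)
        l = loss(cand, xs, targets)
        if l < best_loss:
            best, best_loss = cand, l
    return best

random.seed(1)
xs = [i / 10 for i in range(10)]
targets = [2.0 * x + 0.5 * x * x for x in xs]   # synthetic "measurements"
fit = hybrid_optimise([1.5, 1.0], xs, targets)  # nominal values off-target
```

In the paper's setting the loss would compare simulated and measured audio output rather than this synthetic curve, and the recovered parameters would be checked against component values.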

Relevance: 10.00%

Abstract:

COSTA, Umberto Souza; MOREIRA, Anamaria Martins; MUSICANTE, Martin A.; SOUZA NETO, Plácido A. JCML: A specification language for the runtime verification of Java Card programs. Science of Computer Programming. [S.l.]: [s.n.], 2010.

Relevance: 10.00%

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A. Specification and Runtime Verification of Java Card Programs. Electronic Notes in Theoretical Computer Science. [S.l.]: [s.n.], 2009.

Relevance: 10.00%

Abstract:

The description of terms in traditional terminological resources is limited to certain information, such as the term (mainly nominal), its definition and its equivalent in a foreign language. This description rarely provides other information that can be very useful to the user, especially when the resource is consulted with the aim of deepening knowledge of a specialised domain, mastering professional writing, or finding contexts in which the term occurs. Information useful in this respect includes the description of the actantial structure of terms, contexts drawn from authentic sources, and the inclusion of other parts of speech such as verbs. Verbs and deverbal nouns, or predicative terminological units (PTUs), often ignored by classical terminology, are of great importance when it comes to expressing an action, a process or an event. The description of these units, however, requires a terminological description model that accounts for their particularities. A number of terminologists (Condamines 1993, Mathieu-Colas 2002, Gross and Mathieu-Colas 2001, L'Homme 2012, 2015) have proposed description models based on different theoretical frameworks. Our research proposes a methodology for the terminological description of PTUs in Arabic, specifically Modern Standard Arabic (MSA), according to the theory of Frame Semantics of Fillmore (1976, 1977, 1982, 1985) and its application, the FrameNet project (Ruppenhofer et al. 2010). The specialised domain of interest is computing. In our research, we rely on a corpus collected from the web and draw on an existing terminological resource, DiCoInfo (L'Homme 2008), to compile our own resource. Our objectives can be summarised as follows.
First, we aim to lay the foundations of an MSA version of this resource. This version has its own particularities: 1) we target very specific units, namely verbal and deverbal PTUs; 2) the methodology developed for compiling the original DiCoInfo must be adapted to take a Semitic language into account. We then aim to create a framed version of this resource, in which PTUs are grouped into semantic frames, following the FrameNet model. To this resource we add the English and French PTUs, since this part of the work has a multilingual scope. The methodology consists of automatically extracting verbal and nominal terminological units (VTUs and NTUs), such as Ham~ala (حمل) (to download) and taHmiyl (تحميل) (downloading). To do so, we adapted an existing automatic extractor, TermoStat (Drouin 2004). Then, using terminological validation criteria (L'Homme 2004), we validate the terminological status of a subset of the candidates. After validation, we create terminological records, using an XML editor, for each retained VTU and NTU. These records include elements such as the actantial structure of the PTUs and up to twenty annotated contexts. The final step consists of creating semantic frames from the MSA PTUs. We also associate English and French PTUs with the frames created. This association led to the creation of a terminological resource called "DiCoInfo: A Framed Version". In this resource, PTUs sharing the same semantic properties and actantial structures are grouped into semantic frames. For example, the semantic frame Product_development groups PTUs such as Taw~ara (طور) (to develop), to develop and développer.
Following these steps, we obtained a total of 106 MSA PTUs compiled in the MSA version of DiCoInfo, and 57 semantic frames associated with these units in the framed version of DiCoInfo. Our research shows that MSA can be described with the methodology we have developed.
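
The cross-lingual frame grouping described above can be sketched with a simple data structure. The record fields are hypothetical (not the actual DiCoInfo XML schema); `Product_development` is taken from the abstract, while `Downloading` is only an illustrative frame label:

```python
from collections import defaultdict

# Hypothetical PTU records; the real resource stores actantial
# structures and up to twenty annotated contexts per unit.
units = [
    {"lemma": "Taw~ara", "lang": "ar", "frame": "Product_development"},
    {"lemma": "to develop", "lang": "en", "frame": "Product_development"},
    {"lemma": "développer", "lang": "fr", "frame": "Product_development"},
    {"lemma": "Ham~ala", "lang": "ar", "frame": "Downloading"},  # illustrative frame name
]

# Group predicative terminological units by semantic frame, across the
# three languages of the resource.
frames = defaultdict(list)
for u in units:
    frames[u["frame"]].append((u["lang"], u["lemma"]))
```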

Relevance: 10.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 10.00%

Abstract:

An automatic grader for programs submitted by students executes the "tests" specified sequentially in the ProFormA task format in order to check the submission. There is currently no cross-grader standard for interpreting and presenting the results of these test executions. We describe an extension of the ProFormA task format with a hierarchy of grading aspects, grouped by didactic criteria and holding references to the test executions. The extension was implemented in Graja, an automatic grader for Java programs. Depending on the desired level of detail of the grading aspects, test executions consequently have to be broken up into partial executions. We illustrate our proposal using the test tools compiler, dynamic software test and static analysis, as well as human graders.
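
A hierarchy of grading aspects with references to test executions can be sketched as follows. This is an illustrative data model only, not the actual XML extension of the ProFormA format; all aspect names, weights and test ids are hypothetical:

```python
class Aspect:
    """One node in a hypothetical grading-aspect hierarchy."""
    def __init__(self, title, weight=1.0, test_ref=None, children=None):
        self.title = title        # didactic grouping, e.g. "Correctness"
        self.weight = weight      # relative weight within the parent
        self.test_ref = test_ref  # id of a test (sub-)execution, if a leaf
        self.children = children or []

    def score(self, results):
        # Leaf aspects read the referenced test result (0.0 .. 1.0);
        # inner aspects aggregate children as a weighted average.
        if self.test_ref is not None:
            return results[self.test_ref]
        total = sum(c.weight for c in self.children)
        return sum(c.weight * c.score(results) for c in self.children) / total

rubric = Aspect("Submission", children=[
    Aspect("Compiles", weight=1, test_ref="compile"),
    Aspect("Correctness", weight=2, children=[
        Aspect("Unit tests part 1", test_ref="junit-1"),
        Aspect("Unit tests part 2", test_ref="junit-2"),
    ]),
    Aspect("Style", weight=1, test_ref="checkstyle"),
])

# Results of the (partial) test executions, keyed by the referenced ids.
results = {"compile": 1.0, "junit-1": 1.0, "junit-2": 0.5, "checkstyle": 0.8}
grade = rubric.score(results)  # weighted aggregate over the hierarchy
```

The split of "Correctness" into two leaves mirrors the point made in the abstract: finer-grained aspects force a test execution to be broken into partial executions, each reported separately.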

Relevance: 10.00%

Abstract:

This paper presents PORBS, a parallelised observation-based slicing tool. The tool itself is written in Java, making it platform independent, and it leverages the build chain of the system being sliced to avoid the need to replicate complex compiler analysis. The target audience of PORBS is software engineers and researchers working with and on tools and techniques for software comprehension, debugging, re-engineering, and maintenance.
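
Observation-based slicing itself can be illustrated with a small sequential sketch: repeatedly try deleting a statement and keep the deletion whenever the observed value of the slicing criterion is unchanged. PORBS parallelises this idea and works through the system's build chain; the line-deletion loop, the `out` criterion and the toy program below are simplifications:

```python
def observe(lines):
    # Build and run the candidate program, observing the value it binds
    # to `out`; any failure means the candidate is rejected.
    env = {}
    try:
        exec("\n".join(lines), env)
        return env.get("out")
    except Exception:
        return None

def od_slice(lines, criterion):
    # Observation-based slicing (simplified, sequential): keep a deletion
    # whenever the observed behaviour matches the criterion.
    current = list(lines)
    changed = True
    while changed:
        changed = False
        for i in range(len(current)):
            candidate = current[:i] + current[i + 1:]
            if observe(candidate) == criterion:
                current = candidate
                changed = True
                break
    return current

program = [
    "a = 2",
    "unused = 99",   # has no effect on the criterion, so it is sliced away
    "b = 3",
    "out = a * b",
]
sliced = od_slice(program, observe(program))
```

Because the oracle only rebuilds and re-runs the program, no compiler-level dependence analysis is needed, which is the property PORBS exploits across languages.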

Relevance: 10.00%

Abstract:

BACKGROUND: Total hip replacements (THRs) and total knee replacements (TKRs) are common elective procedures. In the REsearch STudies into the ORthopaedic Experience (RESTORE) programme, we explored the care and experiences of patients with osteoarthritis after being listed for THR and TKR up to the time when an optimal outcome should be expected. OBJECTIVE: To undertake a programme of research studies to work towards improving patient outcomes after THR and TKR. METHODS: We used methodologies appropriate to research questions: systematic reviews, qualitative studies, randomised controlled trials (RCTs), feasibility studies, cohort studies and a survey. Research was supported by patient and public involvement. RESULTS: Systematic review of longitudinal studies showed that moderate to severe long-term pain affects about 7–23% of patients after THR and 10–34% after TKR. In our cohort study, 10% of patients with hip replacement and 30% with knee replacement showed no clinically or statistically significant functional improvement. In our review of pain assessment, few research studies used measures to capture the incidence, character and impact of long-term pain. Qualitative studies highlighted the importance of support by health and social professionals for patients at different stages of the joint replacement pathway. Our review of longitudinal studies suggested that patients with poorer psychological health, physical function or pain before surgery had poorer long-term outcomes and may benefit from pre-surgical interventions. However, uptake of a pre-operative pain management intervention was low. Although evidence relating to patient outcomes was limited, comorbidities are common and may lead to an increased risk of adverse events, suggesting the possible value of optimising pre-operative management.
The evidence base on clinical effectiveness of pre-surgical interventions, occupational therapy and physiotherapy-based rehabilitation relied on small RCTs but suggested short-term benefit. Our feasibility studies showed that definitive trials of occupational therapy before surgery and post-discharge group-based physiotherapy exercise are feasible and acceptable to patients. Randomised trial results and systematic review suggest that patients with THR should receive local anaesthetic infiltration for the management of long-term pain, but in patients receiving TKR it may not provide additional benefit over femoral nerve block. From an NHS and Personal Social Services perspective, local anaesthetic infiltration was a cost-effective treatment in primary THR. In qualitative interviews, patients and health-care professionals recognised the importance of participating in the RCTs. To support future interventions and their evaluation, we conducted a study comparing outcome measures and analysed the RCTs as cohort studies. Analyses highlighted the importance of different methods in treating and assessing hip and knee osteoarthritis. There was an inverse association between radiographic severity of osteoarthritis and pain and function in patients waiting for TKR, but no association in THR. Different pain characteristics predicted long-term pain in THR and TKR. Outcomes after joint replacement should be assessed with a patient-reported outcome and a functional test. CONCLUSIONS: The RESTORE programme provides important information to guide the development of interventions to improve long-term outcomes for patients with osteoarthritis receiving THR and TKR. Issues relating to their evaluation and the assessment of patient outcomes are highlighted. Potential interventions at key times in the patient pathway were identified and deserve further study, ultimately in the context of a complex intervention.

Relevance: 10.00%

Abstract:

This thesis is set in the context of the industrial and economic optimisation of UHPFRC (BFUP) structural elements, aiming to guarantee ductility at the structural level while adjusting the fibre content and optimising the fabrication process. The model developed explicitly describes the contribution of the fibre reinforcement in tension at the local level, with a strain-hardening phase followed by a softening phase. The constitutive law is a function of the fibre density, the fibre orientation relative to the principal tensile directions, the fibre aspect ratio and other usual material parameters related to the fibres, the cementitious matrix and their interaction. Fibre orientation is taken into account through a normal probability law with one or two variables, able to reproduce any orientation obtained either from a simulation representative of the casting of fresh UHPFRC or from experimental analysis on a prototype. Finally, the model reproduces the cracking of UHPFRC following the principle of smeared, rotating crack models. The constitutive law is integrated into a finite element structural analysis code, allowing it to be used as a predictive tool for the reliability and overall ductility of UHPFRC elements. Two experimental campaigns were carried out, one at Université Laval in Quebec and the other at Ifsttar, Marne-la-Vallée. The first validates the model's ability to reproduce the overall behaviour under typical tensile and bending loads in simple structural elements for which the preferential fibre orientation was determined by tomography.
The second experimental campaign demonstrates the model's capabilities in an optimisation procedure, for the fabrication of relatively complex ribbed slabs of potential industrial interest, for which different fabrication methods and UHPFRC mixes with varying fibre contents were considered. The fibre distribution and orientation were checked through mechanical tests on core samples. The model predictions were compared with the overall structural behaviour and ductility observed experimentally. The model could thus be qualified against the usual analytical engineering methods, taking statistical variability into account. Avenues for improvement and further development were identified.

Relevance: 10.00%

Abstract:

Formulated food systems are becoming more sophisticated as demand grows for the design of structural and nutritional profiles targeted at increasingly specific demographics. Milk protein is an important bio- and techno-functional component of such formulations, which include infant formula, sports supplements, clinical beverages and elderly nutrition products. This thesis outlines research into ingredients that are key to the development of these products, namely milk protein concentrate (MPC), milk protein isolate (MPI), micellar casein concentrate (MCC), β-casein concentrate (BCC) and serum protein concentrate (SPC). MPC powders ranging from 37 to 90% protein (solids basis) were studied for properties of relevance to handling and storage of powders, powder solubilisation and thermal processing of reconstituted MPCs. MPC powders with ≥80% protein were found to have very poor flowability and high compressibility; in addition, these high-protein MPCs exhibited poor wetting and dispersion characteristics during rehydration in water. Heat stability studies on unconcentrated (3.5%, 140°C) and concentrated (8.5%, 120°C) MPC suspensions, showed that suspensions prepared from high-protein MPCs coagulated much more rapidly than lower protein MPCs. β-casein ingredients were developed using membrane processing. Enrichment of β-casein from skim milk was performed at laboratory-scale using ‘cold’ microfiltration (MF) at <4°C with either 1000 kDa molecular weight cut-off or 0.1 µm pore-size membranes. At pilot-scale, a second ‘warm’ MF step at 26°C was incorporated for selective purification of micellised β-casein from whey proteins; using this approach, BCCs with β-casein purity of up to 80% (protein basis) were prepared, with the whey protein purity of the SPC co-product reaching ~90%. 
The BCC ingredient could prevent supersaturated solutions of calcium phosphate (CaP) from precipitating, although the amorphous CaP formed created large micelles that were less thermo-reversible than those in CaP-free systems. Another co-product of BCC manufacture, MCC powder, was shown to have superior rehydration characteristics compared to traditional MCCs. The findings presented in this thesis constitute a significant advance in research on milk protein ingredients, in terms of optimising their preparation by membrane filtration, preventing their destabilisation during processing and facilitating their effective incorporation into nutritional formulations designed for consumers of a specific age, lifestyle or health status.

Relevance: 10.00%

Abstract:

It is now clear that the concept of a HPC compiler which automatically produces highly efficient parallel implementations is a pipe-dream. Another route is to recognise from the outset that user information is required and to develop tools that embed user interaction in the transformation of code from scalar to parallel form, and then use conventional compilers with a set of communication calls. This represents the key idea underlying the development of the CAPTools software environment. The initial version of CAPTools is focused upon single block structured mesh computational mechanics codes. The capability for unstructured mesh codes is under test now and block structured meshes will be included next. The parallelisation process can be completed rapidly for modest codes and the parallel performance approaches that which is delivered by hand parallelisations.

Relevance: 10.00%

Abstract:

Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse-engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications on reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.
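
A first black-box step of the kind described, spotting identifiers that the compiler leaves behind, can be sketched as a generic `strings`-style scan. The byte blob below is a toy stand-in for a compiled binary, not real CUDA compiler output (a real investigation would also drive Nvidia's cuobjdump and nvdisasm tools against the actual fatbinary):

```python
import re

def extract_strings(blob, min_len=6):
    # Minimal "strings"-style pass: find runs of printable ASCII, the
    # usual first step for spotting leaked symbol names in a binary.
    return re.findall(rb"[ -~]{%d,}" % min_len, blob)

# Toy stand-in for a compiled binary: a mangled kernel name and a source
# path are the kind of identifiers a compiler can leave behind with
# default settings. Both values here are invented for illustration.
blob = (b"\x7fELF\x02\x01\x01" + b"\x00" * 8
        + b"_Z9matmul_fPfS_S_i\x00" + b"\x01\xff"
        + b"/home/dev/kernels/matmul.cu\x00")
found = extract_strings(blob)
```

Recovered mangled names and source paths give an investigator function names and build layout without any disassembly, which is why stripping such metadata is one of the protection techniques the paper discusses.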

Relevance: 10.00%

Abstract:

A method is outlined for optimising graph partitions which arise in mapping unstructured mesh calculations to parallel computers. The method employs a combination of iterative techniques to both evenly balance the workload and minimise the number and volume of interprocessor communications. They are designed to work efficiently in parallel as well as sequentially and, when combined with a fast direct partitioning technique (such as the Greedy algorithm) to give an initial partition, the resulting two-stage process proves itself to be both a powerful and flexible solution to the static graph-partitioning problem. The algorithms can also be used for dynamic load-balancing, and a clustering technique can additionally be employed to speed up the whole process. Experiments indicate that the resulting parallel code can provide high quality partitions, independent of the initial partition, within a few seconds.
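
The two-stage process can be sketched on a toy graph: a greedy seed-growing pass produces an initial partition, and an iterative boundary-migration pass then reduces the cut while preserving balance. This is a minimal illustration under stated assumptions, not the paper's algorithms; the refinement rule and balance guard are deliberate simplifications:

```python
from collections import deque

def greedy_partition(adj, k=2):
    # Greedy initial partition: grow each part by breadth-first search
    # until it holds roughly |V|/k vertices.
    n = len(adj)
    part = [-1] * n
    target = (n + k - 1) // k
    p = 0
    for seed in range(n):
        if part[seed] != -1:
            continue
        q, size = deque([seed]), 0
        while q and size < target:
            v = q.popleft()
            if part[v] != -1:
                continue
            part[v] = p
            size += 1
            q.extend(w for w in adj[v] if part[w] == -1)
        if size:
            p = min(p + 1, k - 1)
    return part

def cut_size(adj, part):
    # Number of edges crossing between parts (each counted once).
    return sum(part[v] != part[w] for v in range(len(adj)) for w in adj[v] if v < w)

def refine(adj, part, k=2, rounds=10):
    # Iterative refinement: migrate a boundary vertex when that strictly
    # reduces its cut edges, without emptying its current part.
    for _ in range(rounds):
        improved = False
        for v in range(len(adj)):
            counts = [0] * k
            for w in adj[v]:
                counts[part[w]] += 1
            best = max(range(k), key=lambda q: counts[q])
            if best != part[v] and counts[best] > counts[part[v]] \
                    and sum(1 for x in part if x == part[v]) > 1:
                part[v] = best
                improved = True
        if not improved:
            break
    return part

# Two triangles bridged by the edge 2-3: the ideal 2-way cut is 1 edge.
adj = [[1, 2], [0, 2], [0, 1, 3], [2, 4, 5], [3, 5], [3, 4]]
part = refine(adj, greedy_partition(adj))
```

The same structure, an inexpensive direct partition followed by parallel iterative improvement, is what makes the approach usable for dynamic load-balancing as well.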