943 results for software analysis


Relevance: 30.00%

Abstract:

Methane hydrate is an ice-like substance that is stable at high pressure and low temperature in continental margin sediments. Since the discovery of a large number of gas flares at the landward termination of the gas hydrate stability zone off Svalbard, there has been concern that warming bottom waters have started to dissociate large amounts of gas hydrate and that the resulting methane release may accelerate global warming. Here, we corroborate that hydrates play a role in the observed gas seepage, but we present evidence that seepage off Svalbard has been ongoing for at least three thousand years and that seasonal fluctuations of 1-2°C in the bottom-water temperature cause periodic gas hydrate formation and dissociation, which focus seepage at the observed sites.

Relevance: 30.00%

Abstract:

Modern software applications are becoming more dependent on database management systems (DBMSs), which software developers usually treat as black boxes. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches in use today. With ORM, objects in object-oriented languages are mapped to records in the database, and object manipulations are automatically translated into SQL queries. As a result of this abstraction, developers do not need deep knowledge of databases; all too often, however, the abstraction leads to inefficient and incorrect database access code. This thesis therefore proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on detecting and troubleshooting inefficient database accesses (i.e., performance problems) in the source code, and we rank the detected problems by severity. We first conduct an empirical study on the maintenance of ORM code in both open-source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice and that there is a need for tools to help improve or tune the performance of ORM-based applications. We therefore propose approaches along two dimensions: 1) helping developers write more performant ORM code; and 2) helping developers tune ORM configurations. To provide tooling support, we first propose static analysis approaches that detect performance anti-patterns in the source code and automatically rank the detected anti-pattern instances by their performance impact. Our study finds that resolving the detected anti-patterns improves application performance by 34% on average. We then discuss our experience and lessons learned from integrating our anti-pattern detection tool into industrial practice, in the hope that this experience can improve the industrial adoption of future research tools. However, because static analysis is prone to false positives and lacks runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such anti-patterns improves application performance by 17% on average. Finally, we propose an automated approach that uses both static and dynamic analysis to tune performance-related ORM configurations; our study shows that it improves application throughput by 27-138%. Through case studies on real-world applications, we show that all of the proposed approaches provide valuable support to developers and help improve application performance significantly.
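The thesis's specific anti-patterns are not listed in this abstract; purely as a hedged illustration of the kind of inefficient ORM access such analyses target, the following Python/SQLAlchemy sketch shows the classic "N+1 queries" pattern and its eager-loading fix. The model classes and session setup are hypothetical.

```python
# Hypothetical SQLAlchemy models illustrating an "N+1 queries" anti-pattern,
# one of the inefficient ORM access patterns such static/dynamic analyses target.
from sqlalchemy import ForeignKey, String, create_engine
from sqlalchemy.orm import (DeclarativeBase, Mapped, Session, mapped_column,
                            relationship, selectinload)

class Base(DeclarativeBase):
    pass

class Author(Base):
    __tablename__ = "author"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(50))
    books: Mapped[list["Book"]] = relationship(back_populates="author")

class Book(Base):
    __tablename__ = "book"
    id: Mapped[int] = mapped_column(primary_key=True)
    title: Mapped[str] = mapped_column(String(100))
    author_id: Mapped[int] = mapped_column(ForeignKey("author.id"))
    author: Mapped[Author] = relationship(back_populates="books")

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # Anti-pattern: lazy loading inside a loop issues one extra SELECT per author.
    for author in session.query(Author).all():
        print(author.name, [b.title for b in author.books])  # N+1 queries

    # Fix: fetch the related rows with a single additional query via eager loading.
    for author in session.query(Author).options(selectinload(Author.books)).all():
        print(author.name, [b.title for b in author.books])  # 2 queries total
```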

Relevance: 30.00%

Abstract:

This deliverable summarizes, validates and explains the purpose and concept behind the RAGE knowledge and innovation management platform as a self-sustaining Ecosystem supporting innovation processes in the Applied Gaming (AG) industry. The Ecosystem portal will be developed with particular attention to the demands and requirements of small and medium-sized game development companies, education providers and related stakeholders such as AG researchers and AG end-users. The innovation potential of the new platform rests on the following factors: a large, nearly complete collection of community-specific knowledge (e.g., content such as media objects, software components and best practices); a structured approach to knowledge access, search and browsing; and collaboration tools as well as social network analysis tools that foster efficient processes for creating knowledge and transforming it into marketable technology assets. The deliverable provides an overview of the current status and the remaining work, preceding the final version in month 48 of the RAGE project.

Relevance: 30.00%

Abstract:

The large upfront investments required for game development pose a severe barrier to the wider uptake of serious games in education and training. There is also a lack of well-established methods and tools that support game developers in preserving and enhancing the games' pedagogical effectiveness. The RAGE project, a Horizon 2020 funded research project on serious games, addresses these issues by making available reusable software components that aim to support the pedagogical qualities of serious games. To allow these game components to be easily deployed and integrated in a multitude of game engines, platforms and programming languages, RAGE has developed and validated a hybrid component-based software architecture that preserves component portability and interoperability. While a first set of software components is being developed, this paper presents selected examples to explain the overall concept of the system and its practical benefits. First, the Emotion Detection component uses the learners' webcams to capture their emotional states from facial expressions. Second, the Performance Statistics component is an add-on for learning analytics data processing, which allows instructors to track and inspect learners' progress without having to worry about the required statistical computations. Third, a set of language processing components supports the analysis of learners' textual inputs, facilitating comprehension assessment and prediction. Fourth, the Shared Data Storage component provides a technical solution for data storage, e.g. for player data or game world data, across multiple software components. The presented components are exemplary of the anticipated RAGE library, which will include up to forty reusable software components for serious gaming, addressing diverse pedagogical dimensions.
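The paper's component architecture is only summarized above; as a hedged sketch of the underlying idea (a component that talks to its host game solely through a small, engine-agnostic bridge interface, so it stays portable across engines), something like the following could be imagined. All class and method names are illustrative assumptions, not the RAGE API.

```python
# Hypothetical sketch of an engine-agnostic component contract: the component
# interacts with its host game only through a small "bridge" interface, so the
# same component can be reused across engines and platforms.
from abc import ABC, abstractmethod
from typing import Any


class GameBridge(ABC):
    """Services the host engine must provide to a component (assumed API)."""

    @abstractmethod
    def log(self, severity: str, message: str) -> None: ...

    @abstractmethod
    def store(self, key: str, value: Any) -> None: ...

    @abstractmethod
    def load(self, key: str) -> Any: ...


class PerformanceStatisticsComponent:
    """Illustrative add-on that tracks learner scores via the bridge only."""

    def __init__(self, bridge: GameBridge) -> None:
        self.bridge = bridge

    def record_score(self, learner_id: str, score: float) -> None:
        history = self.bridge.load(f"scores:{learner_id}") or []
        history.append(score)
        self.bridge.store(f"scores:{learner_id}", history)
        self.bridge.log("info", f"{learner_id}: mean={sum(history)/len(history):.2f}")
```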

Relevance: 30.00%

Abstract:

Background: The use of simulation in medical education is increasing, with students taught and assessed using simulated patients and manikins. Medical students at Queen's University Belfast are taught advanced life support cardiopulmonary resuscitation as part of the undergraduate curriculum, and teaching and feedback in these skills have been developed at Queen's University with high-fidelity manikins. This study aimed to evaluate the effectiveness of video compared to verbal feedback in the assessment of student cardiopulmonary resuscitation performance.

Methods: Final year students participated in this study using a high-fidelity manikin in the Clinical Skills Centre, Queen's University Belfast. Cohort A received verbal feedback only on their performance and cohort B received video feedback only. Video analysis using 'StudioCode' software was distributed to students. Each group returned for a second scenario and evaluation 4 weeks later. An assessment tool was created for performance assessment, which included individual skill and global score evaluation.

Results: One hundred and thirty-eight final year medical students completed the study; 62% were female and the mean age was 23.9 years. Students receiving video feedback showed significantly greater improvement in overall scores than those receiving verbal feedback (p = 0.006, 95% CI: 2.8-15.8). Individual skills, including ventilation quality, and the global score were significantly better with video feedback (p = 0.002 and p < 0.001, respectively) when compared with cohort A. There was a positive change in overall score for cohort B from session one to session two (p < 0.001, 95% CI: 6.3-15.8), indicating that video feedback significantly benefited skill retention. In addition, video feedback showed a significant improvement in the global score (p < 0.001, 95% CI: 3.3-7.2) and drug administration timing (p = 0.004, 95% CI: 0.7-3.8) of cohort B participants from session one to session two.

Conclusions: There is increased use of simulation in medicine but a paucity of published data comparing feedback methods in cardiopulmonary resuscitation training. Our study shows that the use of video feedback when teaching cardiopulmonary resuscitation is more effective than verbal feedback and enhances skill retention. This is one of the first studies to demonstrate the benefit of video feedback in cardiopulmonary resuscitation teaching.

Relevance: 30.00%

Abstract:

Mass spectrometry measures the mass of ions according to their mass-to-charge ratio. The technique is used in many fields and can analyze complex mixtures. Imaging mass spectrometry (IMS), a branch of mass spectrometry, enables the analysis of ions on a surface while preserving the spatial organization of the detected ions. To date, the most commonly studied IMS samples are plant or animal tissue sections. Among the molecules routinely analyzed by IMS, lipids have attracted considerable interest. Lipids are involved in disease and in the normal functioning of cells; they form the cell membrane and play several roles, such as regulating cellular events. Considering the involvement of lipids in biology and the ability of MALDI IMS to analyze them, we developed analytical strategies for sample handling and for the analysis of large lipid data sets.

Lipid degradation is a major issue in the food industry, and lipids in tissue sections are similarly at risk of degradation. Their degradation products can introduce artifacts into the IMS analysis, and the loss of lipid species can impair the accuracy of abundance measurements. Since oxidized lipids are also important mediators in the development of several diseases, their faithful preservation becomes critical. In multi-institutional studies, where samples are often shipped from one site to another, adapted and validated protocols and measures of degradation are required. Our main results are as follows: a time-dependent increase of oxidized phospholipids and lysophospholipids under ambient conditions, a decrease in lipids containing unsaturated fatty acids, and an inhibitory effect on these phenomena when sections are stored cold under N2. At ambient temperature and atmosphere, phospholipids are oxidized on a time scale typical of a normal IMS preparation (~30 minutes), and phospholipids are degraded into lysophospholipids on a time scale of several days. Validating a sample-handling method is all the more important when a larger number of samples is to be analyzed.

Atherosclerosis is a cardiovascular disease driven by the accumulation of cellular material on the arterial wall. Since atherosclerosis is a three-dimensional (3D) phenomenon, serial 3D IMS is useful, on the one hand, because it can localize molecules along the entire length of an atheromatous plaque and, on the other hand, because it can identify molecular mechanisms of plaque development or rupture. Serial 3D IMS faces specific challenges, many of which relate simply to 3D reconstruction and to the real-time interpretation of the molecular reconstruction. With these objectives in mind, and using lipid IMS to study atherosclerotic plaques from a human carotid artery and a mouse model of atherosclerosis, we developed open-source methods for 3D reconstruction of IMS data. Our methodology provides a means of obtaining high-quality visualizations and demonstrates a strategy for rapid interpretation of 3D IMS data through multivariate segmentation. The analysis of aortas from a mouse model was the starting point for method development because these are better-controlled samples. By correlating data acquired in positive and negative ionization modes, 3D IMS demonstrated an accumulation of phospholipids in the aortic sinuses. In addition, AgLDI IMS revealed a differential localization of free fatty acids, cholesterol, cholesteryl esters and triglycerides. Multivariate segmentation of the lipid signals from IMS analysis of a human carotid artery demonstrates a molecular histology correlated with the degree of arterial stenosis. This work helps to better understand the biological complexity of atherosclerosis and may potentially predict the evolution of certain clinical cases.

Colorectal cancer liver metastasis (CRCLM) is the metastatic disease of primary colorectal cancer, one of the most frequent cancers in the world. Evaluation and prognosis of CRCLM tumors are performed by histopathology, with a margin of error. We used lipid IMS to identify the histological compartments of CRCLM and to extract their lipid signatures. By exploiting these molecular signatures, we were able to determine a quantitative and objective histopathological score that correlates with prognosis. Furthermore, by dissecting the lipid signatures, we identified individual lipid species that discriminate the different histologies of CRCLM and that could potentially be used as biomarkers for determining response to therapy. More specifically, we found a series of plasmalogens and sphingolipids that distinguish two different types of necrosis (infarct-like necrosis, ILN, and usual necrosis, UN). ILN is associated with response to chemotherapy, whereas UN is associated with the normal functioning of the tumor.
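The open-source reconstruction and segmentation methods themselves are not reproduced in this abstract; as a hedged sketch of the multivariate segmentation step (clustering per-pixel spectra into a molecular-histology map), the following Python example uses PCA plus k-means on a synthetic data cube. The array shapes, normalization and clustering choices are assumptions for illustration only, not the thesis's pipeline.

```python
# Illustrative multivariate segmentation of an IMS dataset: each pixel has an
# intensity vector over m/z bins; clustering those vectors yields a segmentation
# map that can be read as a "molecular histology" image.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
height, width, n_mz = 64, 64, 500          # hypothetical image grid and m/z bins
cube = rng.random((height, width, n_mz))   # stand-in for measured intensities

pixels = cube.reshape(-1, n_mz)            # (n_pixels, n_mz) feature matrix
pixels = pixels / (pixels.sum(axis=1, keepdims=True) + 1e-12)  # TIC normalization

scores = PCA(n_components=10).fit_transform(pixels)   # reduce dimensionality
labels = KMeans(n_clusters=4, n_init=10).fit_predict(scores)

segmentation_map = labels.reshape(height, width)       # cluster index per pixel
print(np.bincount(labels))                              # pixels per segment
```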

Relevance: 30.00%

Abstract:

The use of the Design by Analysis (DBA) route is a modern trend in international pressure vessel and piping codes in mechanical engineering. However, to apply DBA to structures under variable mechanical and thermal loads, it is necessary to ensure that the plastic collapse modes, alternating plasticity and incremental collapse (with instantaneous plastic collapse as a particular case), are precluded. The tool available to achieve this is shakedown theory. Unfortunately, practical numerical applications of shakedown theory result in very large nonlinear optimization problems with nonlinear constraints. Precise, robust and efficient algorithms and finite elements to solve this problem in finite dimension are more recent achievements. However, to solve real problems at an industrial level, it is also necessary to consider more realistic material properties and to perform 3D analyses. Limited kinematic hardening is a typical property of commonly used steels and should be considered in realistic applications. In this paper, a new finite element with internal thermodynamic variables to model kinematic hardening materials is developed and tested. This element is a mixed ten-node tetrahedron and, through an appropriate change of variables, it is possible to embed it in a shakedown analysis software developed by Zouain and co-workers for elastic ideally-plastic materials, and then use it to perform 3D shakedown analysis in cases with limited kinematic hardening materials.
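The shakedown condition underlying such analyses is not written out in the abstract; as a hedged reminder, a Melan-type static shakedown statement with limited kinematic hardening can be sketched as follows (notation assumed, not taken from the paper):

```latex
% Melan-type static shakedown condition with limited kinematic hardening (sketch).
% \sigma^{E}(x,t): elastic stress response to any load in the load domain;
% \bar{\rho}(x): time-independent self-equilibrated residual stress field;
% \pi(x): time-independent backstress, bounded to model limited hardening.
\exists\,\bar{\rho}(x),\,\pi(x):\quad
f\!\left(\sigma^{E}(x,t)+\bar{\rho}(x)-\pi(x)\right)\le 0,
\qquad
g\!\left(\pi(x)\right)\le 0,
\qquad \forall\,x,\;\forall\,t .
```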

Relevance: 30.00%

Abstract:

Mathematical models are increasingly used in environmental science, which increases the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model, the Operational Street Pollution Model (OSPM®). To assess the predictive validity of the model, the data are split into an estimation and a prediction data set using two data-splitting approaches, and data preparation techniques (clustering and outlier detection) are analysed. The sensitivity analysis, which is part of the identifiability analysis, showed that some model parameters were significantly more sensitive than others. Applying the determined optimal parameter values was shown to successfully equilibrate the model biases among the individual streets and species. It was also shown that the frequentist approach applied for the uncertainty calculations underestimated the parameter uncertainties. The model parameter uncertainty was qualitatively assessed to be significant, and reduction strategies were identified.
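As a hedged, generic illustration of the iterative estimation-plus-sensitivity workflow described above (the OSPM formulation, parameters and data are not reproduced here), a least-squares calibration with a simple local sensitivity check might look like this in Python:

```python
# Generic sketch of parameter estimation plus a local sensitivity check; the
# model, parameters and data below are placeholders, not the OSPM formulation.
import numpy as np
from scipy.optimize import least_squares

def model(params, x):
    a, b = params
    return a * np.exp(-b * x)          # stand-in for the dispersion model

rng = np.random.default_rng(1)
x_obs = np.linspace(0.1, 5.0, 50)
y_obs = model([2.0, 0.7], x_obs) + rng.normal(0, 0.05, x_obs.size)  # synthetic data

# Split into estimation and prediction sets (here: a simple interleaved split).
est, pred = slice(0, None, 2), slice(1, None, 2)

fit = least_squares(lambda p: model(p, x_obs[est]) - y_obs[est], x0=[1.0, 1.0])
print("estimated parameters:", fit.x)

# Local sensitivity: relative change in predictions per 1% perturbation of each parameter.
base = model(fit.x, x_obs[pred])
for i, name in enumerate(["a", "b"]):
    p = fit.x.copy()
    p[i] *= 1.01
    sens = np.mean(np.abs(model(p, x_obs[pred]) - base) / np.abs(base))
    print(f"sensitivity of predictions to {name}: {sens:.3%}")
```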

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

Several authors have shown the need to understand the process of technological structuring in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to many organizations in many fields. The Brazilian software industry has peculiarities that distinguish it from industries located in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the process that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements through an exploratory, descriptive and explanatory multiple-case study. Eight software development companies from the Recife information technology cluster were visited; a questionnaire was applied and an interview with one of each firm's key professionals was conducted. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that proved very useful for the analysis based on the interpretation of the interviews. As a result, we found that the companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant, while Scrum is the most used methodology among the organizations following the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in a way that generates different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' education and background, because they relate closely to the software firms; this relationship was found to be dual and bilateral. Finally, the structuring level of the organizational field was also identified as low, which gives organizational actors room to act independently.

Relevance: 30.00%

Abstract:

PURPOSE: To quantitatively analyze and compare the fundoscopic features of fellow eyes of retinal angiomatous proliferation and of typical exudative age-related macular degeneration, and to identify possible predictors of neovascularization. METHODS: Retrospective case-control study. Seventy-nine fellow eyes of unilateral retinal angiomatous proliferation (n = 40) and typical exudative age-related macular degeneration (n = 39) were included. Fundoscopic features of the fellow eyes were assessed using digital color fundus photographs taken at the time of diagnosis of neovascularization in the first affected eye. Grading was performed by two independent graders using RetmarkerAMD, a computer-assisted grading software based on the International Classification and Grading System for age-related macular degeneration. RESULTS: Baseline total number and area (in square micrometers) of drusen in the central 1,000, 3,000, and 6,000 μm were considerably lower in the fellow eyes of retinal angiomatous proliferation, with statistically significant differences (P < 0.05) observed in virtually every location (1,000, 3,000, and 6,000 μm). A soft drusen (≥125 μm) area >510,196 μm² in the central 6,000 μm was associated with an increased risk of neovascularization (hazard ratio, 4.35; 95% confidence interval, 1.56-12.15; P = 0.005). CONCLUSION: Baseline fundoscopic features of the fellow eye differ significantly between retinal angiomatous proliferation and typical exudative age-related macular degeneration. A large area (>510,196 μm²) of soft drusen in the central 6,000 μm confers a significantly higher risk of neovascularization and should be considered a phenotypic risk factor.

Relevance: 30.00%

Abstract:

During this thesis work, a coupled thermo-mechanical finite element model (FEM) was built to simulate hot rolling in the blooming mill at Sandvik Materials Technology (SMT) in Sandviken. The blooming mill is the first in a long line of processes that continuously or ingot cast ingots are subjected to before becoming finished products.

The aim of this thesis work was twofold. The first was to create a parameterized finite element (FE) model of the blooming mill. The commercial FE software package MSC Marc/Mentat was used to create this model and the programming language Python was used to parameterize it. Second, two different pass schedules (A and B) were studied and compared using the model. The two pass series were evaluated with focus on their ability to heal centreline porosity, i.e. to close voids in the centre of the ingot.

This evaluation was made by studying the hydrostatic stress (σm), the von Mises stress (σeq) and the plastic strain (εp) in the centre of the ingot. From these parameters the stress triaxiality (Tx) and the hydrostatic integration parameter (Gm) were calculated for each pass in both series using two different transportation times (30 and 150 s) from the furnace. The relation between Gm and an analytical parameter (Δ) was also studied. This parameter is the ratio between the mean height of the ingot and the contact length between the rolls and the ingot, which is useful as a rule of thumb to determine the homogeneity or penetration of strain for a specific pass.

The pass series designed with fewer passes (B), many with greater reduction, was shown to achieve better void closure theoretically. It was also shown that a temperature gradient, which is the result of a longer holding time between the furnace and the blooming mill, leads to improved void closure.
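For reference, the two evaluation parameters named above are commonly defined as below; this is a hedged sketch of the standard void-closure formulation, and the thesis's exact definitions (sign convention, integration variable) may differ.

```latex
% Stress triaxiality and hydrostatic integration parameter (standard void-closure form;
% sigma_m: hydrostatic stress, sigma_eq: von Mises stress, eps_p: accumulated plastic strain).
T_x = \frac{\sigma_m}{\sigma_{eq}}, \qquad
G_m = \int_0^{\varepsilon_p} \frac{\sigma_m}{\sigma_{eq}}\,\mathrm{d}\varepsilon
    = \int_0^{\varepsilon_p} T_x\,\mathrm{d}\varepsilon .
```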

Relevance: 30.00%

Abstract:

This thesis work originated during a 7-month internship in the Tea&Coffe division of IMA S.p.A., a world leader in the production of automatic machines for packaging pharmaceutical, cosmetic and food products, tea and coffee. The activities carried out are part of a project launched by IMA to promote the transition to a necessarily more advanced industrial model, building on the ability to integrate and develop new interdisciplinary knowledge and technologies and, at the same time, to maximize the synergy between the technical and economic dimensions, leading to a real reduction of waste along the production, commercial and environmental chain. Modern production plants must in fact face a challenge that keeps them in constant pursuit of productivity, that is, a production that quickly and with wide margins repays the investments made; of product and process quality, that is, the guarantee of satisfying the customer's expressed and unexpressed expectations; and of safety, for the protection of the community and the environment. The objective of this work was to carry out a reliability study of an automatic tea-bag packaging machine in order to characterize its failure behaviour and subsequently to derive optimized maintenance policies that allow more efficient management. To this end, the machine was broken down into sub-assemblies and all spare parts requested over a ten-year period were examined, with the aim of identifying the critical components and performing a reliability analysis on them, and then, using software platforms such as Weibull++ and BlockSim, modelling their statistical distributions and simulating the behaviour of the system as a whole.
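The thesis used the commercial tools Weibull++ and BlockSim; purely as a hedged illustration of the underlying step, fitting a Weibull distribution to component failure times and evaluating reliability at a mission time can be sketched in Python as follows (the failure-time data are synthetic placeholders):

```python
# Illustrative Weibull fit for component failure times, standing in for the kind
# of analysis done with Weibull++; the data below are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
failure_hours = stats.weibull_min.rvs(c=1.8, scale=5000, size=60, random_state=rng)

# Fit shape (beta) and scale (eta); location fixed at 0, as is usual for lifetimes.
beta, loc, eta = stats.weibull_min.fit(failure_hours, floc=0)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")

# Reliability at a mission time t: R(t) = exp(-(t/eta)^beta).
t = 2000.0
reliability = np.exp(-(t / eta) ** beta)
print(f"R({t:.0f} h) = {reliability:.3f}")
```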

Relevance: 30.00%

Abstract:

Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at the various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which in many projects can reach the order of thousands of variability points, along with the dependency relationships that exist among them. This has raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome of this stage helped clarify the major limitations of existing tools. Based on these findings, a novel approach to managing variability was created that employs two main principles to support scalability. First, the separation-of-concerns principle was applied by creating multiple views of variability models to alleviate information overload (see the sketch below). Second, hyperbolic trees were used to visualise models (compared to the Euclidean-space trees traditionally used). The result is an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, to assess the work, an evaluation framework was designed based on established usability assessment best practices and standards, and was then used with several case studies to benchmark the performance of this work against other existing tools.
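As a hedged sketch of the separation-of-concerns idea mentioned above (multiple filtered views over one large variability model), a minimal data structure might look like the following; every name and field here is an illustrative assumption, not the thesis's tool or metamodel.

```python
# Minimal sketch of a variability model with named "views": each view exposes only
# the variation points relevant to one concern, to reduce information overload.
from dataclasses import dataclass, field

@dataclass
class VariationPoint:
    name: str
    concern: str                                       # e.g. "requirements", "design"
    requires: set[str] = field(default_factory=set)    # dependency relationships

@dataclass
class VariabilityModel:
    points: dict[str, VariationPoint] = field(default_factory=dict)

    def add(self, vp: VariationPoint) -> None:
        self.points[vp.name] = vp

    def view(self, concern: str) -> list[VariationPoint]:
        """Return only the variation points belonging to one concern."""
        return [vp for vp in self.points.values() if vp.concern == concern]

model = VariabilityModel()
model.add(VariationPoint("payment-gateway", "design", {"encryption"}))
model.add(VariationPoint("encryption", "implementation"))
model.add(VariationPoint("multi-currency", "requirements", {"payment-gateway"}))

print([vp.name for vp in model.view("design")])   # -> ['payment-gateway']
```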