938 results for source analysis

Relevance: 30.00%

Abstract:

Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches in use today. Using ORM, objects in object-oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient database accesses (i.e., performance problems) in the source code, and we rank the detected problems by severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and that there is a need for tools that can help improve/tune the performance of ORM-based applications. We therefore propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers tune ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impact. Our study finds that by resolving the detected anti-patterns, application performance can be improved by 34% on average.
We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
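The static detection step can be illustrated with a toy scan for one classic ORM anti-pattern: a query issued inside a loop (the "N+1 query" pattern). The regexes and the `db.get` call below are hypothetical stand-ins for illustration, not the thesis's actual detector:

```python
import re

# Hypothetical sketch: flag ORM fetch-like calls that appear inside a for
# loop, a classic "query in loop" performance anti-pattern.
LOOP_RE = re.compile(r'^(\s*)for\b.*:')
FETCH_RE = re.compile(r'\.(get|filter|fetch)\(')

def detect_query_in_loop(source: str):
    """Return line numbers where a fetch-like call sits inside a for loop."""
    hits = []
    loop_indent = None
    for lineno, line in enumerate(source.splitlines(), 1):
        m = LOOP_RE.match(line)
        indent = len(line) - len(line.lstrip())
        if m:
            loop_indent = len(m.group(1))
        elif loop_indent is not None and line.strip():
            if indent <= loop_indent:
                loop_indent = None          # left the loop body
            elif FETCH_RE.search(line):
                hits.append(lineno)
    return hits

code = """\
for order in orders:
    customer = db.get(order.customer_id)
    print(customer.name)
"""
print(detect_query_in_loop(code))  # -> [2]
```

A real detector would work on an abstract syntax tree or call graph rather than regexes, but the principle (locate data-access calls in loop bodies, then rank instances by estimated cost) is the same.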


The mineral and chemical composition of alluvial Upper Pleistocene deposits from the Alto Guadalquivir Basin (SE Spain) was studied as a tool to identify the sedimentary and geomorphological processes controlling its formation. Sediments located upstream, in the north-eastern sector of the basin, are rich in dolomite, illite, MgO and K2O. Downstream, sediments at the base of the sequence are enriched in calcite, smectite and CaO, whereas the upper sediments have features similar to those from upstream. Elevated rare-earth element (REE) values can be related to low carbonate content in the sediments and to the increase of silicate material produced and concentrated during soil formation processes in the neighbouring source areas. Two mineralogical and geochemical signatures related to different sediment source areas were identified. Basal levels were deposited during a predominantly erosive initial stage and are mainly composed of calcite and smectite materials enriched in REE, derived from Neogene marls and limestones. The upper levels of the alluvial sequences, composed of dolomitic and illitic materials depleted in REE and derived from the surrounding Sierra de Cazorla area, were then deposited during a less erosive later stage of the fluvial system. This modification was responsible for the change in the mineralogical and geochemical composition of the alluvial sediments.


A small-scale sample nuclear waste package, consisting of a 28 mm diameter uranium penny encased in grout, was imaged by absorption contrast radiography using a single pulse exposure from an X-ray source driven by a high-power laser. The Vulcan laser was used to deliver a focused pulse of photons to a tantalum foil, in order to generate a bright burst of highly penetrating X-rays (with energy >500 keV), with a source size of <0.5 mm. BAS-TR and BAS-SR image plates were used for image capture, alongside a newly developed Thallium-doped Caesium Iodide scintillator-based detector coupled to CCD chips. The uranium penny was clearly resolved to sub-mm accuracy over a 30 cm2 scan area from a single shot acquisition. In addition, neutron generation was demonstrated in situ with the X-ray beam, with a single shot, thus demonstrating the potential for multi-modal criticality testing of waste materials. This feasibility study successfully demonstrated non-destructive radiography of encapsulated, high density, nuclear material. With recent developments of high-power laser systems towards 10 Hz operation, a laser-driven multi-modal beamline for waste monitoring applications is envisioned.


A grid-connected DFIG for wind power generation can affect power system small-signal angular stability in two ways: by changing the system load flow condition and by dynamically interacting with synchronous generators (SGs). This paper presents the application of the conventional method of damping torque analysis (DTA) to examine the effect of the DFIG’s dynamic interactions with SGs on small-signal angular stability. It shows that the effect is due to the dynamic variation of power exchange between the DFIG and the power system and can be estimated approximately by the DTA. Consequently, if the DFIG is modelled as a constant power source (i.e., zero dynamic interaction is assumed), the impact of the change of load flow brought about by the DFIG can be determined. Thus the total effect of the DFIG can be estimated by adding the DTA result to that of the constant power source model. Applications of the proposed DTA method are discussed. An example multi-machine power system with grid-connected DFIGs is presented to demonstrate and validate the proposed DTA method and the conclusions obtained in the paper.
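The superposition described above can be sketched numerically. Everything here is an invented illustration of the decomposition, not the paper's test system or its actual DTA computation:

```python
import math

# Illustrative sketch (invented numbers): the DFIG's total effect on an
# electromechanical oscillation mode is estimated as the effect of the
# changed load flow (DFIG modelled as a constant power source, i.e. zero
# dynamic interaction) plus the damping torque contribution of the DFIG's
# dynamic power exchange, as estimated by the DTA.
def total_dfig_effect(load_flow_effect, dta_interaction_effect):
    return load_flow_effect + dta_interaction_effect

def damping_ratio(sigma, omega):
    """Damping ratio of a mode with eigenvalue lambda = sigma +/- j*omega."""
    return -sigma / math.sqrt(sigma**2 + omega**2)

# A hypothetical mode at -0.2 +/- j6.0 rad/s:
print(round(damping_ratio(-0.2, 6.0), 4))  # -> 0.0333
```

The point of the decomposition is practical: the load-flow term can be obtained from a conventional small-signal study with the DFIG as a constant power injection, so only the interaction term requires the DTA.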


This paper studies the impact of in-phase and quadrature-phase imbalance (IQI) in two-way amplify-and-forward (AF) relaying systems. In particular, the effective signal-to-interference-plus-noise ratio (SINR) is derived for each source node, considering four different linear detection schemes, namely the uncompensated (Uncomp), maximal-ratio-combining (MRC), zero-forcing (ZF) and minimum mean-square error (MMSE) based schemes. For each proposed scheme, the outage probability (OP) is investigated over independent, non-identically distributed Nakagami-m fading channels, and exact closed-form expressions are derived for the first three schemes. Based on the closed-form OP expressions, an adaptive detection mode switching scheme is designed to minimize the OP of both sources. An important observation is that, regardless of the channel conditions and transmit powers, the ZF-based scheme should always be selected if the target SINR is larger than 3 (4.77 dB), while the MRC-based scheme should be avoided if the target SINR is larger than 0.38 (-4.20 dB).
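The two channel-independent switching rules stated in the abstract can be encoded directly. This sketch models only those two thresholds; between them, the actual scheme compares the closed-form OP expressions, which are not reproduced here:

```python
import math

# Sketch of the adaptive detection-mode switching rule, using only the two
# channel-independent thresholds from the abstract (linear target SINR):
# ZF always minimizes the outage probability above 3 (4.77 dB), and MRC
# should be avoided above 0.38 (-4.20 dB).
def candidate_detectors(target_sinr_linear):
    if target_sinr_linear > 3:
        return ["ZF"]
    schemes = ["Uncomp", "MRC", "ZF", "MMSE"]
    if target_sinr_linear > 0.38:
        schemes.remove("MRC")
    return schemes

def to_db(x):
    """Convert a linear power ratio to decibels."""
    return 10 * math.log10(x)

print(candidate_detectors(5.0))                    # -> ['ZF']
print(round(to_db(3), 2), round(to_db(0.38), 2))   # -> 4.77 -4.2
```

The dB conversion confirms the figures quoted in the abstract: 10·log10(3) ≈ 4.77 dB and 10·log10(0.38) ≈ -4.20 dB.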


Android is becoming ubiquitous and currently has the largest share of the mobile OS market with billions of application downloads from the official app market. It has also become the platform most targeted by mobile malware that are becoming more sophisticated to evade state-of-the-art detection approaches. Many Android malware families employ obfuscation techniques in order to avoid detection and this may defeat static analysis based approaches. Dynamic analysis on the other hand may be used to overcome this limitation. Hence in this paper we propose DynaLog, a dynamic analysis based framework for characterizing Android applications. The framework provides the capability to analyse the behaviour of applications based on an extensive number of dynamic features. It provides an automated platform for mass analysis and characterization of apps that is useful for quickly identifying and isolating malicious applications. The DynaLog framework leverages existing open source tools to extract and log high level behaviours, API calls, and critical events that can be used to explore the characteristics of an application, thus providing an extensible dynamic analysis platform for detecting Android malware. DynaLog is evaluated using real malware samples and clean applications demonstrating its capabilities for effective analysis and detection of malicious applications.
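One common way to turn a runtime log of the kind DynaLog produces into input for characterization is to count occurrences of monitored APIs per app. This is a hypothetical sketch; the API names and log format below are invented examples, not DynaLog's actual feature list:

```python
from collections import Counter

# Hypothetical sketch of the characterization step: count occurrences of
# each monitored API in a runtime log, yielding a per-app feature vector.
MONITORED = ["sendTextMessage", "getDeviceId", "exec", "openConnection"]

def feature_vector(log_lines):
    """Map a list of log lines to counts per monitored API, in a fixed order."""
    counts = Counter()
    for line in log_lines:
        for api in MONITORED:
            if api in line:
                counts[api] += 1
    return [counts[api] for api in MONITORED]

log = [
    "API: TelephonyManager.getDeviceId()",
    "API: SmsManager.sendTextMessage(...)",
    "API: TelephonyManager.getDeviceId()",
]
print(feature_vector(log))  # -> [1, 2, 0, 0]
```

Such vectors, collected over many apps, are what downstream analysis (e.g. a malware classifier) would consume.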


We analyze four extreme AGN transients to explore the possibility that they are caused by rare, high-amplitude microlensing events. These previously unknown type-I AGN are located in the redshift range 0.6-1.1 and show changes of > 1.5 magnitudes in the g-band on a timescale of ~years. Multi-epoch optical spectroscopy, from the William Herschel Telescope, shows clear differential variability in the broad line fluxes with respect to the continuum changes, and also evolution in the line profiles. In two cases a simple point-source, point-lens microlensing model provides an excellent match to the long-term variability seen in these objects. For both models the parameter constraints are consistent with the microlensing being due to an intervening stellar-mass object, but as yet there is no confirmation of the presence of an intervening galaxy. The models predict peak amplifications of 10.3 and 13.5 and Einstein timescales of 7.5 and 10.8 years, respectively. In one case the data also allow the size of the CIII] emitting region to be constrained, under some simplifying assumptions, to ~1.0-6.5 light-days, and give a lower limit on the size of the MgII emitting region of > 9 light-days (half-light radii). This CIII] radius is perhaps surprisingly small. In the remaining two objects there is spectroscopic evidence for an intervening absorber, but the extra structure seen in the lightcurves requires a more complex lensing scenario to be adequately explained.
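The point-source, point-lens model used for two of the transients follows the standard Paczyński light curve: magnification A(u) with the lens-source separation u(t) set by u0 (minimum separation in Einstein radii), t0 (time of peak) and tE (Einstein timescale). The parameter values below are illustrative, not fits from the paper:

```python
import math

# Standard point-source, point-lens (Paczynski) microlensing light curve.
def magnification(u):
    """Magnification at lens-source separation u (in Einstein radii)."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

def u_of_t(t, t0, u0, tE):
    """Separation at time t for minimum separation u0 at time t0."""
    return math.sqrt(u0 ** 2 + ((t - t0) / tE) ** 2)

# u0 = 0.1 gives a peak amplification of ~10, comparable to the 10.3
# reported for one of the modelled objects; tE = 7.5 yr as in that model.
# Far from the peak (|t - t0| >> tE) the magnification tends to 1:
print(round(magnification(u_of_t(t=40.0, t0=0.0, u0=0.1, tE=7.5)), 3))  # -> 1.002
```

The long Einstein timescales (7.5 and 10.8 years) are what make these events plausible explanations for multi-year, >1.5 mag variability.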


During the epoch when the first collapsed structures formed (6<z<50) our Universe went through an extended period of changes. Some of the radiation from the first stars and accreting black holes in those structures escaped and changed the state of the Intergalactic Medium (IGM). The era of this global phase change, in which the state of the IGM was transformed from cold and neutral to warm and ionized, is called the Epoch of Reionization. In this thesis we focus on numerical methods to calculate the effects of this escaping radiation. We start by considering the performance of the cosmological radiative transfer code C2-Ray. We find that although this code efficiently and accurately solves for the changes in the ionized fractions, it can yield inaccurate results for the temperature changes. We introduce two new elements to improve the code. The first element, an adaptive time step algorithm, quickly determines an optimal time step by considering only the computational cells relevant for this determination. The second element, asynchronous evolution, allows different cells to evolve with different time steps. An important constituent of methods to calculate the effects of ionizing radiation is the transport of photons through the computational domain, or "ray-tracing". We devise a novel ray-tracing method called PYRAMID, which uses a new geometry: the pyramidal geometry. This geometry shares properties with both the standard Cartesian and spherical geometries, which makes it on the one hand easy to use in conjunction with a Cartesian grid and on the other hand ideally suited to trace radiation from a radially emitting source. A time-dependent photoionization calculation not only requires tracing the path of photons but also solving the coupled set of photoionization and thermal equations. Several different solvers for these equations are in use in cosmological radiative transfer codes. We conduct a detailed and quantitative comparison of four different standard solvers, in which we evaluate how their accuracy depends on the choice of the time step. This comparison shows that their performance can be characterized by two simple parameters and that the C2-Ray solver generally performs best.
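The adaptive time-step idea can be sketched as follows. This is an illustration of the principle (take a fraction of the shortest relevant timescale, scanning only the cells that can constrain it), not the actual C2-Ray implementation; the cell fields and the flagging criterion are invented:

```python
# Illustrative sketch of an adaptive time-step selection: choose the global
# step as a fraction of the shortest ionization timescale, scanning only
# cells flagged as near an ionization front instead of the full grid.
def adaptive_time_step(cells, fraction=0.1):
    relevant = [c for c in cells if c["near_front"]]
    if not relevant:
        return None  # no front: the step is not constrained by ionization
    return fraction * min(c["t_ion"] for c in relevant)

grid = [
    {"t_ion": 5.0, "near_front": True},
    {"t_ion": 0.8, "near_front": True},
    {"t_ion": 0.1, "near_front": False},  # far from the front: ignored
]
print(round(adaptive_time_step(grid), 3))  # -> 0.08
```

Restricting the scan to relevant cells is what makes the determination quick, and the companion idea of asynchronous evolution goes one step further by letting each cell keep its own step rather than imposing the global minimum everywhere.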


An emergency lowering system for use in safety-critical crane applications is discussed. The system is used to safely lower the payload of a crane in case of an electric blackout. It is based on a backup power source, which is used to operate the crane while the regular supply is unavailable, and it enables both horizontal and vertical movements of the crane. Two different configurations for building the system are described: one with an uninterruptible power supply (UPS) or a diesel generator connected in parallel to the crane’s power supply, and one with a customized energy storage connected to the intermediate DC link in the crane. In order to size the backup power source, the power required during emergency lowering needs to be understood. A simulation model is used to study and optimize the power used during emergency lowering, and the model and optimizations are verified in a test hoist. Simulation results are presented with non-optimized and optimized controls for two example applications: a paper roll crane and a steel mill ladle crane. The optimizations are found to significantly reduce the power required for the crane movements during emergency lowering.
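The sizing question above can be approached with a back-of-the-envelope estimate. This is not the paper's simulation model (which optimizes full control trajectories); it is only the steady-state relation P = F·v/η, with invented example figures:

```python
# Back-of-the-envelope sizing sketch (invented figures): the electrical
# power a crane movement draws at constant speed is the mechanical power
# divided by the drive-train efficiency, P = F * v / eta.
def required_power_kw(force_n, speed_m_s, efficiency=0.85):
    return force_n * speed_m_s / efficiency / 1000.0

# e.g. traversing a 20 t trolley against ~2% rolling resistance at 0.5 m/s
rolling_force = 20_000 * 9.81 * 0.02   # N
print(round(required_power_kw(rolling_force, 0.5), 2))  # -> 2.31 (kW)
```

A real sizing exercise would add acceleration peaks, hoisting versus regenerative lowering, and auxiliary loads, which is precisely why the paper resorts to simulation and control optimization rather than a static formula.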


Localisation is the process of taking a product and adapting it to fit the culture in question. This usually involves making it both linguistically and culturally appropriate for the target audience. While there are many areas in video game translation where localisation is a factor, this study focuses on localisation changes in the personalities of fictional characters between the original Japanese version and the English localised version of the video game Final Fantasy XIV: A Realm Reborn and its expansion Heavensward for PC, PS3 and PS4. With this in mind, specific examples are examined using Satoshi Kinsui's work on yakuwarigo (role language) as the main framework for this study. Five non-playable characters were profiled and had each of their dialogues transcribed for a comparative analysis. This included the original Japanese text, the officially localised English text and my own translation of the original Japanese text. Each character was also given a short summary and a reasoned speculation on why these localisation changes might have occurred. The results show that some translations had been deliberately adjusted to ensure that the content did not cause problems for players overseas, plausibly because some of the Japanese role languages displayed by characters in this game could cause dispute among a Western audience. In conclusion, the study shows that localisation can be a difficult process that not only requires a translator's knowledge of the source and target languages, but also creativity in writing to ensure that players will have a comparable experience without causing a rift in the fanbase.


Mass spectrometry measures the mass of ions according to their mass-to-charge ratio. The technique is used in many fields and can analyze complex mixtures. Imaging mass spectrometry (IMS), a branch of mass spectrometry, enables the analysis of ions on a surface while preserving the spatial organization of the detected ions. To date, the most commonly studied IMS samples are plant or animal tissue sections. Among the molecules routinely analyzed by IMS, lipids have attracted considerable interest. Lipids are involved in disease and in the normal functioning of cells; they form the cell membrane and play several roles, such as regulating cellular events. Given the involvement of lipids in biology and the ability of MALDI IMS to analyze them, we developed analytical strategies for sample handling and for the analysis of large lipid datasets. Lipid degradation is a major concern in the food industry; likewise, the lipids in tissue sections are at risk of degrading. Their degradation products can introduce artifacts into the IMS analysis, and the loss of lipid species can compromise the accuracy of abundance measurements. Since oxidized lipids are also important mediators in the development of several diseases, their genuine preservation becomes critical. In multi-institutional studies, where samples are often shipped from one site to another, adapted and validated protocols and measures of degradation are needed.
Our main results are as follows: a time-dependent increase in oxidized phospholipids and lysophospholipids under ambient conditions, a decrease in lipids containing unsaturated fatty acids, and an inhibitory effect on these phenomena when sections are stored cold under N2. At ambient temperature and atmosphere, phospholipids are oxidized on the timescale of a typical IMS preparation (~30 minutes), and they are degraded into lysophospholipids over a timescale of several days. Validating a sample-handling method becomes all the more important when larger numbers of samples are analyzed. Atherosclerosis is a cardiovascular disease caused by the accumulation of cellular material on the arterial wall. Because atherosclerosis is a three-dimensional (3D) phenomenon, serial 3D IMS is useful both for its ability to localize molecules along the full length of an atheromatous plaque and for its potential to identify molecular mechanisms of plaque development or rupture. Serial 3D IMS faces specific challenges, many of which relate simply to 3D reconstruction and to the real-time interpretation of the molecular reconstruction. With these goals in mind, and using lipid IMS to study atherosclerotic plaques from a human carotid artery and from a mouse model of atherosclerosis, we developed open-source methods for reconstructing IMS data in 3D. Our methodology provides a means of obtaining high-quality visualizations and demonstrates a strategy for the rapid interpretation of 3D IMS data through multivariate segmentation. The analysis of aortas from a mouse model was the starting point for method development, as these samples are better controlled.
By correlating data acquired in positive- and negative-ionization modes, 3D IMS demonstrated an accumulation of phospholipids in the aortic sinuses. In addition, AgLDI IMS revealed a differential localization of free fatty acids, cholesterol, cholesteryl esters and triglycerides. Multivariate segmentation of the lipid signals from the IMS analysis of a human carotid artery shows a molecular histology that correlates with the degree of arterial stenosis. This work helps to better understand the biological complexity of atherosclerosis and may make it possible to predict the course of certain clinical cases. Colorectal cancer liver metastasis (CRCLM) is the metastatic disease of primary colorectal cancer, one of the most common cancers in the world. Assessment and prognosis of CRCLM tumours are performed by histopathology, with a margin of error. We used lipid IMS to identify the histological compartments of CRCLM and to extract their lipid signatures. By exploiting these molecular signatures, we were able to derive a quantitative, objective histopathological score that correlates with prognosis. Moreover, by dissecting the lipid signatures, we identified individual lipid species that discriminate between the different CRCLM histologies and that could potentially be used as biomarkers for determining response to therapy. More specifically, we found a series of plasmalogens and sphingolipids that distinguish two different types of necrosis (infarct-like necrosis, ILN, and usual necrosis, UN). ILN is associated with response to chemotherapy treatments, whereas UN is associated with normal tumour function.
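The multivariate-segmentation step can be sketched as follows: each IMS pixel is a vector of intensities at selected lipid m/z values, and clustering those vectors groups pixels into a "molecular histology". The tiny hand-rolled k-means and the data below are illustrative; the actual pipeline is the cited open-source method:

```python
# Hedged sketch of multivariate segmentation of IMS pixels: cluster each
# pixel's intensity vector (over selected lipid m/z values) with a minimal
# k-means. Data and the deterministic initialization are illustrative.
def kmeans(pixels, k, iters=20):
    centers = pixels[:k]  # simple deterministic initialization
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in pixels:
            nearest = min(
                range(k),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])),
            )
            groups[nearest].append(p)
        centers = [
            [sum(col) / len(g) for col in zip(*g)] if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return groups

# Two obvious pixel populations (e.g. plaque core vs. normal wall)
pixels = [[0.0, 0.1], [0.1, 0.0], [9.9, 10.0], [10.0, 9.8]]
g1, g2 = kmeans(pixels, 2)
print(len(g1), len(g2))  # -> 2 2
```

In the real workflow the vectors have hundreds of m/z channels and the cluster labels, mapped back to pixel coordinates, produce the segmented molecular-histology images described above.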


A high-resolution geochemical record of a 120 cm black shale interval deposited during the Coniacian-Santonian Oceanic Anoxic Event 3 (ODP Leg 207, Site 1261, Demerara Rise) has been constructed to provide detailed insight into rapid changes in deep ocean and sediment paleo-redox conditions. High contents of organic matter, sulfur and redox-sensitive trace metals (Cd, Mo, V, Zn), as well as continuous lamination, point to deposition under consistently oxygen-free and largely sulfidic bottom water conditions. However, rapid and cyclic changes in deep ocean redox are documented by short-term (~15-20 ka) intervals with decreased total organic carbon (TOC), S and redox-sensitive trace metal contents, and in particular pronounced phosphorus peaks (up to 2.5 wt% P) associated with elevated Fe oxide contents. Sequential iron and phosphate extractions confirm that P is dominantly bound to iron oxides and incorporated into authigenic apatite. Preservation of this Fe-P coupling in an otherwise sulfidic depositional environment (as indicated by Fe speciation and high amounts of sulfurized organic matter) may be unexpected, and provides evidence for temporarily non-sulfidic bottom waters. However, there is no evidence for deposition under oxic conditions. Instead, sulfidic conditions were punctuated by periods of anoxic, non-sulfidic bottom waters. During these periods, phosphate was effectively scavenged during precipitation of iron (oxyhydr)oxides in the upper water column, and was subsequently deposited and largely preserved at the sea floor. After ~15-25 ka, sulfidic bottom water conditions were re-established, leading to the initial precipitation of CdS, ZnS and pyrite. Subsequently, increasing concentrations of H2S in the water column led to extensive formation of sulfurized organic matter, which effectively scavenged particle-reactive Mo complexes (thiomolybdates). 
At Site 1261, sulfidic bottom waters lasted for ~90-100 ka, followed by another period of anoxic, non-sulfidic conditions lasting for ~15-20 ka. The observed cyclicity at the lower end of the redox scale may have been triggered by repeated incursions of more oxygenated surface- to mid-waters from the South Atlantic, resulting in a lowering of the oxic-anoxic chemocline in the water column. Alternatively, sea water sulfate might have been stripped by long-lasting high rates of sulfate reduction, removing the ultimate source for HS- production.


Cognitive deficits are central to psychosis and can be observed several years before the first psychotic episode. Impairment of episodic memory is frequently identified as one of the most severe deficits, both in patients and, before the onset of the pathology, in at-risk populations. In psychotic patients, the neuropsychological study of memory processes has shed light on the origin of this impairment. An alteration of source-memory processes, which allow a memory to be linked to its origin, has thus been identified and has been associated with the positive symptoms of psychosis, mainly hallucinations. However, source memory and the presence of subclinical symptoms have never been investigated before the onset of the illness in a population at high genetic risk of psychosis (HGR). Studying them would show whether source-memory deficits and hallucinatory experiences are associated with the onset of psychosis or whether they precede its emergence, in which case they would constitute early indicators of pathology. To address this question, this thesis pursued three main objectives: 1) to characterize source-memory functioning in an HGR population, in order to observe whether impairment of this process precedes the onset of the illness; 2) to assess whether subclinical manifestations of psychotic symptoms, namely hallucinatory experiences, can be identified in an at-risk population; and 3) to investigate whether source-memory functioning and subclinical symptomatology are linked in an at-risk population, as documented in patients.
The results of the thesis show that HGR individuals present a source-memory impairment specific to attributing the temporal context of memories, as well as memory distortions manifested as fragmented memories and impaired metacognition in memory. Subclinical hallucinatory experiences were also found to be more frequent in HGR individuals, and associations were documented between certain memory distortions and the propensity to hallucinate. These results identify new clinical and cognitive indicators of the risk of developing psychosis and raise hypotheses linking the internal-external attribution of the source of information to the development of the illness. The empirical, theoretical, methodological and clinical implications of the thesis are discussed.


Of late, the decrease in mineral oil supplies has stimulated research on the use of biomass as an alternative energy source. Climate change has brought problems such as increased drought and erratic rains. This, together with a rise in land degradation and the concomitant loss of soil fertility, has inspired the scientific world to look for alternative bio-energy species. Euphorbia tirucalli L., a tree with C3/CAM metabolism in its leaves/stem, can be cultivated on marginal, arid land and could be a good alternative source of biofuel. We analyzed a broad variety of E. tirucalli plants collected from different countries for their genetic diversity using AFLP. Physiological responses to induced drought stress were determined in a number of genotypes by monitoring growth parameters and the influence on photosynthesis. For future breeding of economically interesting genotypes, rubber content and biogas production were quantified. Cluster analysis shows that the studied genotypes divide into two groups, African and mostly non-African genotypes. Different genotypes respond significantly differently to various levels of water. Malate measurements indicate that CAM is induced in leaves following drought stress. Rubber content varies strongly between genotypes. An investigation of the biogas production capacities of six E. tirucalli genotypes reveals biogas yields higher than those from rapeseed but lower than those from maize silage.
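The genetic-diversity step can be sketched concretely: AFLP scoring gives each genotype a binary band-presence profile, and pairwise similarity between profiles feeds the cluster analysis that separated African from mostly non-African genotypes. The Dice coefficient below is a standard choice for such binary marker data; the profiles are invented:

```python
# Sketch of pairwise similarity between AFLP band profiles (0/1 vectors).
# The Dice coefficient counts shared bands relative to total bands present.
def dice_similarity(a, b):
    shared = sum(1 for x, y in zip(a, b) if x and y)
    return 2 * shared / (sum(a) + sum(b))

genotype_a = [1, 1, 0, 1, 0, 1]
genotype_b = [1, 0, 0, 1, 1, 1]
print(round(dice_similarity(genotype_a, genotype_b), 3))  # -> 0.75
```

A full analysis would compute this for all genotype pairs and pass the resulting similarity (or distance) matrix to a hierarchical clustering routine to obtain the two-group dendrogram.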