929 results for source code analysis
Abstract:
The area west of the Antarctic Peninsula is a key region for studying and understanding the history of glaciation in the southern high latitudes during the Neogene with respect to variations of the western Antarctic continental ice sheet, variable sea-ice cover, induced eustatic sea level change, as well as consequences for the global climatic system (Barker, Camerlenghi, Acton, et al., 1999). Sites 1095, 1096, and 1101 were drilled on sediment drifts forming the continental rise to examine the nature and composition of sediments deposited under the influence of the Antarctic Peninsula ice sheet, which has repeatedly advanced to the shelf edge and subsequently released glacially eroded material on the continental shelf and slope (Barker et al., 1999). Gravity-driven mass-transport processes on the slope are responsible for downslope sediment transport by turbidity currents within a channel system between the drifts. Furthermore, bottom currents redistribute the sediments, which leads to the final build-up of the drift bodies (Rebesco et al., 1998). The high-resolution sedimentary sequences on the continental rise can be used to document the variability of continental glaciation and, therefore, allow us to assess the main factors that controlled sediment transport and depositional processes during glacial periods and their relationship to glacio-eustatic sea level changes. Site 1095 lies in 3840 m of water in a distal position on the northwestern lower flank of Drift 7, whereas Site 1096 lies in 3152 m of water in a more proximal position within Drift 7. Site 1101 is located at 3509 m water depth on the northwestern flank of Drift 4. All three sites have high sedimentation rates. The oldest sediments were recovered at Site 1095 (late Miocene; 9.7 Ma), whereas sediments of Pliocene age were recovered at Site 1096 (4.7 Ma) and at Site 1101 (3.5 Ma).
The purpose of this work is to provide a data set of bulk sediment parameters such as CaCO3, total organic carbon (TOC), and coarse-fraction mass percentage (>63 µm) measured on the sediments collected from the continental rise of the western Antarctic Peninsula (Holes 1095A, 1095B, 1096A, 1096B, 1096C, and 1101A). This information can be used to understand the complex depositional processes and their implications for variations in the climatic system of the western Pacific Antarctic margin since 9.7 Ma (late Miocene). Coarse-fraction particles (125-500 µm) from the late Pliocene and Pleistocene (4.0 Ma to recent) sediments recovered from Hole 1095A were microscopically analyzed to gather more detailed information about their variability and composition through time. These data can yield information about changes in the potential source regions of the glacially eroded material transported during repeated advances of the ice sheet across the shelf.
Abstract:
This paper presents a vision that allows the combined use of model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code generation process. Customization is performed via a configuration specification which describes the ways in which input is provided to the component, the ways in which run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for activities ranging from quality assurance and certification support to teaching and outreach, and it will be built exclusively with open source tools to increase impact. A preliminary prototype implementation is described.
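To make the idea of such a configuration specification concrete, here is a minimal sketch of what one could look like. All keys, value formats, and names (`sensor_speed`, `controller.mode`, the view names) are illustrative assumptions, not the paper's actual format:

```python
# Hypothetical configuration: how input reaches the component, which
# run-time values are observed, and which animation view each feeds.
CONFIG = {
    "inputs": {
        "sensor_speed": {"source": "udp", "port": 9000, "period_ms": 10},
    },
    "observations": {
        "state_variable": "controller.mode",
        "log_events": ["mode_change", "deadline_miss"],
    },
    "animation": {
        "controller.mode": {"view": "state_machine_diagram"},
        "deadline_miss": {"view": "timeline", "highlight": "red"},
    },
}

def observed_views(config):
    """List the animation views that observed items feed into."""
    return sorted({entry["view"] for entry in config["animation"].values()})
```

A code generator customized by such a specification would emit the component's input stubs, observation probes, and animation bindings from this one document.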
Abstract:
During the epoch when the first collapsed structures formed (6<z<50), our Universe went through an extended period of changes. Some of the radiation from the first stars and accreting black holes in those structures escaped and changed the state of the Intergalactic Medium (IGM). The era of this global phase change, in which the state of the IGM was transformed from cold and neutral to warm and ionized, is called the Epoch of Reionization. In this thesis we focus on numerical methods to calculate the effects of this escaping radiation. We start by considering the performance of the cosmological radiative transfer code C2-Ray. We find that although this code efficiently and accurately solves for the changes in the ionized fractions, it can yield inaccurate results for the temperature changes. We introduce two new elements to improve the code. The first element, an adaptive time step algorithm, quickly determines an optimal time step by considering only the computational cells relevant for this determination. The second element, asynchronous evolution, allows different cells to evolve with different time steps. An important constituent of methods to calculate the effects of ionizing radiation is the transport of photons through the computational domain, or "ray tracing". We devise a novel ray tracing method called PYRAMID, which uses a new geometry: the pyramidal geometry. This geometry shares properties with both the standard Cartesian and spherical geometries, which makes it on the one hand easy to use in conjunction with a Cartesian grid and on the other hand ideally suited to trace radiation from a radially emitting source. A time-dependent photoionization calculation requires not only tracing the path of photons but also solving the coupled set of photoionization and thermal equations. Several different solvers for these equations are in use in cosmological radiative transfer codes.
We conduct a detailed and quantitative comparison of four different standard solvers, evaluating how their accuracy depends on the choice of the time step. This comparison shows that their performance can be characterized by two simple parameters and that the solver used in C2-Ray generally performs best.
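The dependence of solver accuracy on the time step can be illustrated with a toy version of the problem. This sketch is not any of the actual solvers compared in the thesis; it contrasts a simple explicit update with the analytic solution of a model ionization equation dx/dt = gamma·(1 − x), where x is the ionized fraction and gamma a constant photoionization rate:

```python
import math

def exact_step(x, gamma, dt):
    """Analytic update of dx/dt = gamma * (1 - x); exact for any dt."""
    return 1.0 - (1.0 - x) * math.exp(-gamma * dt)

def euler_step(x, gamma, dt):
    """Forward-Euler update; its accuracy degrades as dt grows."""
    return x + gamma * (1.0 - x) * dt

def integrate(step, x0, gamma, t_end, dt):
    """Advance x from t = 0 to t_end with fixed steps of size dt."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        h = min(dt, t_end - t)
        x = step(x, gamma, h)
        t += h
    return x

ref = exact_step(0.0, 2.0, 1.0)  # exact ionized fraction at t = 1
err_coarse = abs(integrate(euler_step, 0.0, 2.0, 1.0, 0.5) - ref)
err_fine = abs(integrate(euler_step, 0.0, 2.0, 1.0, 0.01) - ref)
```

Refining the time step shrinks the explicit solver's error, while the analytic update is step-size independent; characterizing exactly this trade-off across solvers is what the comparison in the thesis quantifies.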
Abstract:
The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for the correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other. It is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the specification of the GUI, visible GUI behaviors, and static analysis of the GUI's program-code. GUIs are typically dynamic in nature: their user-visible state is guided by the underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based application and its underlying program-code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI testcases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled, and deployed to construct long GUI testcases. These testcases are able to drive the GUI into states that were not reachable using existing models. Implementation and evaluation have been conducted using GUITAR, a fully automated, open-source GUI testing framework.
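The core idea of chaining event-handler interactions into long testcases can be sketched in a few lines. This is a simplified illustration, not the thesis's actual model or GUITAR's API: the handler names and the shared-variable notion of "interaction" are hypothetical assumptions:

```python
from itertools import product

# Hypothetical event handlers and the program variables each reads or
# writes; two events "interact" when their handlers share a variable.
HANDLERS = {
    "open_file":  {"current_doc"},
    "edit_text":  {"current_doc", "dirty_flag"},
    "save_file":  {"current_doc", "dirty_flag"},
    "show_about": set(),
}

def interaction_graph(handlers):
    """Directed edges between events whose handlers touch shared state."""
    return {
        (a, b)
        for a, b in product(handlers, repeat=2)
        if a != b and handlers[a] & handlers[b]
    }

def long_testcases(edges, length):
    """Chain interacting event pairs into longer event sequences."""
    cases = [[a, b] for a, b in edges]
    for _ in range(length - 2):
        cases = [c + [b] for c in cases for a, b in edges if a == c[-1]]
    return cases

edges = interaction_graph(HANDLERS)
cases = long_testcases(edges, 4)  # event sequences of length 4
```

Events with no code-level interaction (here `show_about`) never enter a chain, which is how this style of model focuses the test budget on sequences likely to expose interaction defects.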
Abstract:
The purpose of this paper is twofold. Firstly, it presents a preliminary and ethnomethodologically-informed analysis of the way in which the growing structure of a particular program's code was ongoingly derived from its earliest stages. This was motivated by an interest in how the detailed structure of a completed program 'emerged from nothing' as a product of the concrete practices of the programmer within the framework afforded by the language. The analysis is broken down into three sections that discuss: the beginnings of the program's structure; the incremental development of that structure; and finally the code productions that constitute the structure and the importance of the programmer's stock of knowledge. The discussion attempts to understand and describe the emerging structure of the code rather than focus on generating 'requirements' for supporting the production of that structure. Due to time and space constraints, however, only a relatively cursory examination of these features was possible. Secondly, the paper presents some thoughts on the difficulties associated with the analytic (in particular, ethnographic) study of code, drawing on general problems as well as on issues arising from the difficulties and failings encountered in the analysis presented in the first section.
Abstract:
The flow rates of the drying and nebulizing gases, the heat block and desolvation line temperatures, and the interface voltage are potential electrospray ionization parameters, as they may enhance the sensitivity of the mass spectrometer. The conditions that give higher sensitivity for 13 pharmaceuticals were explored. First, a Plackett-Burman design was implemented to screen significant factors, and it was concluded that interface voltage and nebulizing gas flow were the only factors that influenced the intensity signal for all pharmaceuticals. This fractionated factorial design was projected to set a full 2² factorial design with center points. The lack-of-fit test proved to be significant. Then, a central composite face-centered design was conducted. Finally, a stepwise multiple linear regression and, subsequently, an optimization procedure were carried out. Two main drug clusters were found concerning the signal intensities of all runs of the augmented factorial design. p-Aminophenol, salicylic acid, and nimesulide constitute one cluster as a result of showing much higher sensitivity than the remaining drugs. The other cluster is more homogeneous, with some sub-clusters comprising one pharmaceutical and its respective metabolite. It was observed that the instrumental signal increased when both significant factors increased, with the maximum signal occurring when both codified factors were set at level +1. It was also found that, for most of the pharmaceuticals, the interface voltage influences the intensity of the instrument more than the nebulizing gas flow rate. The only exceptions are nimesulide, where the relative importance of the factors is reversed, and salicylic acid, where both factors influence the instrumental signal equally.
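The coded design matrix behind a full 2² factorial with center points, as used above, is straightforward to construct. This is a generic sketch of such a design in coded units (−1/+1 for factor levels, 0 for center runs), not the study's actual run order; the number of center points is an assumption:

```python
from itertools import product

def full_factorial_2k(k, n_center=3):
    """All 2^k corner runs in coded units, plus n_center center runs."""
    runs = [list(levels) for levels in product((-1, +1), repeat=k)]
    runs += [[0] * k for _ in range(n_center)]
    return runs

# Two factors: e.g. interface voltage and nebulizing gas flow.
design = full_factorial_2k(2, n_center=3)
# The run [+1, +1] is the corner where, per the abstract, the
# maximum instrumental signal was observed.
```

The center points allow the lack-of-fit test mentioned above: if the response at level 0 deviates from the average of the corners, a purely linear model is inadequate, motivating the follow-up central composite design.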
Abstract:
In this work, we use phylogenetic analyses to assess the putative origin of the Lisbon, Azorean, and Canarian populations. The identification of the origin of these three introduced populations is expected to provide insights into the invasion pattern of this species.
Abstract:
Hevea brasiliensis (Willd. ex Adr. Juss.) Muell.-Arg., the primary source of natural rubber, is native to the Amazon rainforest. The singular properties of natural rubber make it superior to and competitive with synthetic rubber for use in several applications. Here, we performed RNA sequencing (RNA-seq) of H. brasiliensis bark on the Illumina GAIIx platform, which generated 179,326,804 raw reads. A total of 50,384 contigs over 400 bp in size were obtained and subjected to further analyses. A similarity search against the non-redundant (nr) protein database returned 32,018 (63%) positive BLASTx hits. The transcriptome was annotated using the Clusters of Orthologous Groups (COG), Gene Ontology (GO), Kyoto Encyclopedia of Genes and Genomes (KEGG), and Pfam databases. A search for putative molecular markers was performed to identify simple sequence repeats (SSRs) and single nucleotide polymorphisms (SNPs); in total, 17,927 SSRs and 404,114 SNPs were detected. Finally, we selected sequences identified as belonging to the mevalonate (MVA) and 2-C-methyl-D-erythritol 4-phosphate (MEP) pathways, which are involved in rubber biosynthesis, to validate the SNP markers. A total of 78 SNPs were validated in 36 genotypes of H. brasiliensis. This new dataset represents a powerful information source for rubber tree bark genes and will be an important tool for the development of microsatellite and SNP markers for use in future genetic analyses such as genetic linkage mapping, quantitative trait loci identification, investigations of linkage disequilibrium, and marker-assisted selection.
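SSR (microsatellite) detection of the kind used in transcriptome mining amounts to scanning sequences for short motifs repeated in tandem. This is a hedged, generic sketch, not the study's actual pipeline; the motif lengths and repeat threshold are illustrative assumptions:

```python
import re

def find_ssrs(seq, min_repeats=4, motif_lens=(2, 3)):
    """Return (position, motif, repeat_count) for each tandem repeat
    of a di- or trinucleotide motif occurring >= min_repeats times."""
    hits = []
    for n in motif_lens:
        # Lookahead allows overlapping candidate start positions;
        # the backreference \2 matches further copies of the motif.
        pattern = re.compile(r"(?=(([ACGT]{%d})\2{%d,}))" % (n, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(2), len(m.group(1)) // n))
    return hits

# Toy sequence containing an (AG)5 and a (GAT)4 repeat.
hits = find_ssrs("TTAGAGAGAGAGCCGATGATGATGATGCC")
```

Production SSR miners additionally deduplicate overlapping reports and apply per-motif-length thresholds, but the regex above captures the core idea.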
Abstract:
A Plackett-Burman experimental design was applied to assess the robustness of GC×GC-qMS (comprehensive two-dimensional gas chromatography with fast quadrupole mass spectrometric detection) in the quantitative and qualitative analysis of volatile compounds from chocolate samples isolated by headspace solid-phase microextraction (HS-SPME). The study evaluated the influence of small changes around the nominal levels of six factors deemed important for peak areas (carrier gas flow rate, modulation period, ion source temperature, MS photomultiplier power, injector temperature, and interface temperature) and of four factors considered potentially influential on spectral quality (minimum and maximum limits of the scanned mass range, ion source temperature, and photomultiplier power). The analytes selected for the study were 2,3,5-trimethylpyrazine, 2-octanone, octanal, 2-pentylfuran, 2,3,5,6-tetramethylpyrazine, 2-nonanone, and nonanal. The factors found to be important for the robustness of the system were the photomultiplier power for quantitative analysis and the lower limit of the mass scanning range for qualitative analysis.
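A Plackett-Burman screening design like the one above is built from cyclic shifts of a tabulated generating row. The sketch below constructs the standard 12-run design, which accommodates up to 11 two-level factors (a study screening six factors would simply leave the remaining columns unused); it is a generic construction, not the paper's actual run table:

```python
def plackett_burman_12():
    """12-run Plackett-Burman design in coded -1/+1 levels: eleven
    cyclic shifts of the tabulated generating row, plus one final
    run with every factor at its low level."""
    gen = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]
    rows = [gen[i:] + gen[:i] for i in range(11)]  # cyclic shifts
    rows.append([-1] * 11)
    return rows

design = plackett_burman_12()
```

Each factor column is balanced (six runs high, six low) and orthogonal to every other column, which is what lets 11 main effects be screened in only 12 runs.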
Abstract:
The use of screening techniques, such as an alternative light source (ALS), is important for finding biological evidence at a crime scene. The objective of this study was to evaluate whether biological fluid (blood, semen, saliva, and urine) deposited on different surfaces changes as a function of the age of the sample. Stains were illuminated with a Megamaxx™ ALS System and photographed with a Canon EOS Utility™ camera. Adobe Photoshop™ was utilized to prepare photographs for analysis, and then ImageJ™ was used to record the brightness values of pixels in the images. Data were submitted to analysis of variance using a generalized linear mixed model with two fixed effects (surface and fluid). Time was treated as a random effect (through repeated measures) with a first-order autoregressive covariance structure. Means of significant effects were compared by the Tukey test. The fluorescence of the analyzed biological material varied depending on the age of the sample. Fluorescence was lower when the samples were moist. Fluorescence remained constant when the sample was dry, up to the maximum period analyzed (60 days), independent of the substrate on which the fluid was deposited, showing the novelty of this study. Therefore, the forensic expert can detect biological fluids at the crime scene using an ALS even several days after a crime has occurred.
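The measurement step described above reduces each photographed stain to a brightness statistic that can be compared across sample ages. This is a minimal generic sketch of that reduction, not the study's ImageJ workflow; the pixel values are hypothetical:

```python
def mean_brightness(pixels):
    """Average of 0-255 grayscale values over rows of pixels."""
    values = [v for row in pixels for v in row]
    return sum(values) / len(values)

# Hypothetical readings for the same stain region: moist (day 0)
# versus dry, reflecting the lower fluorescence of moist samples.
moist = [[40, 42], [41, 43]]
dry = [[90, 92], [91, 93]]
fluorescence_gain = mean_brightness(dry) - mean_brightness(moist)
```

Repeating this per fluid, surface, and sample age yields the repeated-measures data set analyzed with the mixed model described in the abstract.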
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física