929 results for source code analysis
Abstract:
Recent studies have shown that the haemodynamic responses to brief (<2 secs) stimuli can be well characterised as a linear convolution of neural activity with a suitable haemodynamic impulse response. In this paper, we show that the linear convolution model cannot predict measurements of blood flow responses to stimuli of longer duration (>2 secs), regardless of the impulse response function chosen. Modifying the linear convolution scheme to a nonlinear convolution scheme was found to provide a good prediction of the observed data. Whereas several studies have found a nonlinear coupling between stimulus input and blood flow responses, the current modelling scheme uses neural activity as an input, and thus implies nonlinearity in the coupling between neural activity and blood flow responses. Neural activity was assessed by current source density analysis of depth-resolved evoked field potentials, while blood flow responses were measured using laser Doppler flowmetry. All measurements were made in rat whisker barrel cortex after electrical stimulation of the whisker pad for 1 to 16 secs at 5 Hz and 1.2 mA (individual pulse width 0.3 ms).
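A minimal sketch of the two modelling schemes contrasted above: blood flow predicted by convolving a neural-activity time series with an impulse response, with and without a nonlinearity. The gamma-shaped kernel and the saturating output nonlinearity are illustrative assumptions, not the paper's fitted forms.

```python
import numpy as np

dt = 0.1                                   # seconds per sample
t = np.arange(0.0, 30.0, dt)

h = (t / 1.5) ** 2 * np.exp(-t / 1.5)      # illustrative gamma-shaped kernel
h /= h.sum()

def neural_train(duration_s, rate_hz=5.0):
    """Impulse train of 'neural events' for a stimulus of given duration."""
    u = np.zeros_like(t)
    step = int(round(1.0 / (rate_hz * dt)))  # samples between events
    u[: int(duration_s / dt) : step] = 1.0
    return u

u = neural_train(16.0)                     # 16-second stimulus at 5 Hz
flow_linear = np.convolve(u, h)[: t.size]  # linear convolution prediction

# One simple nonlinear variant: a saturating static nonlinearity applied to
# the linear prediction (a stand-in for, not the paper's, nonlinear scheme).
sat = 0.5
flow_nonlinear = sat * np.tanh(flow_linear / sat)
```

For brief stimuli the two predictions nearly coincide; for long stimulus trains the saturating variant plateaus where the linear prediction keeps growing, which is the qualitative failure mode the abstract describes.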
Abstract:
This article investigates the relation between stimulus-evoked neural activity and cerebral hemodynamics. Specifically, the hypothesis is tested that hemodynamic responses can be modeled as a linear convolution of experimentally obtained measures of neural activity with a suitable hemodynamic impulse response function. To obtain a range of neural and hemodynamic responses, rat whisker pad was stimulated using brief (less than or equal to2 seconds) electrical stimuli consisting of single pulses (0.3 millisecond, 1.2 mA) combined both at different frequencies and in a paired-pulse design. Hemodynamic responses were measured using concurrent optical imaging spectroscopy and laser Doppler flowmetry, whereas neural responses were assessed through current source density analysis of multielectrode recordings from a single barrel. General linear modeling was used to deconvolve the hemodynamic impulse response to a single "neural event" from the hemodynamic and neural responses to stimulation. The model provided an excellent fit to the empirical data. The implications of these results for modeling schemes and for physiologic systems coupling neural and hemodynamic activity are discussed.
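A minimal deconvolution sketch in the spirit of the general linear modelling used above: estimate an impulse response h such that flow ≈ neural * h by least squares on a Toeplitz design matrix. Signal names and the kernel length are illustrative.

```python
import numpy as np
from scipy.linalg import toeplitz

def estimate_irf(neural, flow, kernel_len):
    """Least-squares impulse response relating a neural input to blood flow."""
    first_row = np.zeros(kernel_len)
    first_row[0] = neural[0]
    # Column j of the design matrix is the neural series delayed by j samples.
    X = toeplitz(neural, first_row)
    h, *_ = np.linalg.lstsq(X, flow, rcond=None)
    return h

# e.g. h_est = estimate_irf(u, measured_flow, kernel_len=100)
```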
Energy exchange in a dense urban environment Part II: impact of spatial heterogeneity of the surface
Abstract:
City centres, characterised by spatial and temporal complexity, are challenging environments for micrometeorological research. This paper considers the impact of sensor location and heterogeneity of the urban surface on flux observations in the dense city centre of London, UK. Data gathered at two sites in close vicinity, but with different measurement heights, were analysed to investigate the influence of source area characteristics on long-term radiation and turbulent heat fluxes. Combining consideration of diffuse radiation with the effects of specular reflections, the non-Lambertian urban surface is found to affect measurements of surface albedo. Comparisons of observations from the two sites reveal that turbulent heat fluxes are similar under some flow conditions. However, the two sites mostly observe processes at different scales due to their differing measurement heights, highlighting the critical impact of sensor siting in urban areas. A detailed source area analysis is presented to investigate the surface controls influencing the energy exchanges at the different scales.
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
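A back-of-envelope sketch of the kind of "traditional" prediction described above: estimate per-timestep cost of a stencil loop from counted flops and memory traffic, taking the worse of a compute-bound and a memory-bound estimate (a simple roofline-style bound, which is my framing rather than the paper's exact method). The hardware numbers are placeholders, not HECToR's actual figures.

```python
def predict_time(nx, ny, flops_per_point, bytes_per_point,
                 peak_gflops, mem_bw_gbs):
    """Predicted seconds per timestep: max of compute-bound and
    memory-bound estimates for an nx-by-ny grid sweep."""
    points = nx * ny
    t_compute = points * flops_per_point / (peak_gflops * 1e9)
    t_memory = points * bytes_per_point / (mem_bw_gbs * 1e9)
    return max(t_compute, t_memory)

# Example: 1024x1024 shallow-water grid, ~30 flops and ~72 bytes per point
# (illustrative counts), on assumed per-core peak and bandwidth figures.
print(predict_time(1024, 1024, 30, 72, peak_gflops=9.2, mem_bw_gbs=10.6))
```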
Abstract:
This paper details a strategy for modifying the source code of a complex model so that the model may be used in a data assimilation context, and gives the standards for implementing a data assimilation code to use such a model. The strategy relies on keeping the model separate from any data assimilation code, and coupling the two through the use of Message Passing Interface (MPI) functionality. This strategy limits the changes necessary to the model and as such is rapid to program, at the expense of ultimate performance. The implementation technique is applied to different models with state dimension up to 2.7 × 10^8. The overheads added by using this implementation strategy in a coupled ocean-atmosphere climate model are shown to be an order of magnitude smaller than the addition of correlated stochastic random errors necessary for some nonlinear data assimilation techniques.
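A minimal sketch (using mpi4py) of the coupling idea: the model and the data assimilation code run as separate processes and exchange only the state vector over MPI. The process roles, tags, state layout and "dynamics"/"analysis" stand-ins are all illustrative assumptions, not the paper's interface.

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

STATE_DIM = 1000            # toy state; the paper reports up to 2.7e8
TAG_STATE, TAG_UPDATE = 1, 2

if rank == 0:               # "model" process: one exchange per timestep
    state = np.zeros(STATE_DIM)
    for step in range(10):
        state += 1.0                                # stand-in for model dynamics
        comm.Send(state, dest=1, tag=TAG_STATE)     # hand state to the DA code
        comm.Recv(state, source=1, tag=TAG_UPDATE)  # receive assimilated state
elif rank == 1:             # "data assimilation" process
    state = np.empty(STATE_DIM)
    for step in range(10):
        comm.Recv(state, source=0, tag=TAG_STATE)
        state *= 0.99                               # stand-in for an analysis update
        comm.Send(state, dest=0, tag=TAG_UPDATE)
```

Run with `mpiexec -n 2 python couple.py`. The model's own code changes only at the points where it sends and receives its state, which is the low-intrusion property the abstract emphasises.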
Abstract:
For users of climate services, the ability to quickly determine the datasets that best fit one's needs would be invaluable. The volume, variety and complexity of climate data make this judgment difficult. The ambition of CHARMe ("Characterization of metadata to enable high-quality climate services") is to give a wider interdisciplinary community access to a range of supporting information, such as journal articles, technical reports or feedback on previous applications of the data. The capture and discovery of this "commentary" information, often created by data users rather than data providers, and currently not linked to the data themselves, has not been significantly addressed previously. CHARMe applies the principles of Linked Data and open web standards to associate, record, search and publish user-derived annotations in a way that can be read both by users and automated systems. Tools have been developed within the CHARMe project that enable annotation capability for data delivery systems already in wide use for discovering climate data. In addition, the project has developed advanced tools for exploring data and commentary in innovative ways, including an interactive data explorer and comparator ("CHARMe Maps") and a tool for correlating climate time series with external "significant events" (e.g. instrument failures or large volcanic eruptions) that affect the data quality. Although the project focuses on climate science, the concepts are general and could be applied to other fields. All CHARMe system software is open-source, released under a liberal licence, permitting future projects to re-use the source code as they wish.
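A rough sketch of the kind of Linked Data "commentary" record this describes: an annotation linking a body (here a citation) to a target dataset, in the style of the W3C annotation model the project builds on. All URIs below are invented examples, not CHARMe identifiers.

```python
import json

annotation = {
    "@context": "http://www.w3.org/ns/anno.jsonld",
    "type": "Annotation",
    "body": "http://dx.doi.org/10.xxxx/example-paper",        # hypothetical DOI
    "target": "http://example.org/datasets/sst-climatology",  # hypothetical dataset URI
    "motivation": "linking",
}

# Serialised as JSON-LD, the record is readable by both people and machines.
print(json.dumps(annotation, indent=2))
```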
Abstract:
A numerical algorithm for fully dynamical lubrication problems based on the Elrod-Adams formulation of the Reynolds equation with mass-conserving boundary conditions is described. A simple but effective relaxation scheme is used to update the solution maintaining the complementarity conditions on the variables that represent the pressure and fluid fraction. The equations of motion are discretized in time using Newmark's scheme, and the dynamical variables are updated within the same relaxation process just mentioned. The good behavior of the proposed algorithm is illustrated in two examples: an oscillatory squeeze flow (for which the exact solution is available) and a dynamically loaded journal bearing. This article is accompanied by the ready-to-compile source code with the implementation of the proposed algorithm. [DOI: 10.1115/1.3142903]
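A minimal sketch of one Newmark time step as used above for the equations of motion, solved here by fixed-point iteration (the paper updates the dynamical variables inside its relaxation process instead). The force law, mass and parameter values are illustrative assumptions.

```python
def newmark_step(u, v, a, dt, force, mass, beta=0.25, gamma=0.5, n_iter=50):
    """Advance displacement u, velocity v and acceleration a by one step of
    Newmark's scheme for m * a = force(u) (average-acceleration parameters)."""
    a_new = a
    for _ in range(n_iter):
        u_new = u + dt * v + dt**2 * ((0.5 - beta) * a + beta * a_new)
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        a_new = force(u_new) / mass   # re-evaluate acceleration at the new state
    return u_new, v_new, a_new

# Toy example: a linear spring; a dynamically loaded bearing would instead
# supply the hydrodynamic force from the Reynolds/Elrod-Adams solve.
u, v, a = 1.0, 0.0, -1.0
for _ in range(100):
    u, v, a = newmark_step(u, v, a, dt=0.05, force=lambda x: -x, mass=1.0)
```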
Abstract:
The mobile operating system Android is today a rather dominant operating system on the mobile market, partly because of its openness and partly because of its wide availability, with both cheap and expensive phones on offer. However, Android has no predefined design pattern, so each developer must decide what to use, which can lead to unnecessarily complex application code that is hard to test and maintain. This work compares two design patterns, Passive Model View Controller (PMVC) and Model View View-Model (MVVM), to see which pattern results in the least complex code, using metrics computed with the Cyclomatic Complexity Number (CCN). The study follows the Design & Creation approach and aims to contribute: knowledge about which pattern to choose, and whether CCN can point out which parts of an application will take more or less time to test. During the study we also examined whether applying the Single Responsibility Principle (SRP) makes a difference, to see whether separated views change the complexity of the applications. In the end, the study shows that the complexity of small applications is very similar, but that even in small applications differences in code complexity can be seen, and that code complexity at the method level can provide guidelines for test cases.
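A rough sketch of the metric itself: a McCabe-style CCN is one plus the number of decision points in a method. The thesis measured Java/Android code; this Python version is only meant to illustrate the counting, and the set of decision nodes is a simplifying assumption.

```python
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def ccn(source: str) -> dict:
    """Map each function in `source` to an approximate cyclomatic complexity."""
    tree = ast.parse(source)
    results = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            decisions = sum(isinstance(n, DECISION_NODES)
                            for n in ast.walk(node))
            results[node.name] = 1 + decisions  # 1 + number of decision points
    return results

print(ccn("def f(x):\n    return 1 if x else 0"))  # {'f': 2}
```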
Abstract:
This work discusses the importance of image compression for industry: image processing and storage are a constant challenge at Petrobras, where the goal is to optimize storage time and store as many images and data as possible. We present an interactive system for processing and storing images in the wavelet domain, together with an interface for digital image processing. The proposal is based on the Peano function and the 1D wavelet transform. The storage system aims to optimize computational space, both for storage and for transmission of images: the Peano function is applied to linearize the images, and the 1D wavelet transform to decompose them. These steps extract the information relevant for storing an image at a lower computational cost and with a very small margin of error when comparing the original and processed images, i.e., little quality is lost in the processing system presented. The results obtained from the information extracted from the images are displayed in a graphical interface, through which the user can view and analyze the programs' results directly on the screen without dealing with the source code. The graphical interface and the programs for image processing via the Peano function and the 1D wavelet transform were developed in Java, allowing a direct exchange of information between them and the user.
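A sketch of the pipeline described above: linearize a 2D image along a curve, then apply a 1D wavelet step. For brevity, a boustrophedon ("snake") scan stands in for the Peano curve and a single Haar level stands in for the paper's wavelet; both substitutions are mine.

```python
import numpy as np

def snake_scan(img):
    """Flatten a 2D array row by row, reversing every other row so that
    consecutive samples stay spatially adjacent (stand-in for a Peano scan)."""
    rows = [row if i % 2 == 0 else row[::-1] for i, row in enumerate(img)]
    return np.concatenate(rows)

def haar_1d(signal):
    """One level of the 1D Haar transform: pairwise averages, then differences."""
    even, odd = signal[0::2], signal[1::2]
    return np.concatenate([(even + odd) / 2.0, (even - odd) / 2.0])

img = np.arange(16, dtype=float).reshape(4, 4)
coeffs = haar_1d(snake_scan(img))
# Small detail coefficients (second half of `coeffs`) can be thresholded
# away to compress the image with little visible loss.
```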
Abstract:
The present work is characterised as a research-formation study. The author analyses his trajectory as a dance professor, observing processes of transition in the perception of the body: from the mechanical body to the sensitive body. He seeks to highlight this new meaning of the body, and dance teaching and artistic experience as the matter that instructs itself. This research brings together the experience of two teachers, one as student (the researcher) and the other as master and professor (the collaborator), and intends to understand how this new meaning of the body, motivated by dance, was brought into each one's life. The autobiographical method and the research-formation methodology are used to analyse and identify common points between their self-formation processes. The life narratives of the researcher and the collaborator, as well as a semi-structured interview with the collaborator, served as investigation sources. The analysis followed the model suggested by Schütze (1977), presented by Bauer and Jovchelovitch (2004), guided by the five pillars of the study: the Subject, as the guiding point of the analysis; the Body, as a component and integral element of the individual and of dance; Dance, seen as a forming and guiding practice for the individuals researched; Complexity; and finally Instructor and Professional Formation, emphasising the self-formation process. The results show how dance changed the subjects' perception of their own bodies and of the corporal dimension as a whole, leading to a subject-actor view of the body rather than a strictly mechanical one. The teaching trajectory was marked by this revaluation of the body through Dance, bringing the individuals researched to a dialogical-reflexive teaching practice that fosters self-awareness, humanisation and autonomy, in the context of their background experiences and the environment in which they act.
Abstract:
The Global Positioning System (GPS) is a radionavigation system developed by the United States for military applications, but it has become very useful for civilian use. In recent decades Brazil has developed sounding rockets, and today many projects to build micro- and nanosatellites have appeared. Vehicles of this kind, called spacecraft or high-dynamics vehicles, can use GPS for autonomous location and trajectory control. Despite the huge number of GPS receivers available for civilian applications, they cannot be used in high-dynamics vehicles because of environmental issues (vibration, temperature, etc.) or imposed dynamic operating limits. Only a few nations hold the technology to build GPS receivers for spacecraft or high-dynamics vehicles, and they impose rules that make access to these receivers difficult. This project set out to build a GPS receiver, install it in the payload of a sounding rocket, and collect data to verify its correct operation under flight conditions. The receiver's internal software was available as source code and was tested on a software development platform named GPS Architect. Many organisations cooperated to support this project: AEB, UFRN, IAE, INPE and CLBI. After several phases (defining working conditions, selecting and sourcing the electronics, fabricating the printed circuit boards, assembly and assembly tests), the receiver was installed in a VS30 sounding rocket launched at the Centro de Lançamento da Barreira do Inferno in Natal/RN. Although position data from the receiver were collected for only the first 70 seconds of flight, these data confirm the correct operation of the receiver through comparison between its positioning data and the trajectory data from CLBI's tracking radar, ADOUR.
Abstract:
Model-oriented strategies have been used to facilitate product customisation in the software product line (SPL) context and to generate the source code of the derived products through variability management. Most of these strategies use a UML (Unified Modeling Language)-based model specification. Despite its wide application, UML-based model specification has some limitations: it is essentially graphical, it has deficiencies in precisely describing the semantic representation of the system architecture, and it produces large models, hampering the visualisation and comprehension of system elements. In contrast, architecture description languages (ADLs) provide graphical and textual support for the structural representation of architectural elements, their constraints and their interactions. This thesis introduces ArchSPL-MDD, a model-driven strategy in which models are specified and configured using the LightPL-ACME ADL. The strategy is associated with a generic process with systematic activities that enable customised source code to be generated automatically from the product model. The ArchSPL-MDD strategy integrates aspect-oriented software development (AOSD), model-driven development (MDD) and SPL, enabling the explicit modelling as well as the modularisation of variabilities and crosscutting concerns. The process is instantiated by the ArchSPL-MDD tool, which supports the specification of domain models (the focus of the development) in LightPL-ACME. ArchSPL-MDD uses the Ginga Digital TV middleware as a case study. To evaluate the efficiency, applicability, expressiveness and complexity of the ArchSPL-MDD strategy, a controlled experiment was carried out to compare the ArchSPL-MDD tool with the GingaForAll tool, which instantiates the process of the GingaForAll UML-based strategy. Both tools were used to configure the products of the Ginga SPL and to generate the product source code.
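A toy sketch of the product-derivation idea: a product model selects features, and product source code is assembled from per-feature templates. The feature names and templates below are invented for illustration; the real strategy works from LightPL-ACME models with a full MDD toolchain.

```python
FEATURE_TEMPLATES = {
    "media_player": "class MediaPlayer { void play() { /* ... */ } }",
    "photo_viewer": "class PhotoViewer { void show() { /* ... */ } }",
}

def derive_product(product_model: dict) -> str:
    """Generate a product's source code from its selected features."""
    units = [FEATURE_TEMPLATES[feature]
             for feature, selected in product_model.items() if selected]
    return "\n\n".join(units)

print(derive_product({"media_player": True, "photo_viewer": False}))
```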
Abstract:
In recent years, several middleware platforms for Wireless Sensor Networks (WSNs) have been proposed. Most of these platforms do not consider how to integrate components from generic middleware architectures. Many requirements must be considered in a middleware design for WSNs, among them the possibility of modifying the middleware's source code without changing its external behaviour. It is therefore desirable to have a generic middleware architecture able to offer an optimal configuration according to the requirements of the application. Adopting middleware based on a component model is a promising approach, as it allows better abstraction, low coupling, modularisation and built-in management features. Another problem in current middleware is the treatment of interoperability with networks external to the sensor network, such as the Web: most current middleware lacks the functionality to expose the data provided by the WSN via the World Wide Web, treating these data as Web resources accessible through protocols already adopted on the Web. This work therefore presents Midgard, a component-based middleware specifically designed for WSNs, which adopts the microkernel and REST architectural patterns. The microkernel pattern complements the component model, since the microkernel can be understood as a component that encapsulates the core system and is responsible for initialising core services only when needed, as well as removing them when they are no longer needed. REST, in turn, defines a standardised way for different applications to communicate based on standards adopted on the Web, making it possible to treat WSN data as Web resources accessible through protocols already adopted on the World Wide Web. The main goals of Midgard are: (i) to provide easy Web access to data generated by the WSN, exposing such data as Web resources following the principles of the Web of Things paradigm, and (ii) to allow the WSN application developer to instantiate only the specific services required by the application, thus generating a customised middleware and saving node resources. Midgard allows the WSN to be used as Web resources while providing a cohesive and weakly coupled software architecture, addressing interoperability and customisation. In addition, Midgard provides two services needed by most WSN applications: (i) configuration and (ii) inspection and adaptation. New services can be implemented by third parties and easily incorporated into the middleware thanks to its flexible and extensible architecture. In the assessment, Midgard provided interoperability between the WSN and external networks, such as the Web, as well as between different applications within a single WSN. We also assessed memory consumption, application image size, the size of messages exchanged in the network, response time, overhead and scalability. During the evaluation, Midgard proved to satisfy its goals and showed itself to be scalable without consuming resources prohibitively.
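A minimal sketch of the REST idea in the abstract: sensor readings exposed as web resources over HTTP. The URL scheme and payloads are invented, and a real WSN node would use a far lighter stack than Python's http.server; this only illustrates "WSN data as Web resources".

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

READINGS = {"node1/temperature": 23.5, "node1/humidity": 0.61}  # toy data

class SensorResource(BaseHTTPRequestHandler):
    def do_GET(self):
        key = self.path.strip("/")           # e.g. GET /node1/temperature
        if key in READINGS:
            body = json.dumps({"resource": key, "value": READINGS[key]})
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body.encode())
        else:
            self.send_error(404)             # unknown resource

HTTPServer(("", 8080), SensorResource).serve_forever()
```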
Abstract:
Exception Handling (EH) is a widely used mechanism for building robust systems, and the Software Product Line (SPL) context is no different. As EH mechanisms are embedded in most mainstream programming languages (such as Java, C# and C++), exception signalers and handlers can be found spread over code assets associated with common and variable SPL features. When exception signalers and handlers are added to an SPL in an unplanned way, one possible consequence is the generation of faulty family instances (i.e., instances in which common or variable features signal exceptions that are mistakenly caught inside the system). In this context, questions arise such as: how do exceptions flow between the optional and alternative features of an SPL? Aiming to answer such questions, this master's thesis conducted an exploratory study, based on code inspection and static code analysis, whose goal was to categorise the main ways in which exceptions flow in SPLs. To support the study, we developed a static analysis tool called PLEA (Product Line Exception Analyzer) that computes the exceptional flows of SPLs and categorises these flows according to the features associated with their handlers and signalers. Preliminary results showed that some types of exceptional flows have more potential to cause failures in the exceptional behaviour of SPLs.
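A toy illustration of the kind of flow PLEA categorises: an exception signalled by the code asset of one (variable) feature being caught by a handler in a different feature. The feature names and classes are invented; PLEA itself analyses Java SPLs statically.

```python
class MediaError(Exception):
    """Signalled inside the optional 'media' feature."""

def play_media():                  # code asset of the optional feature
    raise MediaError("codec not found")

def main_ui():                     # code asset of a common (core) feature
    try:
        play_media()
    except MediaError:             # handler and signaler belong to different
        print("fallback screen")   # features: one of the flow categories

main_ui()
```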
Abstract:
Software Product Lines (SPL) is a software engineering approach to developing families of software systems that share common features and differ in other features according to the requested products. Adopting the SPL approach can bring several benefits, such as cost reduction, product quality, productivity and time to market. On the other hand, the SPL approach brings new challenges to software evolution that must be considered. Recent research has explored and proposed automated approaches based on code analysis and traceability techniques for change impact analysis in the context of SPL development. These approaches have limitations, such as the customisation of the analysis functionalities to address different change impact analysis strategies, and the change impact analysis of fine-grained variability. This dissertation proposes a change impact analysis tool for SPL development, called Squid Impact Analyzer. The tool implements change impact analysis based on information from variability modelling, the mapping of variability to code assets, and existing dependency relationships between code assets. The tool is assessed through an experiment that compares its change impact analysis results with real changes applied across several evolution releases of an SPL for media management on mobile devices.
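A minimal sketch of dependency-based change impact analysis of the kind described above: given a mapping from each code asset to the assets that depend on it, compute everything transitively impacted by a change. The asset names and edges are invented examples.

```python
from collections import deque

DEPENDENTS = {                     # edges: asset -> assets that depend on it
    "MediaCodec.java": {"Player.java"},
    "Player.java": {"PlaylistUI.java", "RemoteControl.java"},
}

def impact_set(changed_assets):
    """Transitive closure of dependents: every asset a change may impact."""
    impacted, queue = set(), deque(changed_assets)
    while queue:
        asset = queue.popleft()
        for dep in DEPENDENTS.get(asset, ()):
            if dep not in impacted:
                impacted.add(dep)
                queue.append(dep)
    return impacted

print(impact_set({"MediaCodec.java"}))
# {'Player.java', 'PlaylistUI.java', 'RemoteControl.java'}
```

In an SPL setting the same traversal can be filtered by the variability model, e.g. reporting only impacted assets belonging to features selected in a given product.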