836 results for Data fusion applications
Abstract:
Current scientific applications are often structured as workflows and rely on workflow systems to compile abstract experiment designs into enactable workflows that utilise the best available resources. The automation of this step, and of the workflow enactment, hides the details of how results have been produced. Knowing how compilation and enactment occurred allows results to be reconnected with the experiment design. We investigate how provenance helps scientists connect their results with the actual execution that took place, their original experiment, and its inputs and parameters.
Abstract:
Provenance refers to the past processes that brought about a given (version of an) object, item or entity. By knowing the provenance of data, users can often better understand, trust, reproduce, and validate it. A provenance-aware application has the functionality to answer questions regarding the provenance of the data it produces, by using documentation of past processes. PrIMe is a software engineering technique for adapting application designs to enable them to interact with a provenance middleware layer, thereby making them provenance-aware. In this article, we specify the steps involved in applying PrIMe, analyse its effectiveness, and illustrate its use with two case studies, in bioinformatics and medicine.
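To make the idea of provenance-awareness concrete, the sketch below shows a minimal, hypothetical pattern (not PrIMe's actual API or middleware interface) in which an application records a documentation entry for each process it runs, so that questions about how a result was produced can later be answered.

```python
import json
import time
import uuid

# Minimal illustrative provenance store: each record documents one past
# process (its inputs, outputs and timing), so "how was this produced?"
# can be answered later. All names here are hypothetical, not the PrIMe API.
PROVENANCE_LOG = []

def record_process(name, inputs, outputs):
    """Append a documentation entry for a completed process."""
    PROVENANCE_LOG.append({
        "id": str(uuid.uuid4()),
        "process": name,
        "inputs": inputs,
        "outputs": outputs,
        "timestamp": time.time(),
    })

def align_sequences(seq_a, seq_b):
    # Stand-in for a real bioinformatics step; the result itself is irrelevant,
    # what matters is that its provenance is documented.
    result = {"score": min(len(seq_a), len(seq_b))}
    record_process("align_sequences", {"seq_a": seq_a, "seq_b": seq_b}, result)
    return result

align_sequences("ACGT", "ACGA")
print(json.dumps(PROVENANCE_LOG, indent=2))  # provenance queries are answered from this log
```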
Abstract:
In this research the 3DVAR data assimilation scheme is implemented in the numerical model DIVAST in order to optimize the performance of the numerical model by selecting an appropriate turbulence scheme and tuning its parameters. Two turbulence closure schemes, the Prandtl mixing length model and the two-equation k-ε model, were incorporated into DIVAST and examined with respect to their universality of application, complexity of solutions, computational efficiency and numerical stability. A square harbour with one symmetrical entrance subject to tide-induced flows was selected to investigate the structure of turbulent flows. The experimental part of the research was conducted in a tidal basin. A significant advantage of such a laboratory experiment is the fully controlled environment, where domain setup and forcing are user-defined. The research shows that the Prandtl mixing length model and the two-equation k-ε model, with default parameterization predefined according to literature recommendations, overestimate eddy viscosity, which in turn results in a significant underestimation of velocity magnitudes in the harbour. The assimilation of model-predicted velocities and laboratory observations significantly improves model predictions for both turbulence models by adjusting modelled flows in the harbour to match de-errored observations. 3DVAR also allows shortcomings of the numerical model to be identified and quantified. Such comprehensive analysis gives an optimal solution on which estimates of the numerical model parameters can be based. The process of turbulence model optimization by reparameterization and tuning towards an optimal state led to new constants that may potentially be applied to complex turbulent flows, such as rapidly developing or recirculating flows.
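For reference, the analysis state in 3DVAR is obtained by minimising the standard variational cost function (written in its usual generic notation rather than DIVAST-specific variables):

\[
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
+ \tfrac{1}{2}\big(H(\mathbf{x})-\mathbf{y}\big)^{\mathsf T}\mathbf{R}^{-1}\big(H(\mathbf{x})-\mathbf{y}\big),
\]

where \(\mathbf{x}_b\) is the model background (here, the DIVAST-predicted velocity field), \(\mathbf{y}\) the laboratory observations, \(H\) the observation operator, and \(\mathbf{B}\) and \(\mathbf{R}\) the background- and observation-error covariance matrices.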
Abstract:
It is well known that cointegration between the levels of two variables (labeled Yt and yt in this paper) is a necessary condition to assess the empirical validity of a present-value model (PV and PVM, respectively, hereafter) linking them. The work on cointegration has been so prevalent that it is often overlooked that another necessary condition for the PVM to hold is that the forecast error entailed by the model is orthogonal to the past. The basis of this result is the use of rational expectations in forecasting future values of variables in the PVM. If this condition fails, the present-value equation will not be valid, since it will contain an additional term capturing the (non-zero) conditional expected value of future error terms. Our article has a few novel contributions, but two stand out. First, in testing for PVMs, we advise splitting the restrictions implied by PV relationships into orthogonality conditions (or reduced rank restrictions) before additional tests on the values of parameters. We show that PV relationships entail a weak-form common feature relationship as in Hecq, Palm, and Urbain (2006) and in Athanasopoulos, Guillén, Issler and Vahid (2011), and also a polynomial serial-correlation common feature relationship as in Cubadda and Hecq (2001), which represent restrictions on dynamic models that allow several tests for the existence of PV relationships to be used. Because these relationships occur mostly with financial data, we propose tests based on generalized method of moments (GMM) estimates, where it is straightforward to propose robust tests in the presence of heteroskedasticity. We also propose a robust Wald test developed to investigate the presence of reduced rank models. Their performance is evaluated in a Monte Carlo exercise. Second, in the context of asset pricing, we propose applying a permanent-transitory (PT) decomposition based on Beveridge and Nelson (1981), which focuses on extracting the long-run component of asset prices, a key concept in modern financial theory as discussed in Alvarez and Jermann (2005), Hansen and Scheinkman (2009), and Nieuwerburgh, Lustig, and Verdelhan (2010). Here again we can exploit the results developed in the common cycle literature to easily extract permanent and transitory components under both long- and short-run restrictions. The techniques discussed herein are applied to long-span annual data on long- and short-term interest rates and on prices and dividends for the U.S. economy. In both applications we do not reject the existence of a common cyclical feature vector linking these two series. Extracting the long-run component shows the usefulness of our approach and highlights the presence of asset-pricing bubbles.
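As a reminder of the structure being tested, one common Campbell-Shiller-type formulation (standard notation, not the paper's exact parameterisation) links \(Y_t\) to expected future \(y_t\) and implies a restriction on the spread:

\[
Y_t = \theta(1-\lambda)\sum_{i=0}^{\infty}\lambda^{i}\,\mathbb{E}_t\,y_{t+i},
\qquad
S_t \equiv Y_t - \theta y_t = \theta\sum_{i=1}^{\infty}\lambda^{i}\,\mathbb{E}_t\,\Delta y_{t+i}.
\]

Under rational expectations the associated forecast error \(\varepsilon_{t+1}\) satisfies \(\mathbb{E}[\varepsilon_{t+1}\,z_t]=0\) for any \(z_t\) in the date-\(t\) information set; moment conditions of this form are what the GMM-based orthogonality tests exploit, in addition to the cointegration requirement on the levels.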
Abstract:
This paper illustrates the use of the marginal cost of public funds concept in three contexts. First, we extend Parry’s (2003) analysis of the efficiency effects of excise taxes in the U.K., primarily by incorporating the distortion caused by imperfect competition in the cigarette market and distinguishing between the MCFs for per unit and ad valorem taxes on cigarettes. Our computations show, contrary to the standard result in the literature, that the per unit tax on cigarettes has a slightly lower MCF than the ad valorem tax on cigarettes. Second, we calculate the MCF for a payroll tax in a labour market with involuntary unemployment, using the Shapiro and Stiglitz (1984) efficiency wage model as our framework. Our computations, based on Canadian labour market data, indicate that incorporating the distortion caused by involuntary unemployment raises the MCF by 25 to 50 percent. Third, we derive expressions for the distributionally-weighted MCFs for the exemption level and the marginal tax rate for a “flat tax”, such as the one that has been adopted by the province of Alberta. This allows us to develop a restricted, but tractable, version of the optimal income tax problem. Computations indicate that the optimal marginal tax rate may be quite high, even with relatively modest pro-poor distributional preferences.
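For orientation, the textbook benchmark for the marginal cost of public funds from a proportional labour (payroll) tax, before adding the unemployment or imperfect-competition distortions analysed in the paper, is

\[
\mathrm{MCF} = \frac{1}{\,1 - \dfrac{\tau}{1-\tau}\,\varepsilon\,},
\]

where \(\tau\) is the tax rate and \(\varepsilon\) the uncompensated labour-supply elasticity. The distortions discussed in the paper enter as additional wedges in expressions of this type, which is why the computed MCFs depart from this benchmark.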
Abstract:
Online geographic databases have been growing increasingly, as they have become a crucial source of information for both social networks and safety-critical systems. Since the quality of such applications is largely related to the richness and completeness of their data, it becomes imperative to develop adaptable and persistent storage systems, able to make use of several sources of information as well as enabling the fastest possible response from them. This work creates a shared and extensible geographic model, able to retrieve and store information from the major spatial sources available. A geographic-based system also has very high requirements in terms of scalability, computational power and domain complexity, causing several difficulties for a traditional relational database as the number of results increases. NoSQL systems provide valuable advantages for this scenario, in particular graph databases, which are capable of modeling vast amounts of inter-connected data while providing a very substantial increase in performance for several spatial requests, such as finding shortest-path routes and performing relationship lookups with high concurrency. In this work, we analyze the current state of geographic information systems and develop a unified geographic model, named GeoPlace Explorer (GE). GE is able to import and store spatial data from several online sources at a symbolic level in both a relational and a graph database, and several stress tests were performed in order to find the advantages and disadvantages of each database paradigm.
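As an illustration of why graph storage suits such queries, the sketch below models a few places and connections with networkx and runs a weighted shortest-path lookup; it is a simplified stand-in for the graph-database side (e.g. Cypher queries in a production system), not GE's actual schema or API.

```python
import networkx as nx

# Hypothetical symbolic geographic graph: nodes are places, edges are
# connections weighted by distance in kilometres.
g = nx.Graph()
g.add_edge("Station", "Market", length=1.2)
g.add_edge("Market", "Harbour", length=0.8)
g.add_edge("Station", "Harbour", length=3.5)
g.add_edge("Harbour", "Museum", length=0.4)

# Shortest-path routing: the kind of relationship lookup where graph models
# tend to outperform relational joins as the amount of linked data grows.
route = nx.shortest_path(g, "Station", "Museum", weight="length")
cost = nx.shortest_path_length(g, "Station", "Museum", weight="length")
print(route, cost)  # ['Station', 'Market', 'Harbour', 'Museum'], approx. 2.4
```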
Abstract:
A study was carried out to develop response surface models using broiler performance data retrieved from the literature in order to predict performance and to perform economic analyses. Nineteen studies published between 1995 and 2005 were retrieved using the systematic literature review method. Weight gain and feed conversion data were collected from eight studies that fulfilled the pre-established inclusion criteria, and a response surface model was adjusted using crude protein, environmental temperature, and age as independent variables. The models produced for weight gain (r² = 0.93) and feed conversion (r² = 0.85) were accurate, precise, and unbiased. Protein level, environmental temperature and age showed linear and quadratic effects on weight gain and feed conversion. There was no interaction between protein level and environmental temperature. Age and crude protein showed an interaction for weight gain and feed conversion, whereas an interaction between age and temperature was detected only for weight gain. It was possible to perform economic analyses to determine maximum profit as a function of the variables included in the model. It was concluded that response surface models are effective to predict the performance of broiler chickens and allow economic analyses to optimize profit.
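A generic second-order response surface consistent with the effects reported above (linear and quadratic terms plus the interactions found significant; the fitted coefficients are in the paper and are not reproduced here) has the form

\[
\widehat{WG} = \beta_0 + \beta_1\,CP + \beta_2\,T + \beta_3\,A
+ \beta_4\,CP^2 + \beta_5\,T^2 + \beta_6\,A^2
+ \beta_7\,(CP \times A) + \beta_8\,(T \times A),
\]

where CP is dietary crude protein, T is environmental temperature and A is age; the feed-conversion surface is analogous but, per the abstract, omits the \(T \times A\) term. Profit can then be maximized over such a surface, which is how the economic analyses are carried out.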
Abstract:
The synthesis of a poly(azo)urethane by fixing CO2 in a bis-epoxide followed by a polymerization reaction with an azodiamine is presented. Since isocyanate is not used in the process, it is termed a clean method and the polymers obtained are named NIPUs (non-isocyanate polyurethanes). Langmuir films were formed at the air-water interface and were characterized by surface pressure vs mean molecular area per monomeric unit (π-A) isotherms. The Langmuir monolayers were further studied by running stability tests and compression/expansion cycles (possible hysteresis) and by varying the compression speed of the monolayer formation, the subphase temperature, and the solvents used to prepare the spreading polymer solutions. The Langmuir-Blodgett (LB) technique was used to fabricate ultrathin films of a particular polymer (PAzoU). It is possible to grow homogeneous LB films of up to 15 layers, as monitored using UV-vis absorption spectroscopy. A higher number of layers can be deposited when PAzoU is mixed with stearic acid, producing mixed LB films. Fourier transform infrared (FTIR) absorption spectroscopy and Raman scattering showed that the materials do not interact chemically in the mixed LB films. Atomic force microscopy (AFM) and the micro-Raman technique (optical microscopy coupled to a Raman spectrograph) revealed that mixed LB films present a phase separation distinguishable at the micrometer or nanometer scale. Finally, mixed and neat LB films were successfully characterized using impedance spectroscopy at different temperatures, a property that may lead to future application as temperature sensors. Principal component analysis (PCA) was used to correlate the data.
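As a schematic of the final analysis step only, the snippet below projects impedance spectra measured at different temperatures onto their principal components with scikit-learn; the array shapes and values are illustrative placeholders, not the paper's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA

# Placeholder data: rows are spectra recorded at different temperatures,
# columns are impedance magnitudes at each measured frequency.
rng = np.random.default_rng(0)
temperatures = np.array([20, 30, 40, 50, 60])               # degrees C (illustrative)
spectra = rng.normal(size=(5, 200)) + temperatures[:, None] * 0.05

# Two components are usually enough to see whether spectra group by temperature.
pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)
print(scores.shape, pca.explained_variance_ratio_)
```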
Abstract:
Neural networks and the wavelet transform have recently been seen as attractive tools for developing efficient solutions for many real-world function approximation problems. Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes, so mathematical modelling is an important tool to support the development of the neural network area. In this article we introduce a series of mathematical demonstrations that guarantee the wavelet properties of the PPS functions. As an application, we show the use of PPS-wavelets in handwritten digit recognition problems through function approximation techniques.
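To illustrate the kind of function approximation involved (using a generic Ricker "Mexican hat" wavelet dictionary fitted by ordinary least squares, not the actual PPS-wavelet construction, whose details are in the article), a minimal sketch:

```python
import numpy as np

def ricker(x):
    """Generic 'Mexican hat' wavelet, standing in for a wavelet basis function."""
    return (1.0 - x**2) * np.exp(-0.5 * x**2)

# Target function sampled on a grid.
x = np.linspace(-4, 4, 400)
target = np.sin(2 * x) * np.exp(-0.1 * x**2)

# Dictionary of dilated and translated wavelets, fitted by least squares.
scales = [0.5, 1.0, 2.0]
shifts = np.linspace(-4, 4, 17)
design = np.column_stack([ricker((x - b) / a) for a in scales for b in shifts])
coeffs, *_ = np.linalg.lstsq(design, target, rcond=None)

approx = design @ coeffs
print("max abs error:", np.max(np.abs(approx - target)))
```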
Abstract:
Service provisioning is a challenging research area for the design and implementation of autonomic service-oriented software systems. It includes automated QoS management for such systems and their applications. Monitoring, diagnosis and repair are three key features of QoS management. This work presents a self-healing Web service-based framework that manages QoS degradation at runtime. Our approach is based on proxies. Proxies act on meta-level communications and extend the HTTP envelope of the exchanged messages with QoS-related parameter values. QoS data are filtered over time and analysed using statistical functions and a Hidden Markov Model. Detected QoS degradations are handled by the proxies. We evaluated our framework using an orchestrated electronic shop application (FoodShop).
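To sketch the detection idea, consider a two-state "healthy/degraded" Hidden Markov Model over discretized response times; the transition and emission probabilities below are illustrative choices, not the framework's calibrated values.

```python
import numpy as np

# States: 0 = healthy, 1 = degraded. Observations: 0 = fast, 1 = slow response.
transition = np.array([[0.95, 0.05],
                       [0.10, 0.90]])
emission = np.array([[0.9, 0.1],    # a healthy service rarely answers slowly
                     [0.2, 0.8]])   # a degraded service mostly answers slowly
initial = np.array([0.99, 0.01])

def filtered_state_probs(observations):
    """Forward algorithm: P(state_t | obs_1..t), normalised at each step."""
    belief = initial * emission[:, observations[0]]
    belief /= belief.sum()
    history = [belief]
    for obs in observations[1:]:
        belief = (transition.T @ belief) * emission[:, obs]
        belief /= belief.sum()
        history.append(belief)
    return np.array(history)

# Discretized latency samples extracted by a proxy (0 = fast, 1 = slow).
obs = [0, 0, 0, 1, 1, 1, 1, 0, 1, 1]
probs = filtered_state_probs(obs)
degraded_now = probs[-1, 1] > 0.5   # trigger a repair action when degradation is likely
print(probs[-1], degraded_now)
```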
Abstract:
Ordered intermetallic phases of Pt with several transition metals have been prepared and their electrocatalytic properties studied. In light of these tests it is proposed that these catalysts could be used as electrodes in fuel cells, as they combine an excellent capacity to adsorb organic fuels at the Pt sites with low susceptibility to being poisoned by intermediates and reaction products at the transition-metal sites. An experimental procedure used to obtain the four intermetallic phases Pt-M (M = Mn, Pb, Sb and Sn) is described. The phases thus produced were characterized by X-ray diffraction, scanning electron microscopy with surface analysis by energy-dispersive X-ray spectrometry, scanning tunneling microscopy and X-ray photoelectron spectroscopy. The data thus obtained support the conclusion that the method described here is highly effective for the preparation of Pt-M phases featuring a range of structural and electronic modifications that will allow a useful relation to be established between their physicochemical properties and predicted electrocatalytic activity.
Abstract:
Digital radiography for the inspection of welded pipes to be installed in deep-water offshore gas and oil pipelines, such as the pre-salt fields in Brazil, is investigated in this paper. The aim is to use digital radiography for nondestructive testing of welds, as it is already used in the medical, aerospace, security, automotive, and petrochemical sectors. Among the current options, the DDA (Digital Detector Array) is considered one of the best solutions to replace industrial film, as well as to increase sensitivity and reduce inspection cycle time. This paper shows the results of this new technique, comparing it to radiography with industrial film systems. Twenty test specimens of longitudinally welded pipe joints, specially prepared with artificial defects such as cracks, lack of fusion, lack of penetration, porosities and slag inclusions of varying dimensions and with six different base-metal wall thicknesses, were tested and a comparison of the techniques was made. These experiments verified the proposed rules for parameter definition and selection to control the required digital radiographic image quality, as described in the draft international standard ISO/DIS 10893-7. This draft is the first standard establishing the parameters for digital radiography of weld seams of welded steel pipes for pressure purposes to be used in gas and oil pipelines.
Abstract:
This paper presents a new static model for tubular fluorescent lamps (T12 bulb) operated at high frequencies. The main goal of this paper is to investigate the effects of ambient temperature and nominal switching frequency on the static characteristics of tubular fluorescent lamps. The methodology for obtaining the model is based on several two-dimensional mathematical regressions, used to describe the behavior of the fluorescent lamp according to different independent variables, namely the power processed by the lamp and the ambient temperature. In addition, the proposed model can be easily converted to a lamp equivalent resistance model, which can be useful for ballast designers. Finally, the curves obtained using the new model are compared to the corresponding experimental data in order to verify the accuracy of the proposed methodology.
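As a sketch of the underlying fitting step, the snippet below performs a generic two-dimensional polynomial regression of lamp equivalent resistance on lamp power and ambient temperature; the polynomial order and the data values are illustrative placeholders, not the paper's model or measurements.

```python
import numpy as np

# Placeholder measurements: lamp power (W), ambient temperature (deg C),
# and measured lamp equivalent resistance (ohm). Illustrative values only.
power = np.array([20.0, 25.0, 30.0, 35.0, 40.0, 25.0, 30.0, 35.0])
temp = np.array([15.0, 15.0, 15.0, 15.0, 15.0, 35.0, 35.0, 35.0])
resistance = np.array([410.0, 355.0, 315.0, 285.0, 262.0, 330.0, 296.0, 270.0])

# Second-order two-dimensional regression:
# R(P, T) = c0 + c1*P + c2*T + c3*P^2 + c4*T^2 + c5*P*T
design = np.column_stack([np.ones_like(power), power, temp,
                          power**2, temp**2, power * temp])
coeffs, *_ = np.linalg.lstsq(design, resistance, rcond=None)

def lamp_resistance(p, t):
    """Evaluate the fitted static model at power p and ambient temperature t."""
    return np.dot(coeffs, [1.0, p, t, p**2, t**2, p * t])

print(lamp_resistance(28.0, 25.0))
```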
Abstract:
To evaluate the trans-enamel and trans-dentinal cytotoxic effects of a 35% H2O2 bleaching gel on an odontoblast-like cell line (MDPC-23) after consecutive applications. Fifteen enamel/dentine discs were obtained from bovine central incisor teeth and placed individually in artificial pulp chambers. Three groups (n = 5 discs) were formed according to the following enamel treatments: G1: 35% H2O2 bleaching gel (15 min); G2: 35% H2O2 bleaching gel (15 min) + halogen light (20 s); G3: control (no treatment). After repeating the treatments three consecutive times, the extracts (culture medium + gel components that had diffused through the enamel/dentine discs) in contact with the dentine were collected and applied to previously cultured MDPC-23 cells (50 000 cells cm⁻²) for 24 h. Cell metabolism was evaluated by the MTT assay and data were analysed statistically (alpha = 5%; Kruskal-Wallis and Mann-Whitney U-test). Cell morphology was analysed by scanning electron microscopy. Cell metabolism decreased by 92.03% and 82.47% in G1 and G2, respectively. G1 and G2 differed significantly (P < 0.05) from G3. Regardless of halogen light activation, the application of the bleaching gel on the cultured odontoblast-like cells caused significantly more severe cytotoxic effects than those observed in the non-treated control group. In addition, significant morphological cell alterations were observed in G1 and G2. After three consecutive applications of a 35% H2O2 bleaching agent, the diffusion of the gel components through enamel and dentine caused severe toxic effects to cultured pulp cells.
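To show the statistical step in code form (a Kruskal-Wallis test across the three groups followed by pairwise Mann-Whitney U-tests at alpha = 5%; the viability values below are illustrative placeholders, not the study's measurements):

```python
from scipy import stats

# Placeholder MTT viability readings (% of control) for n = 5 discs per group.
g1 = [7.5, 8.1, 8.4, 7.9, 8.0]          # bleaching gel
g2 = [17.0, 17.8, 18.1, 17.2, 17.6]     # bleaching gel + halogen light
g3 = [99.0, 101.5, 100.2, 98.7, 100.6]  # untreated control

# Global nonparametric comparison across the three groups.
h_stat, p_global = stats.kruskal(g1, g2, g3)

# Pairwise follow-up comparisons against the control group (alpha = 0.05).
_, p_g1_g3 = stats.mannwhitneyu(g1, g3, alternative="two-sided")
_, p_g2_g3 = stats.mannwhitneyu(g2, g3, alternative="two-sided")
print(p_global, p_g1_g3, p_g2_g3)
```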