842 results for "data movement problem"


Relevância: 30.00%

Resumo:

As scientific workflows, and the data they operate on, grow in size and complexity, the task of defining how those workflows should execute (which resources to use, where data must be staged for processing, etc.) becomes proportionally more difficult. While "workflow compilers" such as Pegasus reduce this burden, a further problem arises: since the details of execution are now specified automatically, a workflow's results are harder to interpret, because they are partly due to the specifics of that execution. By automating the steps between the experiment design and its results, we lose the connection between them, hindering interpretation of the results. To reconnect the scientific data with the original experiment, we argue that scientists should have access to the full provenance of their data, including not only parameters, inputs and intermediary data, but also the abstract experiment, as refined into a concrete execution by the "workflow compiler". In this paper, we describe preliminary work on adapting Pegasus to capture the process of workflow refinement in the PASOA provenance system.
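The refinement chain described above can be pictured as a linked series of provenance records, each tying a concrete, compiler-produced step back to the abstract step it was refined from. The sketch below is illustrative only: the class names, fields, and the example step names are hypothetical and do not reflect the actual Pegasus or PASOA APIs.

```python
# Minimal sketch of a provenance chain linking an abstract workflow step
# to its concrete, compiler-refined execution. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class RefinementRecord:
    abstract_step: str          # step as written in the experiment design
    concrete_step: str          # step as refined by the "workflow compiler"
    refiner: str                # which component performed the refinement
    parameters: dict = field(default_factory=dict)

@dataclass
class ProvenanceChain:
    records: list = field(default_factory=list)

    def record(self, abstract_step, concrete_step, refiner, **params):
        self.records.append(
            RefinementRecord(abstract_step, concrete_step, refiner, params))

    def trace(self, concrete_step):
        """Walk back from a concrete step to the abstract design it came from."""
        return [r.abstract_step for r in self.records
                if r.concrete_step == concrete_step]

chain = ProvenanceChain()
chain.record("align(genome)", "align-job-42@cluster-a", "pegasus-planner",
             site="cluster-a", cores=16)
print(chain.trace("align-job-42@cluster-a"))  # ['align(genome)']
```

Querying the chain from a result backwards is what reconnects the data to the original experiment design.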

Relevância: 30.00%

Resumo:

XML has become an important medium for data exchange, and is frequently used as an interface to (i.e. a view of) a relational database. Although much work has been done on querying relational databases through XML views, the problem of updating relational databases through XML views has not received much attention. In this work, we give the first steps towards solving this problem. Using query trees to capture the notions of selection, projection, nesting, grouping, and heterogeneous sets found throughout most XML query languages, we show how XML views expressed using query trees can be mapped to a set of corresponding relational views. Thus, we transform the problem of updating relational databases through XML views into a classical problem of updating relational databases through relational views. We then show how updates on the XML view are mapped to updates on the corresponding relational views. Existing work on updating relational views can then be leveraged to determine whether or not the relational views are updatable with respect to the relational updates, and if so, to translate the updates to the underlying relational database. Since query trees are a formal characterization of view definition queries, they are not well suited for end-users. We therefore investigate how a subset of XQuery can be used as a top-level language, and show how query trees can serve as an intermediate representation of view definitions expressed in this subset.
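The core idea, an update against the XML view being rewritten as an update against the underlying relations, can be sketched in miniature. The schema, element paths, and mapping below are hypothetical illustrations, not the paper's query-tree algorithm:

```python
# Illustrative sketch (not the paper's actual algorithm): an update against a
# nested XML view is rewritten as an UPDATE on the underlying relation.
# The schema, element paths, and mapping are hypothetical.

# Mapping from XML view paths to (table, column) in the relational source.
VIEW_MAPPING = {
    "customers/customer/name":  ("customer", "name"),
    "customers/customer/email": ("customer", "email"),
}
VIEW_KEY = ("customers/customer/@id", ("customer", "id"))

def translate_update(xml_path, key_value, new_value):
    """Rewrite 'set <xml_path> = new_value where @id = key_value' as SQL."""
    table, column = VIEW_MAPPING[xml_path]
    key_table, key_column = VIEW_KEY[1]
    assert table == key_table, "update must stay within one mapped relation"
    return (f"UPDATE {table} SET {column} = %s WHERE {key_column} = %s",
            (new_value, key_value))

sql, params = translate_update("customers/customer/email", 7, "a@b.org")
print(sql)     # UPDATE customer SET email = %s WHERE id = %s
print(params)  # ('a@b.org', 7)
```

In the paper's setting this translation goes through relational views, so classical view-updatability results decide whether the rewritten update is legal before it reaches the base tables.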

Relevância: 30.00%

Resumo:

This paper has two original contributions. First, we show that the present value model (PVM hereafter), which has wide application in macroeconomics and finance, entails common cyclical feature restrictions in the dynamics of the vector error-correction representation (Vahid and Engle, 1993); something that has already been investigated in the VECM context by Johansen and Swensen (1999, 2011) but has not been discussed before with this new emphasis. We also provide the present value reduced rank constraints to be tested within the log-linear model. Our second contribution relates to forecasting time series that are subject to those long- and short-run reduced rank restrictions. The reason appropriate common cyclical feature restrictions may improve forecasting is that they provide natural exclusion restrictions, preventing the estimation of useless parameters that would otherwise increase forecast variance with no expected reduction in bias. We applied the techniques discussed in this paper to data known to be subject to present value restrictions, i.e. the online series maintained and updated by Shiller. We focus on three different data sets. The first includes the levels of interest rates with long and short maturities, the second includes the level of real price and dividend for the S&P composite index, and the third includes the logarithmic transformation of prices and dividends. Our exhaustive investigation of several different multivariate models reveals that better forecasts can be achieved when restrictions are applied to them. Moreover, imposing short-run restrictions produces forecast winners 70% of the time for target variables of PVMs and 63.33% of the time when all variables in the system are considered.
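For context on the restriction being tested: the simplest present value relation discounts the expected dividend stream at a constant rate, and under constant dividend growth the infinite sum collapses to the Gordon formula P = D0(1+g)/(r-g). The numeric illustration below is a minimal sketch of that relation, not the paper's VECM machinery:

```python
# Present value of a dividend stream with constant discount rate r and
# growth rate g: P = sum_{i>=1} D0*(1+g)^i / (1+r)^i, which converges to
# the Gordon formula P = D0*(1+g)/(r-g) when r > g.

def present_value_truncated(d0, r, g, horizon):
    """Direct truncated sum of discounted future dividends."""
    return sum(d0 * (1 + g) ** i / (1 + r) ** i for i in range(1, horizon + 1))

def gordon_price(d0, r, g):
    """Closed-form limit of the infinite sum (requires r > g)."""
    return d0 * (1 + g) / (r - g)

p_exact = gordon_price(1.0, 0.05, 0.02)            # 1.02 / 0.03 = 34.0
p_approx = present_value_truncated(1.0, 0.05, 0.02, 2000)
print(round(p_exact, 4), round(p_approx, 4))
```

The reduced rank restrictions the paper tests are the multivariate, time-series analogue of this tight link between prices and expected dividends.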

Relevância: 30.00%

Resumo:

This paper analyzes the demand and cost structure of the French market for academic journals, taking into account the market's intermediary role between researchers, who are both producers and consumers of knowledge. This two-sidedness echoes problems already observed in electronic markets (payment card systems, video game consoles, etc.), such as the chicken-and-egg problem: readers will not buy a journal if they do not expect its articles to be academically relevant, while researchers, who live under the mantra "publish or perish", will not submit to a journal with either limited public reach or a weak reputation. After merging several databases, we estimate an aggregated nested logit demand system combined simultaneously with a cost function. We identify the structural parameters of this market and find that price elasticities of demand are quite large and margins relatively low, indicating that this industry faces competitive constraints.
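For reference, the nested logit groups alternatives (here, journals) into nests and computes P(j) = P(nest) x P(j | nest), with an inclusive value term carrying the within-nest correlation. The sketch below shows the probability formula only; the utilities, nest structure, and nesting parameter are illustrative, not estimated values from the paper:

```python
# Nested logit choice probabilities: alternatives are grouped into nests;
# P(j) = P(nest(j)) * P(j | nest), with within-nest correlation governed by
# the nesting parameter lam (lam = 1 recovers plain multinomial logit).
# Utilities and nest structure here are illustrative, not estimated values.
import math

def nested_logit_probs(nests, lam):
    """nests: {nest_name: {alt: utility}}; lam: nesting parameter in (0, 1]."""
    # Inclusive value of each nest: IV = log(sum exp(u / lam))
    iv = {n: math.log(sum(math.exp(u / lam) for u in alts.values()))
          for n, alts in nests.items()}
    denom = sum(math.exp(lam * v) for v in iv.values())
    probs = {}
    for n, alts in nests.items():
        p_nest = math.exp(lam * iv[n]) / denom
        within = sum(math.exp(u / lam) for u in alts.values())
        for alt, u in alts.items():
            probs[alt] = p_nest * math.exp(u / lam) / within
    return probs

journals = {"natural_sciences": {"journal_A": 1.0, "journal_B": 0.5},
            "social_sciences":  {"journal_C": 0.8}}
p = nested_logit_probs(journals, lam=0.7)
print({k: round(v, 3) for k, v in p.items()})
```

With lam below 1, substitution is stronger between journals in the same nest than across nests, which is exactly why the nested form suits segmented journal demand.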

Relevância: 30.00%

Resumo:

INTRODUCTION: With the advent of Web 2.0, social networking websites like Facebook, MySpace and LinkedIn have become hugely popular. According to (Nilsen, 2009), the top five social networking websites reach almost 250 million unique users globally, with the time people spend on those networks having increased 63% between 2007 and 2008. Facebook alone saw a massive growth of 566% in number of minutes in the same period. Their appeal is clear: they enable users to easily form persistent networks of friends with whom they can interact and share content. Users then use those networks to keep in touch with current friends and to reconnect with old ones. However, online social network services have rapidly evolved into highly complex systems which contain a large amount of personally salient information derived from large networks of friends. Since that information varies from simple links to music, photos and videos, users not only have to deal with the huge amount of data generated by them and their friends, but also with the fact that it is composed of many different media forms. Users are presented with increasing challenges, especially as the number of friends on Facebook rises. One example is a simple task like finding a specific friend in a group of 100 or more: the user would most likely have to go through several pages and make several clicks until he finds the one he is looking for. Another example is a user with more than 100 friends, each making a status update or another action per day, resulting in 10 updates per hour to keep up with. That is plausible, especially since Facebook changed direction to rival Twitter by encouraging users to update their status as they do on Twitter. As a result, better visualizations are essential to present the web of information connected to a user.
The visualizations used nowadays on social networking sites haven't gone through major changes during their lifetimes. The sites have added more functionality and given more tools to their users, but the core of their visualization hasn't changed. The information is still presented in a flat way, in lists or groups of text and images which cannot show the extra connections between pieces of information. Those extra connections can give new meaning and insights to the user, allowing him to more easily see whether some content is important to him and what information is related to it. However, showing those extra connections while still allowing the user to navigate easily and grasp the needed information at a glance is difficult. The use of color coding, clusters and shapes then becomes essential to attain that objective. Taking into consideration the advances in computer hardware in the last decade and the software platforms available today, there is also the opportunity to take advantage of 3D: we are at a point where both the hardware and the software available are ready for the use of 3D on the web. With the extra dimension brought by 3D, visualizations can be constructed that show content and its related information on the same screen in a clear way, while also allowing a great deal of interactivity. Another opportunity to create better information visualizations presents itself in the form of open APIs, specifically the ones made available by the social networking sites. Those APIs allow developers to create their own applications or sites that take advantage of the huge amount of information in those networks; in particular, they open the door to the creation of new social network visualizations. Nevertheless, the third dimension is by itself not enough to create a better interface for a social networking website; there are challenges to overcome.
One of those challenges is to make the user understand what the system is doing during the interaction. Even though that is important in 2D visualizations, it becomes essential in 3D due to the extra dimension. To overcome that challenge it is necessary to apply the principles of animation defined by the artists at Walt Disney Studios (Johnston, et al., 1995). By applying those principles in the development of the interface, the actions of the system in response to user inputs become clear and understandable. Furthermore, a user study needs to be performed to reveal the users' main goals and motivations while navigating the social network. Those goals and motivations are important both for constructing an interface that reflects user expectations and for developing appropriate metaphors. Metaphors have an important role in the interface because, if correctly chosen, they help the user understand the elements of the interface instead of having to memorize them. The last challenge is the use of 3D visualization on the web: there have been several attempts to bring 3D to the web, mainly the various versions of VRML, which were doomed to failure by the hardware limitations of the time. In the last couple of years, however, there has been a movement to build the tools that finally allow developers to use 3D in a useful way, using X3D or OpenGL, but especially Flash. This thesis argues that there is a need for a better social network visualization that shows all the dimensions of the information connected to the user and allows him to move through it. But the new visualization has to possess several characteristics in order to present a real gain in usability to Facebook's users.
The first quality is to have friends at the core of its design, and the second is to use the metaphor of circles of friends to separate users into groups according to the degree of friendship. To achieve that, several methods have to be combined: the use of 3D to gain an extra dimension for presenting relevant information, and the use of direct manipulation to make the interface comprehensible, predictable and controllable. Moreover, animation has to be used to make all the action on the screen perceptible to the user. Additionally, with the opportunity given by 3D-enabled hardware, the Flash platform (through the Papervision3D engine) and the Facebook platform, everything is in place to make the visualization possible. Even so, there are challenges to overcome, like making the system's actions in 3D understandable to the user and creating correct metaphors that allow the user to understand the information and options available to him. This thesis document is divided into six chapters. Chapter 2 reviews the literature relevant to the work described in this thesis. Chapter 3 describes the design stage that resulted in the application presented here. Chapter 4 covers the development stage, describing the architecture and the components that compose the application. Chapter 5 explains the usability test process and presents and analyzes its results. Finally, Chapter 6 presents the conclusions arrived at in this thesis.
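The "circles of friends" metaphor can be sketched as a simple grouping of friends into concentric rings by an interaction score. The scoring rule, weights, and thresholds below are hypothetical, and the real application arranges friends inside a Papervision3D scene rather than in plain code:

```python
# Hypothetical sketch: place friends into concentric "circles" by how often
# the user interacts with them (messages, comments, photo tags, ...).
# Weights and thresholds are illustrative, not taken from the thesis.

INTERACTION_WEIGHTS = {"message": 3, "comment": 2, "photo_tag": 1}
CIRCLE_THRESHOLDS = [(20, "inner"), (5, "middle"), (0, "outer")]

def interaction_score(events):
    """events: list of interaction-type strings for one friend."""
    return sum(INTERACTION_WEIGHTS.get(e, 0) for e in events)

def circle_of(events):
    score = interaction_score(events)
    for threshold, name in CIRCLE_THRESHOLDS:
        if score >= threshold:
            return name
    return "outer"

friends = {
    "alice": ["message"] * 7,               # score 21 -> inner circle
    "bob":   ["comment", "photo_tag"] * 2,  # score 6  -> middle circle
    "carol": ["photo_tag"],                 # score 1  -> outer circle
}
print({name: circle_of(ev) for name, ev in friends.items()})
```

In the 3D scene, each ring would then be rendered at a different depth, so closer friends literally sit nearer to the user.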

Relevância: 30.00%

Resumo:

The aim of this thesis was to investigate the evolution of socio-occupational status in Rio Grande do Norte from 2001 to 2008, based on a characterization of the socio-economic situation of the state through an analysis of its labor market. The study specifically drew a comparison between the dynamics of the labor market in Rio Grande do Norte and in the capital city, Natal. From this perspective, the purpose was to relate the social division of labor to its effects on the socio-spatial division, represented at the "macro scale" by the federal unit and at the "micro scale" by the capital, the locus of economic and population concentration. Data on the labor market were drawn mainly from the PNAD/IBGE survey, which characterizes the labor market in many ways: working-age population, economically active population, and employed and unemployed people, classified by age, sex, color, education, income and social protection status. For the socio-occupational division, we followed the methodology used by the national research network based at IPPUR/UFRJ, called "Monitoring of the Metropolis", which gathered the twenty-four occupation groups found in the PNAD/IBGE into eight socio-occupational categories, according to the similarity between them. Two related discussions, crucial to the research problem, framed the socio-spatial analysis: the first concerned the influence of the hegemony of merchant capital on the labor market in Rio Grande do Norte; the second concerned the socio-economic relations between the territory and the occupation variable.
Lastly, the results indicated that Rio Grande do Norte, as a peripheral state, has suffered the devastating influence of the hegemony of purely commercial capital, whereby the "wealth" of capitalism is generated through the mere circulation of goods and services rather than through a productive process based on more advanced social relations of production. The state has a poorly developed economic structure, with a tertiary sector that has spread under-employment and disguised unemployment. Similarly, the agricultural sector stands as the example of the greatest social degradation of working conditions in the state. The secondary sector, in turn, was not immune to this precariousness; on the contrary, it confirmed that condition, with poor income levels, low education of the workforce and a high degree of social helplessness. Even the state capital, a fully urban space which appeared in a favorable condition compared to the rest of the state in practically all of the variables studied, reflected at the same time a structurally underdeveloped condition.

Relevância: 30.00%

Resumo:

INTRODUCTION: Malaria is an endemic disease in the Brazilian Amazon region, and the detection of possible risk factors may be of great interest to public health authorities. The objective of this article is to investigate the association between environmental variables and annual malaria records in the Amazon region using Bayesian spatio-temporal methods. METHODS: Spatio-temporal Poisson regression models were used to analyze annual malaria count data from 1999 to 2008, considering the presence of factors such as the deforestation rate. In a Bayesian approach, inferences were obtained by Markov chain Monte Carlo (MCMC) methods that simulated samples from the joint posterior distribution of interest. Discrimination between different models was also discussed. RESULTS: The proposed model suggested that the deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important for predicting malaria cases. CONCLUSIONS: It can be concluded that human development, population growth, deforestation, and the ecological changes associated with these factors are associated with increased malaria risk. It can also be concluded that the use of Poisson regression models that capture temporal and spatial effects in a Bayesian framework is a good strategy for modeling malaria count data.
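As a toy illustration of the MCMC approach described above (not the paper's spatio-temporal model), the sketch below fits a one-covariate Poisson regression, log(lambda_i) = b0 + b1*x_i, with a random-walk Metropolis sampler. The data are simulated and all settings are illustrative:

```python
# Toy random-walk Metropolis sampler for a Poisson regression
# log(lambda_i) = b0 + b1 * x_i. Simulated data; all settings illustrative.
import math
import random

random.seed(42)

def poisson_draw(lam):
    """Knuth's method, adequate for the moderate rates used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Simulate counts with true b0 = 1.0, b1 = 0.5 (x could be deforestation rate).
xs = [i / 10 for i in range(50)]
ys = [poisson_draw(math.exp(1.0 + 0.5 * x)) for x in xs]

def log_posterior(b0, b1):
    # Poisson log-likelihood plus vague N(0, 10^2) priors on both coefficients
    ll = sum(y * (b0 + b1 * x) - math.exp(b0 + b1 * x) for x, y in zip(xs, ys))
    return ll - (b0 ** 2 + b1 ** 2) / (2 * 10 ** 2)

b0, b1, step = 0.0, 0.0, 0.05
lp_cur = log_posterior(b0, b1)
samples = []
for it in range(6000):
    c0, c1 = b0 + random.gauss(0, step), b1 + random.gauss(0, step)
    lp_prop = log_posterior(c0, c1)
    if math.log(random.random()) < lp_prop - lp_cur:   # Metropolis accept
        b0, b1, lp_cur = c0, c1, lp_prop
    if it >= 3000:                                     # discard burn-in
        samples.append((b0, b1))

post_b1 = sum(s[1] for s in samples) / len(samples)
print(round(post_b1, 2))   # posterior mean of b1 (true value 0.5)
```

The paper's models add spatial and temporal random effects on top of this likelihood, but the sampling logic is the same: propose, compare posteriors, accept or reject.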

Relevância: 30.00%

Resumo:

This work summarizes the HdHr group of Hermitian integration algorithms for dynamic structural analysis applications. It proposes a procedure for their use when nonlinear terms are present in the equilibrium equation. The simple pendulum problem is solved as a first example and the numerical results are discussed. Directions to be pursued in future research are also mentioned. Copyright (C) 2009 H.M. Bottura and A. C. Rigitano.
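The nonlinear pendulum that serves as the paper's first example can be reproduced with any standard time integrator; the sketch below uses classical fourth-order Runge-Kutta (a generic stand-in, not the HdHr Hermitian scheme itself) on theta'' = -(g/L)*sin(theta):

```python
# Nonlinear simple pendulum theta'' = -(g/L) * sin(theta), integrated with
# classical RK4. This is a generic stand-in, not the paper's HdHr algorithm.
import math

G, L = 9.81, 1.0

def deriv(state):
    theta, omega = state
    return (omega, -(G / L) * math.sin(theta))

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def energy(state):
    """Total mechanical energy per unit mass; conserved by the exact dynamics."""
    theta, omega = state
    return 0.5 * (L * omega) ** 2 + G * L * (1 - math.cos(theta))

state, dt = (math.radians(30.0), 0.0), 1e-3
e0 = energy(state)
for _ in range(10000):                 # integrate 10 seconds
    state = rk4_step(state, dt)
print(abs(energy(state) - e0))         # energy drift stays tiny at this step size
```

Tracking the energy drift is a simple way to compare any candidate integrator against the HdHr family on this benchmark.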

Relevância: 30.00%

Resumo:

From the 1980s, with technological development, globalization, and a context of increasingly urgent demands, an international movement emerged to modernize state structures. Driven by the victory of conservative governments in Britain and the U.S., this reform discourse only reached Brazil in the 1990s, during the government of Fernando Henrique Cardoso. Thus, in view of the recent movement of states to implement this reform agenda in their own structures, this research sought to identify the elements that made possible the attempt to modernize the administrative structure of the state of Piauí in 2003, given the political and administrative trajectory the state had followed. To clarify the problem studied here, a descriptive and exploratory case study was carried out, using document research and semi-structured interviews as data-gathering techniques. As its analytical lens this study used historical and sociological neo-institutionalism, through which it sought to identify the critical juncture at which the Administrative Reform of Piauí took place, the process of breaking with the political and administrative trajectory previously followed, and the isomorphic mechanisms that allowed this reform discourse to reach the state; mechanisms that drive the homogenization of the organizational field. In general, it appears that the search by Brazilian states for new management patterns and technologies was due to the fiscal crisis in which the states found themselves, forcing them to seek alternative management models.
The diffusion of the New Public Management agenda to the states became possible, among other factors, due to the new scenario of the Brazilian federal system in the second half of the 1990s, characterized by greater horizontal articulation between states; through the mechanisms of isomorphic institutional change, the states were able to absorb the reformist discourse of the 1990s. However, due to the specificities of each region, these state experiences occurred unevenly. In the case of Piauí, the Administrative Reform only became possible due to the rearrangement of political forces in the state and the mechanisms of isomorphic institutional change, which allowed the state government to absorb the reform discourse in 2003.

Relevância: 30.00%

Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância: 30.00%

Resumo:

Objective: The purpose of this study was to compare the tooth movement that occurs during the processing of maxillary complete dentures with 3 different base thicknesses, using 2 investment methods and microwave polymerization. Methods: A sample of 42 denture models was randomly divided into 6 groups (n = 7), with base thicknesses of 1.25, 2.50, and 3.75 mm and gypsum or silicone flask investment. Points were demarcated on the distal surface of the second molars and on the back of the gypsum cast at the alveolar ridge level to allow linear and angular measurement using AutoCAD software. The data were subjected to two-way analysis of variance and Tukey and Fisher post hoc tests. Results: Angular analysis of the methods and their interactions showed a statistical difference (P = 0.023) when the magnitudes of molar inclination were compared. Tooth movement was greater for thin-based prostheses, 1.25 mm (-0.234), versus thick, 3.75 mm (0.2395), with antagonistic behavior. Prosthesis investment with silicone (0.053) showed greater vertical change compared with the gypsum investment (0.032). There was a difference between the points of analysis, demonstrating that the changes were not symmetric. Conclusions: All groups evaluated showed change in the position of artificial teeth after processing. The complete denture with a thin base (1.25 mm) and silicone investment showed the worst results, whereas the intermediate thickness (2.50 mm) was demonstrated to be ideal for the denture base.

Relevância: 30.00%

Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância: 30.00%

Resumo:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevância: 30.00%

Resumo:

We re-analyse the non-standard interaction (NSI) solutions to the solar neutrino problem in the light of the latest solar as well as atmospheric neutrino data. The latter require oscillations (OSC), while the former do not. Within such a three-neutrino framework the solar and atmospheric neutrino sectors are connected not only by the neutrino mixing angle theta_13, constrained by reactor and atmospheric data, but also by the flavour-changing (FC) and non-universal (NU) parameters accounting for the solar data. Since the NSI solution is energy-independent the spectrum is undistorted, so that the global analysis observables are the solar neutrino rates in all experiments as well as the Super-Kamiokande day-night measurements. We find that the NSI description of solar data is slightly better than that of the OSC solution and that the allowed NSI regions are determined mainly by the rate analysis. By using a few simplified ansatzes for the NSI interactions we explicitly demonstrate that the NSI values indicated by the solar data analysis are fully acceptable also for the atmospheric data. (C) 2002 Elsevier B.V. All rights reserved.
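For context on the oscillation (OSC) alternative discussed above: the standard two-flavor vacuum survival probability is P_ee = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E), with dm2 in eV^2, L in km and E in GeV. The sketch below evaluates this textbook formula with illustrative parameter values (it does not include the matter or NSI effects the paper analyzes):

```python
# Two-flavor vacuum survival probability
# P_ee = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km, E in GeV. Parameter values are illustrative.
import math

def survival_probability(dm2, theta, L_km, E_GeV):
    return 1 - math.sin(2 * theta) ** 2 * math.sin(1.27 * dm2 * L_km / E_GeV) ** 2

# Maximal mixing with the oscillation phase tuned to pi/2: full conversion.
p_min = survival_probability(dm2=2.5e-3, theta=math.pi / 4,
                             L_km=math.pi / 2 / (1.27 * 2.5e-3), E_GeV=1.0)
print(round(p_min, 6))  # 0.0
```

The key contrast with the NSI solution is visible in the formula itself: the OSC probability depends on E, while the NSI solution is energy-independent and leaves the spectrum undistorted.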

Relevância: 30.00%

Resumo:

Models where the dark matter component of the Universe interacts with the dark energy field have been proposed as a solution to the cosmic coincidence problem, since in the attractor regime both dark energy and dark matter scale in the same way. In these models the mass of the cold dark matter particles is a function of the dark energy field responsible for the present acceleration of the Universe, and different scenarios can be parametrized by how the mass of the cold dark matter particles evolves with time. In this article we study the impact of a constant coupling delta between dark energy and dark matter on the determination of a redshift dependent dark energy equation of state w(DE)(z) and on the dark matter density today from SNIa data. We derive an analytical expression for the luminosity distance in this case. In particular, we show that the presence of such a coupling increases the tension between the cosmic microwave background data from the analysis of the shift parameter in models with constant w(DE) and SNIa data for realistic values of the present dark matter density fraction. Thus, an independent measurement of the present dark matter density can place constraints on models with interacting dark energy.
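As an illustration of how such a coupling enters the luminosity distance, the sketch below numerically integrates d_L(z) = (1+z) * (c/H0) * integral of dz'/E(z') for a flat universe, using the common parametrization in which the coupled dark matter density dilutes as (1+z)^(3-delta). Both the parametrization and the parameter values are assumptions for illustration, not the paper's exact analytic expression:

```python
# Luminosity distance in a flat universe with constant-w dark energy and
# dark matter diluting as (1+z)^(3 - delta) due to a constant coupling delta.
# E(z) = sqrt(Om*(1+z)^(3-delta) + (1-Om)*(1+z)^(3*(1+w)))
# Parametrization and parameter values are illustrative, not the paper's.
import math

C_KM_S = 299792.458   # speed of light, km/s

def luminosity_distance(z, h0=70.0, om=0.3, w=-1.0, delta=0.0, steps=2000):
    """d_L in Mpc via trapezoidal integration of dz'/E(z')."""
    def inv_E(zp):
        return 1.0 / math.sqrt(om * (1 + zp) ** (3 - delta)
                               + (1 - om) * (1 + zp) ** (3 * (1 + w)))
    dz = z / steps
    integral = 0.5 * (inv_E(0.0) + inv_E(z)) * dz
    integral += sum(inv_E(i * dz) for i in range(1, steps)) * dz
    return (1 + z) * (C_KM_S / h0) * integral

print(round(luminosity_distance(1.0), 1))             # uncoupled baseline
print(round(luminosity_distance(1.0, delta=0.1), 1))  # with coupling delta > 0
```

A positive delta lowers the dark matter density at high redshift, so E(z) is smaller and d_L larger at fixed z, which is how the coupling shifts the SNIa fit relative to the uncoupled case.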