17 results for Statistical modeling technique
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
Intense precipitation events (IPE) have been causing great social and economic losses in the affected regions. In the Amazon, these events can have serious impacts, mainly for populations living along the banks of its countless rivers, because when water levels are high, floods and/or inundations are generally observed. Thus, the main objective of this research is to study IPE through Extreme Value Theory (EVT), estimating return periods for these events and identifying the regions of the Brazilian Amazon where IPE reach the largest values. The study used daily rainfall data from the hydrometeorological network managed by the National Water Agency (Agência Nacional de Águas) and from the Meteorological Data Bank for Education and Research (Banco de Dados Meteorológicos para Ensino e Pesquisa) of the National Institute of Meteorology (Instituto Nacional de Meteorologia), covering the period 1983-2012. First, homogeneous rainfall regions were determined through cluster analysis, using the hierarchical agglomerative Ward method. Synthetic series representing the homogeneous regions were then created. Next, EVT was applied to these series through the Generalized Extreme Value (GEV) distribution and the Generalized Pareto Distribution (GPD). The goodness of fit of these distributions was evaluated with the Kolmogorov-Smirnov test, which compares the empirical cumulative distributions with the theoretical ones. Finally, the composite technique was used to characterize the atmospheric patterns prevailing during the occurrence of IPE. The results suggest that the Brazilian Amazon has six homogeneous rainfall regions. More severe IPE are expected in the south and on the Amazon coast. More intense rainfall events are expected during the rainy or transition seasons of each sub-region, with total daily precipitation of 146.1, 143.1, and 109.4 mm (GEV) and 201.6, 209.5, and 152.4 mm (GPD), at least once a year, in the south, on the coast, and in the northwest of the Brazilian Amazon, respectively. For southern Amazonia, the composite analysis revealed that IPE are associated with the configuration and formation of the South Atlantic Convergence Zone. Along the coast, intense precipitation events are associated with mesoscale systems such as squall lines. In northwestern Amazonia, IPE appear to be associated with the Intertropical Convergence Zone and/or local convection.
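As a sketch of the extreme-value step described in this abstract, the example below fits a GEV distribution to a series of annual maxima, checks the fit with the Kolmogorov-Smirnov test, and computes return levels for a few return periods. The rainfall values are simulated placeholders, not the study's 1983-2012 series, and SciPy's genextreme parameterization stands in for whatever software the authors used.

```python
# Illustrative sketch: fit a GEV to annual maxima of daily rainfall, test goodness
# of fit with Kolmogorov-Smirnov, and estimate return levels. Placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
annual_maxima = rng.gumbel(loc=90.0, scale=25.0, size=30)  # stand-in maxima (mm/day)

# Fit the Generalized Extreme Value distribution (scipy's genextreme uses c = -xi).
c, loc, scale = stats.genextreme.fit(annual_maxima)

# Kolmogorov-Smirnov test: empirical CDF vs. fitted GEV CDF.
ks_stat, p_value = stats.kstest(annual_maxima, "genextreme", args=(c, loc, scale))
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")

# Return level for a T-year return period: the (1 - 1/T) quantile of the fitted GEV.
for T in (2, 10, 50):
    level = stats.genextreme.ppf(1.0 - 1.0 / T, c, loc, scale)
    print(f"{T}-year return level ~ {level:.1f} mm/day")
```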
Volatility analysis, price integration, and predictability for the Brazilian shrimp market
Abstract:
This work investigates the dynamics of the volatility structure of shrimp prices in the Brazilian fish market. First, the basic features of the shrimp price series were described. From this information, statistical tests were performed and univariate models were selected as price predictors. Then, the existence of a long-term equilibrium relationship between Brazilian and imported American shrimp was tested, and, where the relationship was confirmed, whether there is a causal link between these assets, considering that the two countries have maintained trade relations over the years. This is applied, exploratory research with a quantitative approach. The database was collected through direct contact with the Companhia de Entrepostos e Armazéns Gerais de São Paulo (CEAGESP) and from the official American import website of the National Marine Fisheries Service - National Oceanic and Atmospheric Administration (NMFS-NOAA). The results showed that the high variability in the asset's price is directly related to the gains and losses of market agents. The price series presents strong seasonal and biannual effects. The average shrimp price over the last 12 years was R$ 11.58, and external factors beyond production and marketing (US antidumping measures, floods, and pathologies) strongly affected prices. Among the models tested for predicting shrimp prices, four were selected which, under one-step-ahead forecasting over a 12-period horizon, proved statistically more robust. Only weak evidence of long-term equilibrium between Brazilian and American shrimp was found and, likewise, no causal link was found between them. We conclude that the pricing dynamics of the shrimp commodity are strongly influenced by external productive factors and that these phenomena cause seasonal effects in prices. There is no long-term stability relationship between Brazilian and American shrimp prices, but it is known that Brazil imports US production inputs, which indicates some productive dependence. For market agents, the risk of interference from external prices cointegrated with the Brazilian price is practically nonexistent. Through statistical modeling it is possible to minimize the risk and uncertainty embedded in the fish market, so that sales and marketing strategies for Brazilian shrimp can be consolidated and disseminated.
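A minimal sketch of the long-run equilibrium and causality checks mentioned above, using statsmodels' Engle-Granger cointegration test and Granger causality test. The two price series are simulated stand-ins, not the CEAGESP or NMFS-NOAA data, and these particular tests are an assumption about how such checks are commonly run.

```python
# Sketch: cointegration (long-run equilibrium) and Granger causality on two price series.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(0)
n = 144  # e.g. 12 years of monthly prices
brazil = np.cumsum(rng.normal(0, 0.5, n)) + 11.58   # stand-in Brazilian shrimp price
usa = np.cumsum(rng.normal(0, 0.5, n)) + 9.0        # stand-in US import price

# Engle-Granger test: null hypothesis of no cointegration.
t_stat, p_value, _ = coint(brazil, usa)
print(f"Cointegration test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Granger causality: does the US series help predict the Brazilian one?
data = pd.DataFrame({"brazil": brazil, "usa": usa})
grangercausalitytests(data[["brazil", "usa"]], maxlag=4)
```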
Abstract:
In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. This new method requires only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link the auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help choose the correct threshold. Several analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes in order to obtain an expression for the probability that a point belongs to a given class. Experiments with several values of Na and dt are performed on test sets, and the results are analyzed in order to study the robustness of the method and to derive heuristics for choosing the correct threshold. Aspects of information theory applied to the computation of the divergences are also explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also includes an appendix presenting real applications of the proposed method.
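A simplified sketch of the linking idea described above, under stated assumptions: auxiliary clusters are obtained by vector quantization (k-means here), and clusters whose centroids lie closer than a threshold distance dt are merged into one class. Plain Euclidean distance stands in for the divergence measures (including the Rényi/negentropy-based ones) discussed in the work.

```python
# Sketch: link auxiliary clusters (from vector quantization) whose distance < dt.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.cluster import KMeans

def link_auxiliary_clusters(X, Na=20, dt=1.0, random_state=0):
    km = KMeans(n_clusters=Na, n_init=10, random_state=random_state).fit(X)
    centers = km.cluster_centers_

    # Adjacency matrix: auxiliary clusters are linked if their centroid distance < dt.
    dist = squareform(pdist(centers))
    adjacency = dist < dt

    # Merge linked auxiliary clusters into final classes (connected components).
    labels = -np.ones(Na, dtype=int)
    n_classes = 0
    for i in range(Na):
        if labels[i] == -1:
            stack = [i]
            while stack:
                j = stack.pop()
                if labels[j] == -1:
                    labels[j] = n_classes
                    stack.extend(np.flatnonzero(adjacency[j]))
            n_classes += 1

    # Each data point inherits the class of its auxiliary cluster.
    return labels[km.labels_], n_classes

# Example usage: point_labels, k = link_auxiliary_clusters(some_2d_array, Na=20, dt=1.5)
```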
Abstract:
In agile methods, requirements engineering is seen as a bureaucratic activity that makes the process less agile. However, the lack of documentation in agile development environments is identified as one of the methodology's main challenges. Thus, there is a contradiction between what the agile methodology claims and what actually occurs in real environments. For example, in agile methods user stories are widely used to describe requirements. However, this way of describing requirements is still not enough, because user stories are too narrow an artifact to represent and detail the requirements. Activities such as verifying the software context and the dependencies between stories are also limited when only this artifact is used. In the context of requirements engineering, there are goal-oriented approaches that benefit requirements documentation, including requirements completeness, analysis of alternatives, and support for requirements rationale. Among these approaches, the i* modeling technique stands out, providing a graphical view of the actors involved in the system and their dependencies. This work proposes an additional resource that aims to reduce this documentation gap in agile methods. The objective is therefore to provide a graphical view of the software requirements and their relationships through i* models, thus enriching requirements in agile methods. To this end, we propose a set of heuristics to map requirements expressed as user stories into i* models. These models can be used as a form of documentation in agile environments, since by mapping to i* models the requirements are viewed more broadly and with their proper relationships to the business environment they must satisfy.
Abstract:
Drilling fluids are of fundamental importance in petroleum activities, since they are responsible for removing the cuttings, maintaining pressure and well stability (preventing collapse and fluid influx from the rock formation), and lubricating and cooling the drill bit. There are basically three types of drilling fluids: water-based, non-aqueous, and aerated. Water-based drilling fluid is widely used because it is less aggressive to the environment and provides excellent stability and inhibition (when formulated as an inhibitive fluid), among other qualities. Produced water is generated together with oil during production and has high concentrations of metals and contaminants, so it must be treated before disposal. The produced water from the Urucu-AM and Riacho da Forquilha-RN fields has high concentrations of contaminants, metals, and salts such as calcium and magnesium, complicating its treatment and disposal. Thus, the objective was to analyze the use of synthetic produced water with characteristics similar to the produced water from Urucu-AM and Riacho da Forquilha-RN to formulate a water-based drilling mud, observing the influence of varying the calcium and magnesium concentrations on filtration and rheology tests. A simple 3² factorial experimental design was used for statistical modeling of the data. The results showed that varying the calcium and magnesium concentrations did not influence the rheology of the fluid: plastic viscosity, apparent viscosity, and initial and final gels did not vary significantly. In the filtrate tests, calcium concentration had a linear influence on chloride concentration: the higher the calcium concentration, the higher the chloride concentration in the filtrate. For the fluids based on Urucu produced water, calcium concentration had a quadratic influence on filtrate volume, meaning that high calcium concentrations interfere with the effectiveness of the inhibitors used in the fluid formulation. For the fluid based on Riacho produced water, the influence of calcium on filtrate volume is linear. Magnesium concentration was significant only for chloride concentration, in a quadratic fashion, and only for the fluids based on Urucu produced water. The mud with the maximum magnesium concentration (9.411 g/L) but the minimum calcium concentration (0.733 g/L) showed good results. Therefore, produced water with a magnesium concentration of up to 9.411 g/L and a calcium concentration of up to 0.733 g/L can be used to formulate water-based drilling fluids with appropriate properties for this type of fluid.
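A hedged sketch of the 3² factorial modeling step described above: a quadratic response-surface regression in the coded calcium and magnesium levels, which is one common way to fit such a design. The response values are hypothetical, not the measured filtrate or rheology data.

```python
# Sketch: 3^2 factorial design fitted as a quadratic regression in coded factors.
import pandas as pd
import statsmodels.formula.api as smf

# Coded factor levels (-1, 0, +1) for calcium and magnesium, with a placeholder response.
data = pd.DataFrame({
    "ca": [-1, -1, -1, 0, 0, 0, 1, 1, 1],
    "mg": [-1, 0, 1, -1, 0, 1, -1, 0, 1],
    "filtrate": [6.2, 6.0, 6.3, 6.8, 6.7, 6.9, 8.1, 8.0, 8.4],  # hypothetical mL
})

# Quadratic model: y = b0 + b1*Ca + b2*Mg + b11*Ca^2 + b22*Mg^2 + b12*Ca*Mg
model = smf.ols("filtrate ~ ca + mg + I(ca**2) + I(mg**2) + ca:mg", data=data).fit()
print(model.summary())  # p-values indicate which linear/quadratic effects are significant
```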
Antecedents of the intention to use online travel reviews when choosing an accommodation
Abstract:
The Internet is present in every step of trip planning. Constant technological advances have brought major changes to the tourism industry. This is noticeable in the growing number of people who share their travel experiences on the Internet. This study aimed to analyze the factors that influence the use of Online Travel Reviews (OTR) in choosing an accommodation. It investigated the comments available on the Internet about tourism products and services, specifically about accommodations. The research sought to understand the factors influencing OTR use in the Brazilian context through the Technology Acceptance Model, Motivational Theory, Similarity, and Trustworthiness. The methodology was a descriptive-exploratory study with a quantitative approach and bibliographic research. The study used the Structural Equation Modeling technique of Partial Least Squares (PLS) to test and evaluate the proposed research model. Data were collected from 308 guests hosted in five hotels in Ponta Negra (Natal/RN) who had used OTRs when choosing an accommodation. Fifteen hypotheses were tested, of which nine were confirmed and six rejected. The results showed that guests have the attitude and intention to use OTRs when choosing an accommodation.
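For illustration only, the sketch below uses scikit-learn's PLSRegression to show the partial least squares idea of extracting latent components that relate one indicator block to another; it is not the full PLS path-modeling (PLS-SEM) procedure used in the study, and the indicator data are simulated.

```python
# Sketch: partial least squares between two simulated indicator blocks.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n = 308  # sample size reported in the abstract

# X: indicators of antecedent constructs (e.g. usefulness, ease of use, trust items).
# Y: indicators of the intention-to-use construct. Both generated from one latent factor.
latent = rng.normal(size=(n, 1))
X = latent @ rng.normal(size=(1, 9)) + rng.normal(scale=0.5, size=(n, 9))
Y = latent @ rng.normal(size=(1, 3)) + rng.normal(scale=0.5, size=(n, 3))

pls = PLSRegression(n_components=2).fit(X, Y)
print("R^2 of Y predicted from X components:", pls.score(X, Y))
print("X loadings shape:", pls.x_loadings_.shape)
```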
Abstract:
The market currently demands increasingly pure oil derivatives, and with that comes the need for new, more efficient, and economically viable methods to obtain these products. For the removal of sulfur from diesel, most refineries use a catalytic hydrogenation process, hydrodesulfurization. This process requires a high energy input, has a high production cost, and has low efficiency in removing sulfur at low concentrations (below 500 ppm). Adsorption presents itself as an efficient and economically viable alternative to the techniques currently used. Thus, the main purpose of this work is to develop and optimize new diatomite-based adsorbents, modified with microemulsions of two non-ionic surfactants, adding efficiency to the material for its application in removing the sulfur present in commercial diesel. Scanning electron microscopy (SEM), X-ray diffraction (XRD), X-ray fluorescence (XRF), thermogravimetry (TG), and N2 adsorption (BET) analyses were performed to characterize the new materials obtained. The variables used for the diatomite modification were: the microemulsion point for each surfactant (RNX 95 and UNTL 90), the microemulsion aqueous phase (with or without the salts CaCl2 and BaCl2), the contact time during modification, and the contact mode. The adsorption capacity of the materials obtained was studied using statistical modeling to evaluate the influence of the salt concentration in the aqueous phase (20 ppm to 1500 ppm), the finite-bath temperature (25 to 60 °C), and the sulfur concentration in the diesel. Temperature and sulfur concentration (300 to 1100 ppm) were the most significant parameters: increasing their values increases the ability of the modified clay to adsorb the sulfur in diesel fuel. Adsorption capacity increased from 0.43 mg/g to 1.34 mg/g with optimization of the microemulsion point and the addition of salts.
Abstract:
The biodistribution of sodium pertechnetate, the radiopharmaceutical most used in nuclear medicine, has not been studied in detail after bariatric surgery. The objective was to investigate the effect of Roux-en-Y gastric bypass (RYGB) on the biodistribution of sodium pertechnetate (Na99mTc-) in organs and tissues of rats. Methods: Twelve rats were randomly divided into two groups of 6 animals each. The RYGB group rats underwent Roux-en-Y gastric bypass, and the control group rats were not operated on. After 15 days, all rats were injected with 0.1 mL of Na99mTc- via the orbital plexus, with an average radioactivity of 0.66 MBq. After 30 minutes, liver, stomach, thyroid, heart, lung, kidney, and femur samples were harvested and weighed, and the percentage of radioactivity per gram (%ATI/g) of each organ was determined with a Perkin-Elmer Wizard gamma counter. Student's t-test was applied for statistical analysis, considering p<0.05 as significant. Results: A significant reduction in mean %ATI/g was observed in the liver, stomach, and femur of the RYGB group animals compared with the control group rats (p<0.05). In the other organs, no significant difference in %ATI/g was observed between the two groups. Conclusion: This work contributes to the knowledge that RYGB bariatric surgery modifies the biodistribution pattern of Na99mTc-.
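A minimal sketch of the statistical comparison reported above: Student's t-test on %ATI/g between the RYGB and control groups, with p<0.05 taken as significant. The values are hypothetical, not the measured biodistribution data.

```python
# Sketch: two-sample Student's t-test on %ATI/g for one organ (hypothetical values).
from scipy import stats

rygb_liver = [0.21, 0.18, 0.25, 0.19, 0.22, 0.20]     # hypothetical %ATI/g, RYGB group
control_liver = [0.35, 0.31, 0.38, 0.33, 0.36, 0.34]  # hypothetical %ATI/g, control group

t_stat, p_value = stats.ttest_ind(rygb_liver, control_liver)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 indicates a significant difference
```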
Abstract:
In recent decades the public sector has come under pressure to improve its performance. Information Technology (IT) has been an increasingly used tool for reaching that goal. Thus, determining which factors influence the acceptance and use of technology has become an important issue in public organizations, particularly in higher education institutions, since it affects the success of implementation and the desired organizational results. The Technology Acceptance Model (TAM) was used as the basis for this study and rests on the constructs perceived usefulness and perceived ease of use. However, for integrated management systems, given the complexity of their implementation, organizational factors were added in order to better explain the acceptance of such systems. Thus, five constructs related to critical success factors in implementing ERP systems were added to the TAM: top management support, communication, training, cooperation, and technological complexity (BUENO and SALMERON, 2008). Based on the foregoing, the following research problem is posed: which factors influence the acceptance and use of the SIE academic module at the Federal University of Pará, from the perception of teacher and technician users? The purpose of this study was to identify the influence of organizational factors and behavioral antecedents on the behavioral intention to use the SIE academic module at UFPA, from the perspective of teacher and technical users. This is applied, exploratory, and descriptive research with a quantitative approach, implemented as a survey; data were collected through a structured questionnaire applied to a sample of 229 teachers and 30 technical and administrative staff. Data analysis was carried out through descriptive statistics and structural equation modeling with the partial least squares (PLS) technique. The measurement model was assessed first, verifying reliability and convergent and discriminant validity for all indicators and constructs. The structural model was then analyzed using the bootstrap resampling technique. In the assessment of statistical significance, all hypotheses were supported. The coefficient of determination (R²) was high or moderate for five of the six endogenous variables, and the model explains 47.3% of the variation in behavioral intention. Among the antecedents of behavioral intention (BI) analyzed in this study, perceived usefulness is the variable with the greatest effect on behavioral intention, followed by perceived ease of use (PEU) and attitude (AT). Among the organizational aspects (critical success factors) studied, technological complexity (TC) and training (ERT) had the greatest effect on behavioral intention to use, although these effects were smaller than those produced by the behavioral factors (originating from the TAM). Top management support (TMS) showed, among all variables, the smallest effect on intention to use (BI), followed by communication (COM) and cooperation (CO), which also exert a low effect on behavioral intention (BI). Therefore, as in other studies, the TAM constructs proved adequate for the present research. Thus, the study contributed evidence that the Technology Acceptance Model can be applied to predict the acceptance of integrated management systems, even in public organizations.
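A hedged sketch of the bootstrap resampling idea used to assess significance in PLS-SEM: respondents are resampled with replacement and a path-like coefficient is re-estimated many times; a confidence interval that excludes zero indicates a significant effect. A simple OLS slope stands in for the PLS path estimate, and the data are simulated, not the questionnaire responses.

```python
# Sketch: bootstrap a coefficient to judge its significance (stand-in for PLS paths).
import numpy as np

rng = np.random.default_rng(7)
n = 259  # 229 teachers + 30 technical/administrative staff
perceived_usefulness = rng.normal(size=n)
behavioral_intention = 0.5 * perceived_usefulness + rng.normal(scale=0.8, size=n)

def coef(x, y):
    # OLS slope of y on x (with intercept removed by centering)
    x_c, y_c = x - x.mean(), y - y.mean()
    return (x_c @ y_c) / (x_c @ x_c)

boot = np.empty(5000)
for b in range(boot.size):
    idx = rng.integers(0, n, n)  # resample respondents with replacement
    boot[b] = coef(perceived_usefulness[idx], behavioral_intention[idx])

low, high = np.percentile(boot, [2.5, 97.5])
print(f"path estimate ~ {coef(perceived_usefulness, behavioral_intention):.3f}, "
      f"95% bootstrap CI [{low:.3f}, {high:.3f}]")  # CI excluding 0 => significant
```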
Abstract:
Forecasting is the basis for strategic, tactical, and operational business decisions. In financial economics, several techniques have been used over the past decades to predict asset behavior. There are many methods to assist in time series forecasting; however, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies of more advanced prediction methods. Among these, Artificial Neural Networks (ANN) are a relatively new and promising method for business forecasting, a technique that has attracted much interest in the financial environment and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to ARIMA-GARCH statistical models. In this context, this study aimed to examine whether ANNs are a more appropriate method for predicting the behavior of capital market indices than traditional time series methods. For this purpose, a quantitative study was developed from financial and economic indices, and two supervised feedforward ANN models were built, whose structures consisted of 20 inputs, 90 neurons in a single hidden layer, and one output (the Ibovespa). These models used backpropagation, a tan-sigmoid activation function, and a linear output function. Since the aim was to analyze how well the ANN method forecasts the Ibovespa, this analysis was carried out by comparing its results with those of a GARCH time series predictive model, for which a GARCH(1,1) model was developed. Once both methods (ANN and GARCH) had been applied, the results were analyzed by comparing the forecasts with the historical data and by studying the forecast errors through the MSE, RMSE, MAE, standard deviation, Theil's U, and forecast encompassing tests. The models developed with ANNs had lower MSE, RMSE, and MAE than the GARCH(1,1) model, and the Theil's U test indicated that all three models have smaller errors than a naïve forecast. Although the return-based ANN had lower precision indicator values than the price-based ANN, the forecast encompassing test rejected the hypothesis that one model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide more appropriate Ibovespa forecasts than traditional time series models, represented by the GARCH model.
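A sketch under stated assumptions of the comparison described above: a feedforward network with 20 lagged inputs, 90 tanh hidden neurons, and a linear output, evaluated with MSE/RMSE/MAE, alongside a GARCH(1,1) fit. The series is simulated rather than the Ibovespa, scikit-learn's MLPRegressor and the third-party arch package stand in for the original tools, and the Theil's U and encompassing tests are omitted.

```python
# Sketch: feedforward ANN on 20 lagged returns vs. a GARCH(1,1) benchmark.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error
from arch import arch_model  # assumed available (pip install arch)

rng = np.random.default_rng(3)
returns = rng.normal(0, 1.5, 1500)  # placeholder daily returns (%), not the Ibovespa

# Build the supervised set: 20 lagged returns -> next return.
lags = 20
X = np.array([returns[i:i + lags] for i in range(len(returns) - lags)])
y = returns[lags:]
X_train, X_test, y_train, y_test = X[:-100], X[-100:], y[:-100], y[-100:]

# 20-90-1 architecture: tanh hidden layer, linear output (MLPRegressor default).
ann = MLPRegressor(hidden_layer_sizes=(90,), activation="tanh",
                   solver="adam", max_iter=2000, random_state=0).fit(X_train, y_train)
pred = ann.predict(X_test)

mse = mean_squared_error(y_test, pred)
rmse = np.sqrt(mse)
mae = mean_absolute_error(y_test, pred)
print(f"ANN  MSE={mse:.3f} RMSE={rmse:.3f} MAE={mae:.3f}")

# GARCH(1,1) benchmark on the same training span (constant mean, GARCH volatility).
garch = arch_model(returns[:-100], mean="Constant", vol="Garch", p=1, q=1).fit(disp="off")
print(garch.params)
```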
Abstract:
In recent decades, changes in the telecommunications industry, combined with competition driven by privatization and concession policies, have stirred the world market and undeniably brought about a new reality. The effects in Brazil have become evident through significant growth rates, reaching in 2012 a net operating income of 128 billion dollars and placing the country among the five major world powers in mobile communications. In this context, an issue of increasing importance to the financial health of companies is their ability to retain their customers and turn them into loyal customers. Customer disloyalty toward operators has been generating monthly disconnection (churn) rates of about two to four percent per month, making this one of the biggest challenges for business management, since acquiring a new customer costs more than five times as much as retaining one. For this purpose, models have been developed by means of structural equation modeling to identify the relationships between the various determinants of customer loyalty in the context of services. The original contribution of this thesis is to develop a loyalty model by identifying the relationships between determinants of satisfaction (latent variables) and including attributes that determine perceptions of service quality in the mobile communications industry, such as quality, satisfaction, value, trust, expectation, and loyalty. It is a qualitative research study to be conducted with operators' customers through simple random sampling, using structured questionnaires. As a result, the proposed model and its statistical evaluation should enable operators to conclude that customer loyalty is directly influenced by the technical and operational quality of the services offered, as well as provide a satisfaction index for the mobile communications segment.
Abstract:
The aim of this study was to compare the castability of Co-Cr-Mo-W (Remanium 2000), Ni-Cr (Durabond), and Co-Cr-Mo (Vera-PDI) alloys invested in phosphate-bonded or silica-bonded investments, or using a mixed technique. A square nylon mesh (10 x 10 mm) with 100 open spaces served as the model for constructing wax patterns, which were invested in silica-bonded investment, phosphate-bonded investment, or the mixed technique (a 2 mm thick layer of phosphate-bonded investment + silica-bonded investment). Forty-five specimens (5 for each experimental condition) were cast under a gas-oxygen flame and then sandblasted with aluminum oxide. The number of completely cast segments was counted to obtain a percentage designated the "castability value", representing the alloy's accuracy in reproducing mold details. Statistical analysis with two-way ANOVA and Tukey's test showed that, comparing the alloys, Remanium 2000 had castability statistically similar (p>0.05) to that of Vera PDI and lower than that of Durabond (p<0.05). Considering the results of the mixed technique, the Remanium 2000 alloy had a lower castability value (p<0.05) than the Durabond and Vera PDI alloys, which showed statistically similar values to each other (p>0.05). In conclusion, the castability of the Co-Cr-Mo-W alloy (Remanium 2000) was comparable to that of the Co-Cr alloy (Vera PDI) and lower than that of the Ni-Cr alloy (Durabond). With the exception of the Remanium 2000 alloy, the mixed investment technique considerably increased the ability of the tested alloys to reproduce mold details compared with the phosphate-bonded investment technique. The mixed investment technique is an alternative for improving the castability of base metal alloys without affecting the surface quality of the castings.
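A hedged sketch of the analysis described above: a two-way ANOVA on castability values by alloy and investment technique, followed by Tukey's test across alloys. The castability percentages are hypothetical placeholders, not the study's measurements.

```python
# Sketch: two-way ANOVA (alloy x investment technique) plus Tukey HSD on alloys.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multicomp import pairwise_tukeyhsd

data = pd.DataFrame({
    "alloy": ["Remanium", "Durabond", "VeraPDI"] * 6,
    "technique": (["phosphate"] * 3 + ["silica"] * 3 + ["mixed"] * 3) * 2,
    "castability": [55, 80, 58, 62, 85, 60, 60, 95, 92,
                    53, 78, 57, 64, 83, 61, 58, 96, 90],  # hypothetical %
})

# Two-way ANOVA with interaction between alloy and investment technique.
model = smf.ols("castability ~ C(alloy) * C(technique)", data=data).fit()
print(anova_lm(model, typ=2))

# Tukey HSD pairwise comparison between alloys.
print(pairwise_tukeyhsd(data["castability"], data["alloy"]))
```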