958 results for Data Generation
Abstract:
Foreign direct investment has been one of the most important factors in the economic growth of developing countries, since it helps finance the current account deficit, in particular the trade balance. At a more microeconomic level, it is a strong generator of employment and brings important technological advances, enabling the sharing of technological knowledge and exposure to new forms of management and marketing. The main objective of this work is to identify potential variables that can serve as leading indicators for foreign direct investment, in order to anticipate possible trends in its evolution. To this end, Vector Autoregressive (VAR) models and Granger causality tests were applied to monthly data for the period from January 1996 to September 2010. Essentially macroeconomic variables were considered, from both the host economy and the investor countries, so as to reflect economic activity over the study period.
Abstract:
The nutritional status of 756 schoolchildren from 5 low-income state schools and one private school in the same part of Rio de Janeiro, Brazil, was assessed using anthropometric data. The prevalence of stunting and wasting (cut-off points: <90% height-for-age and <80% weight-for-height) ranged in the public schools from 6.2 to 15.2% and from 3.3 to 24.0%, respectively, whereas the figures for the private school were 2.3 and 3.5%. Obesity was much more prevalent in the private school (18.0%) than in the state schools (0.8 - 6.2%). Nutritional problems appear to become more severe with the increasing age of the children. It therefore appears advisable to assess schoolchildren within the context of a nutritional surveillance system.
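The cut-off points quoted above (<90% height-for-age for stunting, <80% weight-for-height for wasting) amount to a simple classification rule. A minimal sketch follows; the function name is illustrative and the obesity threshold is an assumption, since the abstract does not state the cut-off it used:

```python
def classify(ht_pct_of_ref, wt_for_ht_pct_of_ref, obesity_cutoff=120.0):
    """Classify nutritional status from percent-of-reference anthropometry.

    ht_pct_of_ref:        height-for-age as % of the reference median
    wt_for_ht_pct_of_ref: weight-for-height as % of the reference median
    """
    status = []
    if ht_pct_of_ref < 90.0:            # stunting cut-off used in the study
        status.append("stunted")
    if wt_for_ht_pct_of_ref < 80.0:     # wasting cut-off used in the study
        status.append("wasted")
    elif wt_for_ht_pct_of_ref > obesity_cutoff:  # obesity threshold: assumed value
        status.append("obese")
    return status or ["normal"]

print(classify(85.0, 75.0))   # both stunted and wasted
print(classify(95.0, 130.0))  # obese
```

Note that stunting and wasting are assessed on separate axes, so a child can be flagged on both at once, as the first call shows.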
Abstract:
Lossless compression algorithms of the Lempel-Ziv (LZ) family are widely used nowadays. Regarding time and memory requirements, LZ encoding is much more demanding than decoding. In order to speed up the encoding process, efficient data structures, like suffix trees, have been used. In this paper, we explore the use of suffix arrays to hold the dictionary of the LZ encoder, and propose an algorithm to search over it. We show that the resulting encoder attains roughly the same compression ratios as those based on suffix trees. However, the amount of memory required by the suffix array is fixed, and much lower than the variable amount of memory used by encoders based on suffix trees (which depends on the text to encode). We conclude that suffix arrays, when compared to suffix trees in terms of the trade-off among time, memory, and compression ratio, may be preferable in scenarios (e.g., embedded systems) where memory is at a premium and high speed is not critical.
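The suffix-array dictionary search described above can be sketched in a few lines. This is a minimal illustration of the idea only, not the paper's algorithm: a naive sorted-suffix construction plus a binary search that locates the longest prefix of the lookahead buffer inside the dictionary text:

```python
def suffix_array(text):
    # Naive O(n^2 log n) construction: fine for a sketch, not for production
    return sorted(range(len(text)), key=lambda i: text[i:])

def longest_match(text, sa, pattern):
    """Return (position, length) of the longest prefix of `pattern`
    occurring in `text`, via binary search over the suffix array `sa`."""
    best_pos, best_len = -1, 0
    lo, hi = 0, len(sa)
    # Find the insertion point of `pattern` among the sorted suffixes
    while lo < hi:
        mid = (lo + hi) // 2
        if text[sa[mid]:] < pattern:
            lo = mid + 1
        else:
            hi = mid
    # The suffix with the longest common prefix is adjacent to that point
    for i in (lo - 1, lo):
        if 0 <= i < len(sa):
            s = text[sa[i]:]
            k = 0
            while k < len(s) and k < len(pattern) and s[k] == pattern[k]:
                k += 1
            if k > best_len:
                best_pos, best_len = sa[i], k
    return best_pos, best_len

text = "abracadabra"
sa = suffix_array(text)
print(longest_match(text, sa, "abraz"))  # "abra" matches at position 0: (0, 4)
```

The search cost is O(m log n) string comparisons for a pattern of length m, and the index itself is a flat integer array of fixed size, which is the memory advantage over suffix trees noted in the abstract.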
Abstract:
Fluorescent protein microscopy imaging is nowadays one of the most important tools in biomedical research. However, the resulting images present a low signal-to-noise ratio and a time intensity decay due to the photobleaching effect. This phenomenon is a consequence of the decrease in the radiation emission efficiency of the tagging protein, which occurs because the fluorophore permanently loses its ability to fluoresce, due to photochemical reactions induced by the incident light. The Poisson multiplicative noise that corrupts these images, together with the quality degradation caused by photobleaching, makes long-term biological observation very difficult. In this paper a denoising algorithm for Poisson data, in which the photobleaching effect is explicitly taken into account, is described. The algorithm is designed in a Bayesian framework where the data fidelity term models the Poisson noise generation process as well as the exponential intensity decay caused by photobleaching. The prior term is built with Gibbs priors and log-Euclidean potential functions, suitable to cope with the positivity-constrained nature of the parameters to be estimated. Monte Carlo tests with synthetic data are presented to characterize the performance of the algorithm, and one example with real data is included to illustrate its application.
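The observation model described above, Poisson counts whose mean decays exponentially over time due to photobleaching, is easy to simulate, which is essentially how the Monte Carlo test data would be generated. The intensity and decay rate below are illustrative values, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_photobleached_sequence(intensity, decay_rate, n_frames):
    """Simulate the observation model: at frame t the Poisson mean is the
    true intensity attenuated by exp(-decay_rate * t) (photobleaching)."""
    frames = []
    for t in range(n_frames):
        mean = intensity * np.exp(-decay_rate * t)
        frames.append(rng.poisson(mean))
    return np.stack(frames)

truth = np.full((8, 8), 50.0)   # flat synthetic "fluorescence" image
seq = simulate_photobleached_sequence(truth, 0.1, 20)
# Empirical frame means decay roughly as 50 * exp(-0.1 * t)
print(seq[0].mean(), seq[-1].mean())
```

A denoiser of the kind described would invert this generative process, jointly estimating the underlying intensity and the decay while respecting the positivity of both.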
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated from past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it is useful to design an architecture where the SI can be generated more robustly at the block level, avoiding SI frame regions with lower correlation, which are largely responsible for coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second corresponds to a motion compensated quality enhancement (MCQE) technique in which a low quality Intra block sent by the encoder is used to generate the SI by motion estimation with the help of the reference frames. For blocks where MCI produces SI with lower correlation, the novel MCQE mode can be advantageous from the rate-distortion point of view, even if some rate has to be invested in the low quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
Abstract:
Opposite enantiomers exhibit different NMR properties in the presence of an external common chiral element, and a chiral molecule exhibits different NMR properties in the presence of external enantiomeric chiral elements. Automatic prediction of such differences, and comparison with experimental values, leads to the assignment of the absolute configuration. Here two cases are reported, one using a dataset of 80 chiral secondary alcohols esterified with (R)-MTPA and the corresponding 1H NMR chemical shifts, and the other with 94 13C NMR chemical shifts of chiral secondary alcohols in two enantiomeric chiral solvents. For the first application, counterpropagation neural networks were trained to predict the sign of the difference between the chemical shifts of opposite stereoisomers. The neural networks were trained to process the chirality code of the alcohol as the input, and to give the NMR property as the output. In the second application, similar neural networks were employed, but the property to predict was the difference of chemical shifts in the two enantiomeric solvents. For independent test sets of 20 objects, 100% correct predictions were obtained in both applications concerning the sign of the chemical shift differences. Additionally, with the second dataset, the difference of chemical shifts in the two enantiomeric solvents was quantitatively predicted, yielding r² = 0.936 between predicted and experimental values for the test set.
Abstract:
Breast cancer and colorectal cancer are two of the leading causes of death worldwide. Between 5 and 10% of these cases are associated with germline/hereditary variants in cancer susceptibility genes. The aim of this work was to validate the use of next-generation sequencing (NGS) to identify variants previously detected by the Sanger method in several breast and colorectal cancer susceptibility genes. Sixty-four DNA samples from patients with clinical suspicion of hereditary predisposition to breast or colorectal cancer were sequenced by NGS, using the TruSight Cancer sequencing panel on the MiSeq platform (Illumina). These samples had previously been sequenced by the Sanger method for the BRCA1, BRCA2, TP53, APC, MUTYH, MLH1, MSH2 and STK11 genes. Bioinformatic analysis of the results was performed with the MiSeq Reporter, VariantStudio and Isaac Enrichment software (Illumina) and the Integrative Genomics Viewer (Broad Institute). NGS showed high analytical sensitivity and specificity for the detection of sequence variants in 8 breast and colorectal cancer susceptibility genes, since it identified all 412 variants (93 unique, including 27 pathogenic variants) previously detected by the Sanger method. The use of NGS cancer predisposition gene panels enables a more comprehensive, faster and more cost-efficient molecular diagnosis than conventional methodologies.
Abstract:
This paper presents an investigation of cloud-to-ground lightning activity over the continental territory of Portugal, using data collected by the national Lightning Location System. The Lightning Location System in Portugal is first presented. Analyses of the geographical, seasonal, and polarity distributions of cloud-to-ground lightning activity and of the cumulative probability of peak current are then carried out. An overall ground flash density map is constructed from the database, which covers more than five years and almost four million records. This map is compared with the thunderstorm days map produced by the Portuguese Institute of Meteorology and with the orographic map of Portugal. Finally, conclusions are drawn.
Abstract:
We present a study of the magnetic properties of a group of basalt samples from the Saldanha Massif (Mid-Atlantic Ridge - MAR - 36° 33′ 54″ N, 33° 26′ W), and we set out to interpret these properties in the tectono-magmatic framework of this sector of the MAR. Most samples have low magnetic anisotropy and magnetic minerals of single domain grain size, typical of rapid cooling. The thermomagnetic study mostly shows two different susceptibility peaks. The high temperature peak is related to mineralogical alteration due to heating. The low temperature peak shows a distinction between three different stages of low temperature oxidation: the presence of titanomagnetite, titanomagnetite and titanomaghemite, and exclusively of titanomaghemite. Based on established empirical relationships between Curie temperature and degree of oxidation, the latter is tentatively deduced for all samples. Finally, swath bathymetry and sidescan sonar data combined with dive observations show that the Saldanha Massif is located over an exposed section of upper mantle rocks interpreted to be the result of detachment tectonics. Basalt samples inside the detachment zone often have higher than expected oxidation rates; this effect can be explained by the higher permeability caused by the detachment fault activity.
Abstract:
The 27 December 1722 Algarve earthquake destroyed a large area in southern Portugal, generating a local tsunami that inundated the shallow areas of Tavira. It is unclear whether its source was located onshore or offshore and, in either case, which tectonic structure was responsible for the event. We analyze the available historical information concerning macroseismicity and the tsunami to discuss the most probable location of the source. We also review the available seismotectonic knowledge of the offshore region close to the probable epicenter, selecting a set of four candidate sources. We simulate the tsunamis produced by these candidate sources assuming that the sea bottom displacement is caused by a compressive dislocation over a rectangular fault, as given by the half-space homogeneous elastic approach, and we use numerical modeling to study wave propagation and run-up. We conclude that the 27 December 1722 Tavira earthquake and tsunami was probably generated offshore, close to 37° 01′ N, 7° 49′ W.
Abstract:
Although stock prices fluctuate, the variations are relatively small and are frequently assumed to be normally distributed on a large time scale. Sometimes, however, these fluctuations become determinant, especially when unforeseen large drops in asset prices are observed, which can result in huge losses or even market crashes. The evidence shows that such events happen far more often than would be expected under the common assumption of normally distributed financial returns. It is thus crucial to model the distribution tails properly, so as to be able to predict the frequency and magnitude of extreme stock price returns. In this paper we follow the approach suggested by McNeil and Frey (2000) and combine GARCH-type models with Extreme Value Theory (EVT) to estimate the tails of three financial index return series (DJI, FTSE 100, and NIKKEI 225), representing three important financial areas of the world. Our results indicate that EVT-based conditional quantile estimates are much more accurate than those from conventional AR-GARCH models assuming normal or Student's t-distributed innovations in out-of-sample estimation (in-sample, this holds for the right tail of the return distribution).
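The EVT step behind such tail estimates is usually a peaks-over-threshold fit: exceedances over a high threshold are modeled with a generalized Pareto distribution (GPD), from which high quantiles follow in closed form. A minimal sketch of that step, where synthetic Student-t draws stand in for the GARCH-standardized residuals and the 95% threshold choice is illustrative:

```python
import numpy as np
from scipy.stats import genpareto, t as student_t

rng = np.random.default_rng(42)

# Heavy-tailed synthetic "losses" stand in for the standardized residuals
losses = -student_t.rvs(df=4, size=5000, random_state=rng)

# Peaks-over-threshold: fit a GPD to exceedances above a high threshold u
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0.0)  # shape xi, scale beta

def evt_var(p, n=len(losses), n_u=len(exceedances)):
    """GPD-based quantile (Value-at-Risk) for tail probability p < n_u/n."""
    return u + (beta / xi) * ((n / n_u * p) ** (-xi) - 1.0)

print("99.5% VaR (EVT):      ", evt_var(0.005))
print("99.5% VaR (empirical):", np.quantile(losses, 0.995))
```

In the conditional (McNeil-Frey) setting, this quantile would then be rescaled by the GARCH volatility forecast to obtain the one-step-ahead conditional VaR.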
Abstract:
Advisor: Prof. Dr. João Domingues Costa
Abstract:
The main purpose of this study was to examine the applicability of geostatistical modeling to obtain valuable information for assessing the environmental impact of sewage outfall discharges. The data set was obtained with an AUV in a monitoring campaign at the S. Jacinto outfall, located off the Portuguese west coast near the Aveiro region. Matheron's classical estimator was used to compute the experimental semivariogram, which was fitted to three theoretical models: spherical, exponential, and Gaussian. A cross-validation procedure was used to select the best semivariogram model, and ordinary kriging was then applied to predict salinity at unsampled locations. The generated map clearly shows the plume dispersion in the studied area, indicating that the effluent does not reach the nearby beaches. Our study suggests that a sampling trajectory for the AUV that is optimal from a geostatistical prediction point of view can help to compute more precise predictions and hence to quantify dilution more accurately. Moreover, since accurate measurements of plume dilution are rare, such studies may be very helpful in the future for the validation of dispersion models.
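Matheron's classical estimator and one of the theoretical models mentioned above (the spherical model) can be sketched in a few lines. The coordinates and values below are synthetic stand-ins for the AUV salinity measurements, and the model parameters in the final call are illustrative:

```python
import numpy as np

def matheron_semivariogram(coords, values, lags, tol):
    """Matheron's classical estimator: for each lag h, half the mean of the
    squared value differences over point pairs separated by roughly h."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)                        # each pair counted once
    dists = d[iu]
    sqdiff = (values[:, None] - values[None, :])[iu] ** 2
    gamma = []
    for h in lags:
        mask = np.abs(dists - h) <= tol                 # pairs in the lag band
        gamma.append(0.5 * sqdiff[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

def spherical_model(h, nugget, sill, a):
    """Spherical semivariogram: rises from the nugget to the sill at range a."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, sill, g)

# Tiny synthetic demo: spatially correlated values plus noise
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(200, 2))
values = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(200)
lags = np.arange(0.5, 5.0, 0.5)
print(matheron_semivariogram(coords, values, lags, tol=0.25))
print(spherical_model(lags, 0.02, 0.5, 3.0))  # illustrative parameter values
```

In the study, the fitted model chosen by cross-validation would then supply the covariances needed to solve the ordinary kriging system at each unsampled location.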
Abstract:
Business Intelligence (BI) is an emergent area of the Decision Support Systems (DSS) discipline. In recent years, the evolution in this area has been considerable. Similarly, the Data Mining (DM) field has seen substantial growth and consolidation. DM has been used successfully in BI systems, but a true integration of DM with BI is still lacking, and many BI systems therefore fail to use DM effectively. An architecture intended to lead to an effective use of DM in BI is presented.
Abstract:
A Blumlein line is a particular Pulse Forming Line (PFL) configuration that allows the generation of high-voltage sub-microsecond square pulses, with the same voltage amplitude as the dc charging voltage, into a matched load. By stacking n Blumlein lines one can, in theory, multiply the input dc charging voltage amplitude by n. In order to understand the operating behavior of this electromagnetic system and to further optimize its operation, it is essential to model it theoretically, that is, to calculate the voltage amplitude at each circuit point and the instant at which it occurs. To do so, one needs to define the reflection and transmission coefficients at each impedance discontinuity. Experimental results from a fast solid-state switch discharging a three-stage Blumlein stack are compared with theoretical predictions.
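The reflection and transmission coefficients mentioned above follow directly from the characteristic impedances on either side of a discontinuity. A minimal sketch (the impedance values in the usage lines are illustrative):

```python
def reflection_transmission(z1, z2):
    """Voltage reflection and transmission coefficients for a wave
    travelling from a line of impedance z1 into a line of impedance z2."""
    gamma = (z2 - z1) / (z2 + z1)   # reflection coefficient
    tau = 1.0 + gamma               # transmission coefficient
    return gamma, tau

# Matched junction: no reflection, full transmission
print(reflection_transmission(50.0, 50.0))   # (0.0, 1.0)
# Near-open termination (very large z2): gamma approaches +1
print(reflection_transmission(50.0, 5e6))
```

Tracking these coefficients at every junction of the stacked lines, together with the two-way transit time of each line section, is what yields the voltage amplitude and timing at each circuit point in the theoretical model.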