Abstract:
The evolution of integrated circuit technologies demands the development of new CAD tools. The traditional development of digital circuits at the physical level is based on cell libraries. These libraries offer a certain predictability of the electrical behavior of the design, due to the previous characterization of the cells. In addition, different versions of each cell are required so that delay and power consumption characteristics are taken into account, which increases the number of cells in a library. Automatic full-custom layout generation is an increasingly important alternative to cell-based approaches. This strategy implements transistors and connections according to patterns defined by algorithms, so it is possible to implement any logic function while avoiding the limitations of a cell library. Analysis and estimation tools must provide predictability for automatic full-custom layouts; they must be able to work with layout estimates and to generate information on delay, power consumption and area occupation. This work comprises research on new physical synthesis methods and the implementation of an automatic layout generator in which the cells are created at the moment of layout synthesis. The research investigates different strategies for the placement of layout elements (transistors, contacts and connections) and their effects on area occupation and circuit delay. The presented layout strategy applies delay optimization through integration with a gate sizing technique, in which the folding method allows individual discrete sizing of transistors. The main characteristics of the proposed strategy are: power supply lines between rows, routing over the layout (channel routing is not used), circuit routing performed before layout generation, and layout generation targeting delay reduction through the application of the sizing technique. The possibility of implementing any logic function, without the restrictions imposed by a cell library, allows circuit synthesis with optimization of the number of transistors. This reduction in the number of transistors decreases delay and power consumption, mainly the static power consumption in submicrometer circuits. Comparisons between the proposed strategy and other well-known methods are presented in order to validate the proposed method.
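As an illustration of the folding idea mentioned above, the minimal sketch below splits a transistor whose width was chosen by the gate-sizing step into equal legs that fit the free diffusion height of a row, snapping each leg to a discrete layout grid. The function name, grid step and row limit are hypothetical assumptions for the example, not the actual algorithm of the tool described in the abstract.

```python
# Minimal sketch of discrete transistor sizing via folding, assuming a
# row-based layout where each transistor leg may not exceed the free
# diffusion height of the row. All names and parameters are illustrative.
import math

def fold_transistor(target_width_um: float,
                    max_leg_width_um: float,
                    width_grid_um: float = 0.01):
    """Split a sized transistor into equal legs that fit the row height.

    Returns (number_of_legs, width_per_leg), with the leg width snapped
    up to the layout grid so the folded device is never narrower than
    the width requested by the gate-sizing step."""
    legs = max(1, math.ceil(target_width_um / max_leg_width_um))
    leg_width = math.ceil(target_width_um / legs / width_grid_um) * width_grid_um
    return legs, round(leg_width, 4)

# Example: a 7.3 um transistor in a row that accepts at most 2.0 um per leg
print(fold_transistor(7.3, 2.0))   # -> (4, 1.83)
```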
Abstract:
This study aims to contribute to the forecasting literature on stock returns for emerging markets. We use Autometrics to select relevant predictors among macroeconomic, microeconomic and technical variables. We develop predictive models for the Brazilian market premium, measured as the excess return over the Selic interest rate, and for Itaú SA, Itaú-Unibanco and Bradesco stock returns. We find that, for the market premium, an ADL with error correction is able to outperform the benchmarks in terms of economic performance. For individual stock returns, there is a trade-off between the statistical properties and the out-of-sample performance of the model.
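As a rough illustration of the model class referred to above, the sketch below estimates an ADL(1,1) regression with an error-correction term by ordinary least squares on simulated data. The cointegrating vector (1, -1), the lag orders and the data are assumptions made for the example; the Autometrics (general-to-specific) selection step itself is not reproduced.

```python
# Hedged sketch of an ADL with error correction, fitted by OLS on toy data.
import numpy as np
import pandas as pd

def fit_adl_ec(y: pd.Series, x: pd.Series):
    """Regress dy_t on dx_t and the lagged equilibrium error (y - x)_{t-1}."""
    df = pd.DataFrame({
        "dy": y.diff(),
        "dx": x.diff(),
        "ec": (y - x).shift(1),   # error-correction term, cointegrating vector (1, -1) assumed
    }).dropna()
    X = np.column_stack([np.ones(len(df)), df["dx"], df["ec"]])
    beta, *_ = np.linalg.lstsq(X, df["dy"].values, rcond=None)
    return beta                    # [constant, short-run effect, adjustment speed]

# Toy usage with a simulated return series and a cointegrated predictor
rng = np.random.default_rng(0)
x = pd.Series(np.cumsum(rng.normal(size=200)))
y = x + pd.Series(rng.normal(scale=0.5, size=200))
print(fit_adl_ec(y, x))
```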
Abstract:
The synthetic control (SC) method has recently been proposed as an alternative method to estimate treatment effects in comparative case studies. Abadie et al. [2010] and Abadie et al. [2015] argue that one of the advantages of the SC method is that it imposes a data-driven process to select the comparison units, providing more transparency and less discretionary power to the researcher. However, an important limitation of the SC method is that it does not provide clear guidance on the choice of predictor variables used to estimate the SC weights. We show that this lack of specific guidance provides significant opportunities for the researcher to search for specifications with statistically significant results, undermining one of the main advantages of the method. Considering six alternative specifications commonly used in SC applications, we calculate in Monte Carlo simulations the probability of finding a statistically significant result at 5% in at least one specification. We find that this probability can be as high as 13% (23% for a 10% significance test) when there are 12 pre-intervention periods, and that it decays slowly with the number of pre-intervention periods: with 230 pre-intervention periods, this probability is still around 10% (18% for a 10% significance test). We show that the specification that uses the average pre-treatment outcome values to estimate the weights performed particularly badly in our simulations. However, the specification-searching problem remains relevant even when we do not consider this specification. We also show that this specification-searching problem is relevant in simulations with real datasets looking at placebo interventions in the Current Population Survey (CPS). In order to mitigate this problem, we propose a criterion to select among different SC specifications based on the prediction error of each specification in placebo estimations.
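The sketch below illustrates, under simplifying assumptions, the two building blocks discussed above: synthetic-control weights estimated for alternative predictor specifications (all pre-treatment outcome values versus their average) and a comparison of the resulting pre-treatment prediction errors. The data are simulated and the criterion shown is only the pre-treatment MSPE of a single treated unit, not the paper's full placebo-based procedure.

```python
# Hedged sketch: SC weights for two common specifications, compared by
# pre-treatment prediction error. Data and specification set are toy choices.
import numpy as np
from scipy.optimize import minimize

def sc_weights(x_treated, X_donors):
    """Nonnegative donor weights summing to one that best match the treated
    unit's predictors (X_donors has one column per donor unit)."""
    n = X_donors.shape[1]
    obj = lambda w: np.sum((x_treated - X_donors @ w) ** 2)
    res = minimize(obj, np.full(n, 1.0 / n), method="SLSQP",
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x

def pretreatment_mspe(y_treated, Y_donors, w):
    """Mean squared prediction error of the synthetic unit over the pre-period."""
    return np.mean((y_treated - Y_donors @ w) ** 2)

# Toy data: 12 pre-intervention periods, 20 donor units
rng = np.random.default_rng(1)
Y_donors = rng.normal(size=(12, 20))
y_treated = Y_donors @ rng.dirichlet(np.ones(20)) + rng.normal(scale=0.1, size=12)

specs = {
    "all pre-treatment outcomes": (y_treated, Y_donors),
    "average pre-treatment outcome": (y_treated.mean(keepdims=True),
                                      Y_donors.mean(axis=0, keepdims=True)),
}
for name, (x_tr, X_don) in specs.items():
    w = sc_weights(x_tr, X_don)
    print(name, "-> pre-treatment MSPE:",
          round(pretreatment_mspe(y_treated, Y_donors, w), 4))
```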
Abstract:
A key to maintaining an enterprise's competitiveness is the ability to describe, standardize, and adapt the way it reacts to certain types of business events and how it interacts with suppliers, partners, competitors, and customers. In this context, the field of organization modeling has emerged with the aim of creating models that help to establish a state of self-awareness in the organization. This project's context is the use of the Semantic Web in the organizational modeling area: the advantages of Semantic Web technology can be used to improve the way organizations are modeled. This was accomplished by using a semantic wiki to model organizations. Our research and implementation had two main purposes: the formalization of textual content in semantic wiki pages, and the automatic generation of diagrams from organization data stored in the semantic wiki pages.
Abstract:
Nowadays, more than half of computer development projects fail to meet the final users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proved to be a well-defined method to specify, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is independent of the implementation platform, lacking the ability to save and propagate changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as the basis for a wiki system that allows the modeling of any organization, independently of its implementation, as well as the aforementioned change propagation in a runtime environment. Based on DEMO and the UEAOM, this project aims to develop efficient and standardized methods to enable the automatic conversion of DEMO ontological models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models with clear, unambiguous semantics, in order to facilitate the creation of processes that are almost ready to be executed on workflow systems that support BPMN.
Abstract:
On a hillslope segment with a sandstone substrate in contact with basalt, a setting that is regionally very common, the aim was not only to relate the geomorphic surfaces to the physical, chemical and mineralogical attributes of the Latosols (Oxisols) found on them, but also to test geostatistical methods for locating the boundaries of these surfaces. Using geomorphological criteria, three surfaces were identified and topographically characterized. Soils were sampled at regular 25 m intervals, at a depth of 0.6 to 0.8 m (top of the B horizon), along a 1,700 m transect totaling 109 points. The samples were analyzed for particle density, particle size distribution, soil CEC, clay CEC, total clay Fe (H2SO4 digestion) and free Fe oxides (selective dissolution). The deferrified clay fraction was analyzed by X-ray diffraction. Based on the stratigraphy and variations in the local relief, three geomorphic surfaces were identified and differentiated in the field. The altimetric profile and the digital elevation model of the terrain were also analyzed. The three different surfaces were found to be well related to the physical, chemical and mineralogical attributes of their respective soils. On the lower part of the slope, the youngest surface, over basalt and on a typical eutroferric Red Latosol, the greatest variability in slope gradient, clay and Fe was found. Variations in terrain slope, when analyzed systematically by split moving windows dissimilarity analysis, showed that this statistical method can be used to help locate the boundaries between geomorphic surfaces. The soil variations along the transect and its surroundings were related to age, terrain slope and lithology. The detailed geomorphic work provided important information to support soil survey and pedogenesis studies.
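The following minimal sketch illustrates the split moving windows idea used above: a window slides along the transect, is split into two halves, and a dissimilarity between the halves is computed at each centre point, with peaks suggesting boundaries between geomorphic surfaces. The window size, the dissimilarity measure and the simulated slope profile are illustrative assumptions, not the study's data or exact procedure.

```python
# Hedged sketch of split moving windows dissimilarity analysis on a toy transect.
import numpy as np

def split_moving_window(values: np.ndarray, window: int = 10) -> np.ndarray:
    """Dissimilarity between the two halves of a window centred at each point;
    high values suggest a boundary in the underlying property."""
    half = window // 2
    diss = np.full(len(values), np.nan)
    for c in range(half, len(values) - half):
        left = values[c - half:c]
        right = values[c:c + half]
        diss[c] = (left.mean() - right.mean()) ** 2 / (left.var() + right.var() + 1e-9)
    return diss

# Toy transect: 109 points with a change in mean slope at point 60
rng = np.random.default_rng(2)
slope = np.r_[rng.normal(3.0, 0.5, 60), rng.normal(6.0, 0.5, 49)]
diss = split_moving_window(slope, window=10)
print("suggested boundary near point", int(np.nanargmax(diss)))
```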
Abstract:
The objective of this experiment was to investigate the effects of different particle sizes of corn, expressed as Geometric Mean Diameter (GMD: 0.336 mm, 0.585 mm, 0.856 mm and 1.12 mm), in mash and pelleted broiler chicken diets on the weight of the gizzard, duodenum and jejunum + ileum; on the pH of the gizzard and small intestine; and on the characteristics of the duodenal mucous layer (number and height of villi and crypt depth) in 42-day-old broilers. The physical form and the particle size of the diet had no significant effect on gizzard and intestine pH (p > 0.05). A greater gizzard weight was seen in the birds receiving the pelleted diet with a particle size of 0.336 mm (p < 0.008). However, for particle sizes of 0.856 and 1.12 mm, a greater weight was found in birds that received the mash diet (p < 0.039 and p < 0.006, respectively). Gizzard weight also increased with corn GMD, independently of the physical form of the diet. In the mash diet, the increase in particle size promoted a quadratic response in the weight of the duodenum and jejunum + ileum. The pelleted diet promoted a greater number of villi per transverse duodenum cut (p < 0.007) and greater crypt depth (p < 0.05). As the particle size increased, there was a linear increase in villus height and crypt depth in the duodenum, irrespective of the physical form of the diet.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Image restoration attempts to enhance images corrupted by noise and blurring effects. Iterative approaches can better control the restoration algorithm in order to find a compromise between restoring fine detail in smoothed regions and not amplifying the noise. Techniques based on Projections Onto Convex Sets (POCS) have been extensively used in the context of image restoration by projecting the solution onto hyperspaces until some convergence criterion is reached. It is expected that an enhanced image can be obtained at the end of an unknown number of projections. The number of convex sets and their combinations allow the design of several image restoration algorithms based on POCS. Here, we address two convex sets: Row-Action Projections (RAP) and Limited Amplitude (LA). Although RAP and LA have already been used in the image restoration domain, the former has a relaxation parameter (λ) that strongly depends on the characteristics of the image to be restored, i.e., wrong values of λ can lead to poor restoration results. In this paper, we propose a hybrid Particle Swarm Optimization (PSO)-POCS image restoration algorithm, in which the λ value is obtained by PSO and then used to restore images by the POCS approach. Results showed that the proposed PSO-based restoration algorithm outperformed the widely used Wiener and Richardson-Lucy image restoration algorithms. © 2010 Elsevier B.V. All rights reserved.
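A hedged sketch of the idea described above is given below: POCS restoration alternating a Row-Action Projection (RAP) with relaxation parameter λ and a Limited Amplitude (LA) projection, with a tiny particle swarm searching for λ. The 1-D blur model, the residual used as the PSO fitness and all swarm settings are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: RAP + LA projections (POCS) with lambda tuned by a minimal PSO.
import numpy as np

rng = np.random.default_rng(3)

def restore(H, g, lam, sweeps=5, lo=0.0, hi=255.0):
    """POCS restoration: a RAP sweep over every row, then clip amplitudes (LA)."""
    x = np.zeros(H.shape[1])
    for _ in range(sweeps):
        for i in range(H.shape[0]):                      # RAP convex set
            row = H[i]
            x = x + lam * (g[i] - row @ x) / (row @ row) * row
        x = np.clip(x, lo, hi)                           # LA convex set
    return x

def fitness(lam, H, g):
    x = restore(H, g, lam)
    return np.sum((H @ x - g) ** 2)                      # residual as a quality proxy

# Toy degradation: a 1-D signal blurred by a 3-point moving average plus noise
n = 64
true = np.clip(np.cumsum(rng.normal(0, 10, n)) + 100, 0, 255)
H = sum(np.eye(n, k=k) for k in (-1, 0, 1)) / 3.0
g = H @ true + rng.normal(0, 1.0, n)

# Minimal PSO over lambda in (0, 2)
pos = rng.uniform(0.05, 1.95, 8)
vel = np.zeros(8)
pbest = pos.copy()
pbest_f = np.array([fitness(p, H, g) for p in pos])
gbest = pbest[pbest_f.argmin()]
for _ in range(15):
    vel = 0.7 * vel + 1.5 * rng.random(8) * (pbest - pos) + 1.5 * rng.random(8) * (gbest - pos)
    pos = np.clip(pos + vel, 0.01, 1.99)
    f = np.array([fitness(p, H, g) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()]
print("lambda found by PSO:", round(float(gbest), 3))
```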
Abstract:
This work aims to develop an intelligent system for detecting workpiece burn in the surface grinding process through the use of a multilayer perceptron neural network, trained to generalize the process and, consequently, to obtain the burn threshold. In general, the occurrence of burn in the grinding process can be detected by the DPO and FKS parameters; however, these parameters are not efficient under the machining conditions used in this work. The acoustic emission signal and the electric power of the grinding wheel drive motor are the input variables, and the output variable is the occurrence of burn. In the experimental work, one type of steel (quenched ABNT 1045) and one type of grinding wheel, known as TARGA, model ART 3TG80.3 NVHB, were employed.
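As a rough illustration of the kind of classifier described above, the sketch below trains a multilayer perceptron on two features per grinding pass (acoustic emission RMS and drive-motor power RMS) to predict the occurrence of burn. The synthetic data, the feature choice and the network size are assumptions made for the example, not the experimental setup of this work.

```python
# Hedged sketch: an MLP classifying burn occurrence from two toy features.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)

# Hypothetical features per grinding pass: [AE RMS, motor power RMS]
normal = rng.normal([1.0, 2.0], 0.2, size=(100, 2))
burn = rng.normal([1.8, 3.2], 0.3, size=(100, 2))
X = np.vstack([normal, burn])
y = np.r_[np.zeros(100), np.ones(100)]          # 1 = burn occurred

mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
mlp.fit(X, y)

# In practice the burn threshold would come from the trained network's
# decision boundary; here we just classify two new passes.
print(mlp.predict([[1.1, 2.1], [1.9, 3.4]]))    # should print [0. 1.]
```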