951 results for specifications
Abstract:
In this paper, we decompose the variance of logarithmic monthly earnings of prime-age males into its permanent and transitory components, using a five-wave rotating panel from the Venezuelan “Encuesta de Hogares por Muestreo” from 1995 to 1997. As far as we know, this is the first time a variance components model has been estimated for a developing country. We test several specifications and find that an error components model with individual random effects and first-order serially correlated errors fits the data well. In the simplest model, around 22% of earnings variance is explained by the variance of the permanent component, 77% by purely stochastic variation, and the remaining 1% by serial correlation. These results contrast with studies from industrial countries, where the permanent component is predominant. The permanent component is usually interpreted as the result of productivity characteristics of individuals, whereas the transitory component is due to stochastic perturbations such as job and/or price instability, among others. Our findings may be due to the timing of the panel, which coincided with the macroeconomic turmoil resulting from a severe financial crisis. The findings suggest that earnings instability is an important source of inequality in a region characterized by high inequality and macroeconomic instability.
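For concreteness, a minimal sketch of the error-components specification described above, with notation assumed here rather than taken from the paper:

```latex
y_{it} = x_{it}'\beta + \mu_i + v_{it}, \qquad
v_{it} = \rho\, v_{i,t-1} + \varepsilon_{it},
```

where $\mu_i$ is the permanent individual effect and $v_{it}$ the serially correlated transitory term. The variance of log earnings then decomposes as $\sigma_\mu^2 + \sigma_v^2$ with $\sigma_v^2 = \sigma_\varepsilon^2/(1-\rho^2)$, so the permanent share is $\sigma_\mu^2/(\sigma_\mu^2 + \sigma_v^2)$, the roughly 22% figure reported for the simplest model.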
Abstract:
Based on three versions of a small macroeconomic model for Brazil, this paper presents empirical evidence on the effects of parameter uncertainty on monetary policy rules and on the robustness of optimal and simple rules across different model specifications. By comparing the optimal policy rule under parameter uncertainty with the rule calculated under purely additive uncertainty, we find that parameter uncertainty should make policymakers react less aggressively to the economy's state variables, as suggested by Brainard's "conservatism principle", although this effect seems to be relatively small. We then informally investigate each rule's robustness by analyzing the performance of policy rules derived from each model under each of the alternative models. We find that optimal rules derived from each model perform very poorly under the alternative models, whereas a simple Taylor rule is relatively robust. We also find that even within a specific model, the Taylor rule may perform better than the optimal rule under particularly unfavorable realizations from the policymaker's loss distribution function.
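Brainard's conservatism principle can be illustrated with a textbook one-equation version (an illustration only, not the paper's model): if the policymaker sets an instrument $r$ to minimize $E[y^2]$, where $y = (\bar{k} + \eta)r + e$ and $\eta$ is zero-mean parameter uncertainty with variance $\sigma_\eta^2$, the optimum is

```latex
r^* = -\frac{\bar{k}\, e}{\bar{k}^2 + \sigma_\eta^2},
```

which is smaller in absolute value than the certainty-equivalent response $-e/\bar{k}$ whenever $\sigma_\eta^2 > 0$: multiplicative uncertainty attenuates the optimal reaction.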
Abstract:
This thesis analyzes educational equity in Brazil in 2001 and 2011. To that end, I assess students' opportunities to have more qualified teachers, tracking how those opportunities evolved by gender, color/race, and socioeconomic status. The theoretical framework is divided into two parts: the debate on equity, and the discussion of the importance of the school and of its internal resources and processes. In the first part, I present several perspectives on equity, propose dividing the field into three strands, and place the debate in a broader context of social justice. In this research I adopt the pluralist approach, which understands equity as a term encompassing resources, processes, and outcomes. In the second part, I address the importance of the school for student achievement, presenting research showing that schools are fundamental to student performance, especially in unequal countries. Among school resources and processes, teachers have the greatest impact on test scores and are therefore the ones best placed to contribute to equitable educational policies. The debate on the importance of schools and teachers for student achievement is particularly relevant in Brazil, where the literature shows a large impact of schools and teachers on school outcomes, coupled with enormous inequality in the distribution of resources. After the theoretical discussion, I present the model built to analyze equity in Brazil in 2001 and 2011. I developed a logistic model to estimate a student's odds of having more qualified teachers, classified in this study by four characteristics: holding a higher education degree, holding a graduate degree, classroom experience, and covering the curriculum. Two specifications of the model are presented: a simple one, containing only student characteristics as independent variables, and a full one, which adds information on the states, the school network (private or public), and the locality (rural or urban). The analyses cover the 5th and 9th grades in 2001 and 2011, using SAEB data for Portuguese and Mathematics. The results point to four important conclusions. First, socioeconomic status proved more relevant than students' color/race or gender in determining educational opportunities. Second, the 5th and 9th grades show similar trends, with three distinct patterns in the evolution of educational opportunities. Third, the impact of teachers' observable characteristics on student proficiency changed from 2001 to 2011 as a result of policies encouraging teacher schooling. Finally, there is great heterogeneity among Brazilian states, with the states of the South and Southeast regions providing students greater educational opportunities. Based on these results, the concluding remarks point to three pillars present in more equitable countries: the definition of minimum standards for resources and processes; the adoption of compensatory policies and positive discrimination in the distribution of resources; and, finally, investment in teachers.
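As a hedged illustration of the two specifications described above, a minimal sketch in Python using statsmodels; every variable and file name is an assumption, not the thesis's actual data layout:

```python
# Hypothetical sketch of the logistic model of a student's odds of having a
# more qualified teacher. Column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("saeb_students.csv")  # hypothetical SAEB microdata extract

# Simple specification: student characteristics only.
simple = smf.logit("teacher_has_degree ~ ses + C(gender) + C(race)", data=df).fit()

# Full specification: adds state, school network and locality controls.
full = smf.logit(
    "teacher_has_degree ~ ses + C(gender) + C(race)"
    " + C(state) + C(network) + C(locality)",
    data=df,
).fit()
print(simple.summary())
print(full.summary())
```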
Abstract:
Are tax reforms conditioned by political factors? Noting the scarcity of empirical work on the subject, we formulate our own definition of tax reform and adopt a typology for these phenomena. We then compile a database of tax reforms from the responses to the IPES 2006 field survey questionnaires, conducted by the Inter-American Development Bank with support from the Inter-American Center of Tax Administrations (CIAT). These questionnaires were completed by specialist officials of Latin American finance ministries, who reported reforms between 1990 and 2004. Next, we construct tax reform indices, which are used as dependent variables in our models. The indices contribute to the development of quantitative studies on tax reforms, offering flexibility to test multiple hypotheses. They made it possible to analyze separately the determinants of income versus consumption tax reforms, of general versus targeted reforms, and of reforms tending to raise or to cut taxes. In the tests, the influence of the closed list stood out, indicating that parliamentary discipline matters for passing reforms. In a smaller number of specifications, district magnitude, bicameralism, the president's decree power, and the electoral cycle were also relevant. We found no evidence for short-term political factors such as party ideology and the government's majority in parliament. Likewise, the influence of the president's agenda and veto powers was not confirmed. The dominance of one party in the governing coalition was relevant only when tied to the closed list in elections. Overall, the results confirm the impact of political-institutional factors on tax reforms, while the same is not observed for short-term political factors. Differences were also observed in the political determinants of reforms to income versus consumption taxation, targeted versus general reforms, and expansionary versus tax-reducing reforms (incentives). The study contributes to the quantitative analysis of the political determinants of tax reforms in Latin America and provides previously unavailable data, offering empirical evidence across different types of reforms and political factors. It concludes that political variables must be incorporated into analyses of tax reforms, hitherto dominated by economic arguments, and suggests that improving political institutions is important for better tax policy decisions in Latin America.
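The abstract does not spell out its estimating equation; the following is a rough, assumption-laden sketch of how a reform index might be regressed on the political-institutional covariates it names:

```python
# Illustrative only: a country-year panel regression of a tax-reform index on
# political-institutional variables mentioned in the abstract. All column
# names, the OLS functional form, and the clustering choice are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("latam_tax_reforms.csv")  # hypothetical IPES-based panel
model = smf.ols(
    "reform_index ~ closed_list + district_magnitude + bicameralism"
    " + decree_power + electoral_cycle",
    data=panel,
).fit(cov_type="cluster", cov_kwds={"groups": panel["country"]})
print(model.summary())
```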
Abstract:
Given the importance the topic of immigration has acquired in Brazil in recent years, a need has arisen to better understand the economic effects of population inflows of this kind. Yet, to the authors' knowledge, there are no studies for recent Brazilian history on the impacts of immigrants on the labor market, in particular on natives' wages and employment. With this picture in mind, the studies in this thesis take the first steps in investigating the topic. The thesis comprises four chapters, which examine different questions related to the effects of immigration on the Brazilian labor market. The first chapter motivates the topic of immigration in Brazil and, through a structural methodology based on the multi-level CES framework, simulates the effect on the wage structure of stipulated immigration inflows for 2010, the year of the last Demographic Census. In particular, the average wage impact of a stipulated inflow of 549 thousand immigrants, of the same magnitude as that observed between December 2010 and December 2011, is calculated at around -0.25%. The second chapter estimates the degree of substitution between immigrants and natives in the same skill group and tests the perfect-substitution hypothesis supported empirically by Borjas et al. (2012, 2008) and adopted in the previous chapter. The methodology rests on the structural framework developed in Manacorda et al. (2012) and Ottaviano & Peri (2012), which adds an extra level to the multi-level CES production function of Borjas (2003). The elasticities of substitution estimated under several specifications range between 9 and 23, results that strengthen the imperfect-substitution thesis advocated by Card (2012). The third chapter estimates two types of elasticities related to the impact of immigrants on native labor earnings through an alternative methodology based on a more flexible production function that is not subject to restrictions as severe as those of the CES. The estimates of the underlying Hicks elasticities of substitution lie between 1.3 and 4.9, reinforcing the evidence of imperfect substitution obtained in Chapter 2. In addition, the estimated gross elasticities of native wages with respect to the quantities of immigrants in production are at most of the order of ±0.01. The fourth and final chapter, using a methodology based on the Translog cost function framework, examines how native employment reacts to changes in the cost of immigrant labor, a question that has so far received little attention in the literature despite its relevance for the design of immigration policy. For all model specifications and education groups considered, our results indicate that an exogenous change in immigrant wages produces only minute effects on the employment of native Brazilian workers. In most cases, the hypothesis that natives and immigrants are neither net p-complements nor net p-substitutes cannot be rejected.
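A schematic version of the estimating equation used in this strand of the literature, with subscripts assumed here: the relevant nest of the CES implies a relative wage equation such as

```latex
\ln\!\left(\frac{w_{Ist}}{w_{Nst}}\right)
  = \phi_{st} - \frac{1}{\sigma_{IN}} \ln\!\left(\frac{L_{Ist}}{L_{Nst}}\right),
```

where $I$ and $N$ index immigrants and natives in skill cell $s$ at time $t$, and $\phi_{st}$ absorbs cell-level productivity terms. The coefficient on relative supplies identifies $-1/\sigma_{IN}$; estimates of $\sigma_{IN}$ between 9 and 23 mean this coefficient is small but distinguishable from zero, which is precisely what imperfect substitution amounts to.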
Abstract:
This article proposes an alternative methodology for estimating the effects of non-tariff measures on trade flows, based on the recent literature on gravity models. A two-stage Heckman selection model is applied to the case of Brazilian exports, where the second-stage gravity equation is theoretically grounded on the seminal Melitz model of heterogeneous firms. This extended gravity equation highlights the role played by zero trade flows as well as firm heterogeneity in explaining bilateral trade among countries, two factors usually omitted in traditional gravity specifications found in the previous literature. Finally, it also proposes an economic rationale for the effects of NTMs on trade flows, helping to shed some light on their main operating channels under a rather simple Cournot duopolistic competition framework.
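A minimal two-step Heckman sketch for a gravity equation of this kind; the covariates, file name, and NTM proxy below are assumptions, and the paper's Melitz-based firm-heterogeneity controls are omitted:

```python
# Illustrative two-step Heckman selection estimation for a gravity equation.
# All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

df = pd.read_csv("bilateral_trade.csv")  # hypothetical exporter-importer pairs
df["traded"] = (df["exports"] > 0).astype(int)

# Stage 1: probit for the probability of observing a positive trade flow.
Z = sm.add_constant(df[["log_dist", "contiguity", "common_language"]])
probit = sm.Probit(df["traded"], Z).fit()
xb = probit.fittedvalues                     # linear predictor
df["imr"] = norm.pdf(xb) / norm.cdf(xb)      # inverse Mills ratio

# Stage 2: log-linear gravity on positive flows, correcting for selection.
pos = df[df["traded"] == 1]
X = sm.add_constant(pos[["log_gdp_o", "log_gdp_d", "log_dist", "ntm_index", "imr"]])
gravity = sm.OLS(np.log(pos["exports"]), X).fit()
print(gravity.summary())
```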
Abstract:
This thesis evaluates the first impacts of the UPP (Pacifying Police Units) policy, seeking to capture the causal relationships involved through difference-in-differences analyses under several specifications, in order to measure impacts on crime, school performance, income, inequality, asset ownership, and immigration.
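One hedged sketch of a difference-in-differences specification of the kind described, with every variable and file name an assumption:

```python
# Illustrative diff-in-diff: 'treated' marks areas that received a UPP,
# 'post' marks periods after pacification; their interaction is the impact.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("favela_panel.csv")  # hypothetical area-by-month panel
did = smf.ols("crime_rate ~ treated * post + C(month)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["area_id"]}
)
print(did.params["treated:post"])  # the diff-in-diff impact estimate
```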
Abstract:
This study examines the Brazilian rules on the provisioning and disclosure of contingent liabilities, specifically as applied to judicial and administrative proceedings, the type of contingent liability with the greatest impact on the results of most Brazilian companies across all segments. The text of the rules is therefore confronted with market practice and with the specificities inherent to judicial and administrative proceedings. Besides contextualizing and explaining the applicable rules, notably Technical Pronouncement No. 25 of the Accounting Pronouncements Committee (CPC), made mandatory for listed companies by CVM Resolution No. 594 of September 15, 2009, the study offers a critical analysis, identifying omissions that may hinder the provisioning of such claims by the professionals responsible, fostering a lack of uniformity of these entries across companies' financial statements and opening room for earnings management. It then evaluates and proposes solutions to the problems identified, notably the assignment of percentage values to the risk classification criteria and the definition of criteria for classifying the risk of loss and measuring the amounts of contingent liabilities, organized as guidelines of good practice for the provisioning and disclosure of contingent liabilities arising from judicial and administrative proceedings.
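The recognition logic of CPC 25 (aligned with IAS 37) can be sketched as a simple decision rule. The percentage thresholds below are illustrative assumptions, since the norm itself fixes no percentages, which is precisely the gap the study criticizes:

```python
# Sketch of the CPC 25 / IAS 37 treatment of a legal claim by estimated loss
# probability. The numeric cut-offs are assumptions, not part of the norm.
def contingency_treatment(loss_probability: float) -> str:
    if loss_probability > 0.50:       # "probable" loss: recognize a provision
        return "recognize provision and disclose"
    if loss_probability >= 0.10:      # "possible" loss: disclosure only
        return "disclose contingent liability in the notes"
    return "no provision, no disclosure"  # "remote" loss

print(contingency_treatment(0.7))  # -> recognize provision and disclose
```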
Abstract:
This paper employs mechanism design to study the effects of imperfect legal enforcement on the optimal scale of projects, borrowing interest rates, and the probability of default. The analysis departs from an environment that combines asymmetric information about cash flows and limited commitment by borrowers. The incentive for repayment comes from the possibility of liquidation of projects by a court, but courts are costly and may fail to liquidate. The value of liquidated assets can be used as collateral: it is transferred to the lender when courts liquidate. Examples reveal that costly use of courts may be optimal, which contrasts with results from most limited commitment models, where punishments are just threats, never applied in optimal arrangements. I show that when voluntary liquidation is allowed, both asymmetric information and uncertainty about courts are necessary conditions for legal punishments ever to be applied. Numerical solutions for several parametric specifications are presented, allowing for heterogeneity in initial wealth and in the variability of project returns. In all such solutions, wealthier individuals borrow at lower interest rates and run larger-scale enterprises, which is consistent with stylized facts. The reliability of courts has a consistently positive effect on the scale of projects. However, its effect on interest rates is subtler and depends essentially on the degree of curvature of the production function. Numerical results also show that the possibility of collateral seizure allows comovement of interest rates and the probability of repayment.
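As a back-of-envelope illustration of the mechanics (not the paper's actual mechanism-design program), a risk-neutral lender funding a loan of size $B$ at opportunity rate $\rho$ breaks even when

```latex
(1-p)(1+r)B + p\, q\, \lambda K = (1+\rho)B,
```

where $p$ is the default probability, $q$ the probability that the court actually liquidates, and $\lambda K$ the collateral value recovered. Holding $p$ fixed, a more reliable court (higher $q$) lowers the break-even rate $r$, consistent with court reliability expanding feasible project scale, while its net effect on equilibrium interest rates also depends on how $p$ and the chosen scale respond.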
Abstract:
Although formal methods can dramatically increase the quality of software systems, they have not been widely adopted in the software industry. Many software companies have the perception that formal methods are not cost-effective because they rely on mathematical notations that are difficult for non-experts to assimilate. The Java Modeling Language (JML), presented in Section 3.3, is an academic initiative towards the development of a common formal specification language for Java programs and the implementation of tools to check program correctness. This master thesis shows how JML-based formal methods can be used to formally develop a privacy-sensitive Java application: a smart card application for managing medical appointments, named HealthCard. We follow the software development strategy introduced by João Pestana, presented in Section 3.4. Our work influenced the development of this strategy by providing hands-on insight into the challenges of developing a privacy-sensitive application in Java. Pestana's strategy is based on a three-step evolution of software specifications, from informal ones, through semiformal ones, to JML formal specifications. We further show that this strategy can be automated by implementing a tool that generates JML formal specifications from a well-defined subset of informal software specifications. Hence, our work indicates that JML-based formal methods techniques are cost-effective and can be made popular in the software industry. Although formal methods are not popular in many software development companies, we endeavour to integrate formal methods into general software practices. We hope our work can contribute to a better acceptance of mathematically based formalisms and tools by software engineers. The structure of this document is as follows. Section 2 describes the preliminaries of this thesis work: an introduction to the application for managing medical appointments we have implemented and the technologies used in its development. This section further illustrates the Java Card Remote Method Invocation communication model used in the medical application for the client and server applications. Section 3 introduces software correctness, including design by contract and the concept of a contract in JML. Section 4 presents the design structure of the application. Section 5 shows the implementation of the HealthCard. Section 6 describes how the HealthCard is verified and validated using JML formal methods tools. Section 7 includes some metrics of the HealthCard implementation and specification. Section 8 presents a short example of how the client side of a smart card application can be implemented while respecting formal specifications. Section 9 describes a prototype tool to generate JML formal specifications from informal specifications automatically. Section 10 describes some challenges and main ideas that came up during the development of the HealthCard. The full formal specification and implementation of the HealthCard smart card application presented in this document can be found at https://sourceforge.net/projects/healthcard/.
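JML annotates Java code, so the following is only a design-by-contract analogue in Python (the language used for the sketches in this listing): an assertion-based requires/ensures pair for a hypothetical appointment-booking operation, in the spirit of a JML contract rather than actual JML syntax:

```python
# Design-by-contract analogue of a JML `requires`/`ensures` pair.
# The operation and its names are hypothetical; slots are assumed unique.
def book_appointment(free_slots: list[int], slot: int) -> list[int]:
    # requires: the requested slot is currently free
    assert slot in free_slots, "precondition violated: slot must be free"
    result = [s for s in free_slots if s != slot]
    # ensures: the slot is no longer free and exactly one slot was removed
    assert slot not in result and len(result) == len(free_slots) - 1, \
        "postcondition violated"
    return result

print(book_appointment([9, 10, 11], 10))  # -> [9, 11]
```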
Abstract:
In the last decade mobile wireless communications have witnessed explosive growth in user penetration rates and widespread deployment around the globe. This tendency is expected to continue with the convergence of fixed wired Internet networks with mobile ones and with the evolution to the full IP architecture paradigm. Mobile wireless communications will therefore be of paramount importance to the development of the information society of the near future. In particular, a research topic of special relevance in telecommunications nowadays is the design and implementation of 4th generation mobile communication systems. 4G networks will be characterized by the support of multiple radio access technologies in a core network fully compliant with the Internet Protocol (all-IP paradigm). Such networks will sustain the stringent quality of service (QoS) requirements and the high data rates expected from the type of multimedia applications to be available in the near future. The approach followed in the design and implementation of current-generation mobile wireless networks (2G and 3G) has been the stratification of the architecture into a communication protocol model composed of a set of layers, each encompassing a set of functionalities. In such a layered protocol model, communication is only allowed between adjacent layers and through specific service interface points. This modular concept eases the implementation of new functionalities, as the behaviour of each layer in the protocol stack is not affected by the others. However, the fact that lower layers in the protocol stack do not utilize information available from upper layers, and vice versa, degrades the achievable performance. This is particularly relevant if multiple antenna systems, in a MIMO (Multiple Input Multiple Output) configuration, are implemented. MIMO schemes introduce another degree of freedom for radio resource allocation: the space domain. Contrary to the time and frequency domains, radio resources mapped into the spatial domain cannot be assumed to be completely orthogonal, due to the interference resulting from users transmitting in the same frequency sub-channel and/or time slots but in different spatial beams. Therefore, the availability of information regarding the state of radio resources, from lower to upper layers, is of fundamental importance in achieving the QoS levels expected by those multimedia applications. In order to match application requirements with the constraints of the mobile radio channel, in recent years researchers have proposed a new paradigm for the layered communication architecture: the cross-layer design framework. In general terms, the cross-layer design paradigm refers to a protocol design in which the dependence between protocol layers is actively exploited, breaking the strict rules that restrict communication to adjacent layers in the original reference model and allowing direct interaction among different layers of the stack. Efficient management of the set of available radio resources demands the implementation of efficient, low-complexity packet schedulers which prioritize users' transmissions according to inputs provided from lower as well as upper layers in the protocol stack, fully compliant with the cross-layer design paradigm.
Specifically, efficiently designed packet schedulers for 4G networks should maximize the available capacity, taking into account the limitations imposed by the mobile radio channel while complying with the set of QoS requirements from the application layer. The IEEE 802.16e standard, also known as Mobile WiMAX, seems to comply with the specifications of 4G mobile networks. Its scalable architecture, low-cost implementation, and high data throughput enable efficient data multiplexing and low data latency, attributes essential for broadband data services. Also, the connection-oriented approach of its medium access layer is fully compliant with the quality of service demands of such applications. Mobile WiMAX therefore seems to be a promising 4G mobile wireless network candidate. This thesis proposes the investigation, design, and implementation of packet scheduling algorithms for the efficient management of the set of available radio resources in the time, frequency, and spatial domains of Mobile WiMAX networks. The proposed algorithms combine input metrics from the physical layer and QoS requirements from upper layers, following the cross-layer design paradigm. The proposed schedulers are evaluated by means of system-level simulations, conducted in a system-level simulation platform implementing the physical and medium access control layers of the IEEE 802.16e standard.
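A toy cross-layer scheduling rule in the spirit described above: users are ranked by a metric mixing a physical-layer channel quality input with an upper-layer delay (QoS) input. The specific weighting is an assumption, not the thesis's algorithm:

```python
# Illustrative cross-layer packet scheduler: a proportional-fair core
# (channel quality over average throughput) scaled by head-of-line delay.
def schedule(users, slots):
    """users: dicts with 'cqi' (PHY channel quality), 'avg_rate' (throughput
    history) and 'delay' (head-of-line packet delay from upper layers)."""
    def priority(u):
        return (u["cqi"] / max(u["avg_rate"], 1e-9)) * (1.0 + u["delay"])
    return sorted(users, key=priority, reverse=True)[:slots]

users = [
    {"id": 1, "cqi": 12.0, "avg_rate": 2.0, "delay": 0.1},
    {"id": 2, "cqi": 6.0, "avg_rate": 0.5, "delay": 0.8},
]
print([u["id"] for u in schedule(users, 1)])  # -> [2]: starved, delayed user wins
```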
Abstract:
Tests on printed circuit boards and integrated circuits are widely used in industry, resulting in reduced design time and cost of a project. Functional and connectivity tests in this type of circuit soon became a concern for manufacturers, prompting research into a reliable, quick, cheap, and universal testing solution. Initially, test schemes were based on a set of needles connected to the inputs and outputs of the integrated circuit board (bed-of-nails), to which signals were applied in order to verify whether the circuit met the specifications and could be assembled in the production line. With the development of projects, circuit miniaturization, improvement of production processes and materials, as well as the increase in the number of circuits, another solution became necessary. Thus Boundary-Scan Testing was developed, which operates on the border of integrated circuits and allows testing the connectivity of the input and output ports of a circuit. The Boundary-Scan Testing method was converted into a standard in 1990 by the IEEE organization, becoming known as the IEEE 1149.1 Standard. Since then a large number of manufacturers have adopted this standard in their products. The main objective of this master thesis is the design of Boundary-Scan Testing in an image sensor in CMOS technology: analyzing the standard's requirements and the process used in prototype production, developing the design and layout of the Boundary-Scan, and analyzing the results obtained after production. Chapter 1 briefly presents the evolution of testing procedures used in industry, developments and applications of image sensors, and the motivation for using the Boundary-Scan Testing architecture. Chapter 2 explores the fundamentals of Boundary-Scan Testing and image sensors, starting with the Boundary-Scan architecture defined in the Standard, where the functional blocks are analyzed. This understanding is necessary to implement the design on an image sensor. It also explains the architecture of image sensors currently in use, focusing on sensors with a large number of inputs and outputs. Chapter 3 describes the design of the implemented Boundary-Scan, analyzing the design and functions of the prototype, the software used, and the designs and simulations of the functional blocks of the implemented Boundary-Scan. Chapter 4 presents the layout process based on the design developed in Chapter 3, describing the software used for this purpose, the planning of the layout location (floorplan) and its dimensions, the layout of individual blocks, layout rule checks, the comparison with the final design, and finally the simulation. Chapter 5 describes the functional tests performed to verify the design's compliance with the specifications of the IEEE 1149.1 Standard. These tests focused on applying signals to the input and output ports of the produced prototype. Chapter 6 presents the conclusions drawn throughout the execution of the work.
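A behavioural toy model of the capture-and-shift path of an IEEE 1149.1 boundary-scan chain (an illustration only, not the thesis's CMOS design): values captured from the pins are shifted out serially on TDO while new bits enter from TDI:

```python
# Toy shift-register model of a boundary-scan cell chain. Each clock, the
# last cell drives TDO and every cell passes its value downstream from TDI.
def boundary_scan_shift(chain, tdi_bits):
    """Shift bits from TDI through the chain; return final chain and TDO bits."""
    tdo = []
    for bit in tdi_bits:
        tdo.append(chain[-1])         # last cell drives TDO
        chain = [bit] + chain[:-1]    # shift one position toward TDO
    return chain, tdo

captured = [1, 0, 1]                  # values captured from the core's pins
chain, tdo = boundary_scan_shift(captured, [0, 0, 0])
print(tdo)                            # -> [1, 0, 1]: captured values read out
```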
Abstract:
All over the world, organizations are becoming more and more complex, and there is a need to capture this complexity. This is where the DEMO methodology (Design and Engineering Methodology for Organizations), created and developed by Jan L. G. Dietz, reaches its potential: capturing the structure of business processes in a coherent and consistent set of diagrams with their respective grammatical rules. The creation of the WAMM (Wiki Aided Meta Modeling) platform was the main focus of this thesis, and its principal precursor was the idea of creating a Meta-Editor that supports semantic data and uses MediaWiki. This prototype Meta-Editor uses MediaWiki as a data receptor and combines the ideas of the Universal Enterprise Adaptive Object Model with the concept of the Semantic Web to create a platform that suits our needs, through Semantic MediaWiki, which helps the computer interconnect information and people in a more comprehensive way, giving meaning to the content of the pages. The proposed meta-modeling platform allows the specification of the abstract syntax (i.e., the grammar) and concrete syntax (e.g., symbols and connectors) of any language, as well as its model types and diagram types. We use the DEMO language as a proof-of-concept and example. All such specifications are made in a coherent and formal way through the creation of semantic wiki pages and semantic properties connecting them.
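A minimal sketch of the adaptive-object-model idea the platform builds on, in which types are data rather than classes, so new modelling languages (such as DEMO diagram types) can be defined without writing new code; the class and property names are illustrative assumptions:

```python
# Adaptive object model in miniature: an EntityType describes, as data,
# which properties its instances must carry and of what type.
class EntityType:
    def __init__(self, name, properties):
        self.name = name
        self.properties = properties  # property name -> expected Python type

class Entity:
    def __init__(self, entity_type, values):
        for prop, expected in entity_type.properties.items():
            assert isinstance(values[prop], expected), f"bad value for {prop}"
        self.entity_type, self.values = entity_type, values

# Define a hypothetical DEMO "transaction kind" as a type described by data.
transaction_kind = EntityType("TransactionKind", {"id": str, "product": str})
t01 = Entity(transaction_kind, {"id": "T01", "product": "membership start"})
print(t01.values["product"])  # -> membership start
```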
Abstract:
MAIDL, André Murbach; CARVILHE, Claudio; MUSICANTE, Martin A. Maude Object-Oriented Action Tool. Electronic Notes in Theoretical Computer Science. [S.l.: s.n.], 2008.