Abstract:
Various significant anti-HCV and cytotoxic sesquiterpene lactones (SLs) have been characterized. In this work, the chemometric tool Principal Component Analysis (PCA) was applied to two sets of SLs, and the variance of their biological activity was explored. The first principal component accounts for as much of the variability in the data as possible, and each succeeding component accounts for as much of the remaining variability as possible. The calculations were performed using the VolSurf program. For anti-HCV activity, PC1 (first principal component) explained 30.3% and PC2 (second principal component) explained 26.5% of the total variance of the matrix, while for cytotoxic activity, PC1 explained 30.9% and PC2 explained 15.6% of the total variance. The formalism employed generated good exploratory and predictive results, and for both sets we identified structural features important for suitable biological activity and a favorable pharmacokinetic profile.
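For illustration, a minimal sketch of this variance-decomposition step is given below; it is not the VolSurf workflow itself, and the descriptor matrix is randomly generated rather than taken from the paper's data.

```python
# Minimal PCA sketch (illustrative only; the paper used VolSurf descriptors).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 12))              # hypothetical descriptor matrix: 40 SLs x 12 descriptors

X_std = StandardScaler().fit_transform(X)  # autoscale so each descriptor contributes equally
pca = PCA(n_components=2).fit(X_std)

# Fraction of total variance captured by PC1 and PC2 (cf. 30.3% and 26.5% for anti-HCV).
print(pca.explained_variance_ratio_)
scores = pca.transform(X_std)              # PC scores used for the exploratory analysis
```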
Abstract:
The batch-operated bromate/phosphate/acetone/dual-catalyst system was studied at four temperatures between 5 and 35 degrees C. The dynamics were followed simultaneously by potential measurements with platinum and bromide-selective electrodes and spectroscopically at two different wavelengths. By simultaneously recording these four time series it was possible to characterize the dynamics of the sequential oscillations that evolve in time. The existence of three sequential oscillatory patterns at each temperature made it possible to estimate an activation energy for each case. Together with the activation energy of the induction period, this allowed the time evolution of the overall activation energy to be traced at four different stages as the reaction proceeds. The study was carried out for two different sets of initial concentrations, and it was observed that the overall activation energy increases as reactants turn into products. This finding was attributed to the decrease in the driving force, or the system's affinity, of the catalytic oxidative bromination of acetone by acidic bromate as the closed system evolves toward thermodynamic equilibrium.
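As an illustration of how an apparent activation energy can be extracted from measurements at several temperatures, here is a minimal Arrhenius fit; the characteristic quantity (an inverse induction period) and all numbers are hypothetical, not the paper's data or exact procedure.

```python
# Arrhenius estimate of an apparent activation energy from the temperature
# dependence of a characteristic rate (here 1/induction period); a sketch only,
# with hypothetical numbers.
import numpy as np

R = 8.314                                        # J mol^-1 K^-1
T = np.array([278.15, 288.15, 298.15, 308.15])   # 5-35 degrees C
t_ind = np.array([420.0, 210.0, 110.0, 60.0])    # hypothetical induction periods, s

# ln(1/t_ind) = ln(A) - Ea/(R*T); a linear fit in 1/T gives slope = -Ea/R.
slope, intercept = np.polyfit(1.0 / T, np.log(1.0 / t_ind), 1)
Ea = -slope * R
print(f"apparent activation energy: {Ea/1000:.1f} kJ/mol")
```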
Abstract:
Background: The sensitivity to microenvironmental changes varies among animals and may be under genetic control. It is essential to take this element into account when aiming at breeding robust farm animals. Here, linear mixed models with genetic effects in the residual variance part of the model can be used. Such models have previously been fitted using EM and MCMC algorithms. Results: We propose the use of double hierarchical generalized linear models (DHGLM), where the squared residuals are assumed to be gamma distributed and the residual variance is fitted using a generalized linear model. The algorithm iterates between two sets of mixed model equations, one on the level of observations and one on the level of variances. The method was validated using simulations and also by re-analyzing a data set on pig litter size that was previously analyzed using a Bayesian approach. The pig litter size data contained 10,060 records from 4,149 sows. The DHGLM was implemented using the ASReml software and the algorithm converged within three minutes on a Linux server. The estimates were similar to those previously obtained using Bayesian methodology, especially the variance components in the residual variance part of the model. Conclusions: We have shown that variance components in the residual variance part of a linear mixed model can be estimated using a DHGLM approach. The method enables analyses of animal models with large numbers of observations. An important future development of the DHGLM methodology is to include the genetic correlation between the random effects in the mean and residual variance parts of the model as a parameter of the DHGLM.
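A toy sketch of the two-level iteration described above: a mean model and a gamma GLM on the squared residuals are alternated until the weights stabilize. The mean step is simplified here to weighted least squares without random effects, unlike the full animal-model equations fitted in ASReml, and all data are simulated.

```python
# Toy DHGLM-style iteration: level 1 fits the mean (simplified to WLS here),
# level 2 fits a gamma GLM with log link to the squared residuals, and the
# inverse fitted variances become the weights for the next mean fit.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
X = sm.add_constant(rng.normal(size=(n, 2)))   # fixed effects for the mean
Z = sm.add_constant(rng.normal(size=(n, 1)))   # covariates for the residual variance
y = X @ np.array([1.0, 0.5, -0.3]) + np.exp(0.5 * Z[:, 1]) * rng.normal(size=n)

w = np.ones(n)
for _ in range(20):
    mean_fit = sm.WLS(y, X, weights=w).fit()   # level 1: observations
    e2 = mean_fit.resid ** 2
    # Level 2: squared residuals ~ Gamma with log link gives log-linear variances.
    var_fit = sm.GLM(e2, Z, family=sm.families.Gamma(sm.families.links.Log())).fit()
    w_new = 1.0 / var_fit.fittedvalues         # weights = inverse fitted variances
    if np.max(np.abs(w_new - w)) < 1e-8:
        break
    w = w_new

print(mean_fit.params, var_fit.params)
```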
Abstract:
The recent advances in CMOS technology have allowed for the fabrication of transistors with submicron dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has created a need for more efficient verification tools that incorporate more appropriate physical and computational models. Timing verification aims to determine whether the timing constraints imposed on the design can be satisfied. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimuli-dependent. Hence, in order to ensure that the critical situation is taken into account, one would have to exercise all possible input patterns, which is infeasible given the complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, this method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology but also the temporal and functional relations between circuit elements. Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates, which is the central concern of this thesis. In addition, as a necessary step to set the scenario, a detailed and systematic study of functional timing analysis is also presented.
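As an illustration of the topological approach mentioned above, the sketch below computes the critical delay of a combinational block modeled as a directed acyclic graph via a longest-path traversal in topological order; the circuit and gate delays are hypothetical, and the resulting path may of course be false, which is precisely the pessimism functional timing analysis removes.

```python
# Topological timing analysis sketch: the critical delay of a combinational
# block modeled as a DAG is the longest path from any input to any output.
from collections import defaultdict, deque

def critical_delay(edges, delay):
    """edges: list of (u, v); delay[v]: propagation delay of node v."""
    succ, indeg = defaultdict(list), defaultdict(int)
    nodes = set(delay)
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    # arrival[v] = latest arrival time at the output of v
    arrival = {v: delay[v] for v in nodes if indeg[v] == 0}
    queue = deque(v for v in nodes if indeg[v] == 0)
    while queue:
        u = queue.popleft()
        for v in succ[u]:
            arrival[v] = max(arrival.get(v, 0), arrival[u] + delay[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return max(arrival.values())

# Hypothetical two-gate circuit: inputs a, b, c feed gates g1 and g2.
edges = [("a", "g1"), ("b", "g1"), ("g1", "g2"), ("c", "g2")]
delay = {"a": 0, "b": 0, "c": 0, "g1": 2, "g2": 3}
print(critical_delay(edges, delay))   # 5: the topological (possibly false) critical path
```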
Abstract:
This paper is about economies with a representative consumer. In general a representative consumer need not exist, although there are several well-known sets of sufficient conditions under which one will. It is common practice, however, to use the representative consumer hypothesis without specifically assuming any of these. We show, firstly, that it is possible for the utility of the representative consumer to increase when every actual consumer is made worse off. This exposes a serious shortcoming of welfare judgements based on the representative consumer. Secondly, in economies where this does not occur, there exists a social welfare function, which we construct, that is consistent with welfare judgements based on the utility of the representative consumer. Finally, we provide a converse to Samuelson's 1956 representative consumer result, which relates it to Scitovsky's community indifference curves.
Abstract:
It is shown that, for almost every two-player game with imperfect monitoring, the conclusions of the classical folk theorem are false. So, even though these games admit a well-known approximate folk theorem, an exact folk theorem may only be obtained for a measure-zero set of games. A complete characterization of the efficient equilibria of almost every such game is also given, along with an inefficiency result on the imperfect-monitoring prisoner's dilemma.
Abstract:
The relationship between Islamic Law and other legal systems (basically Western-type domestic legal orders and international law) is often thought of in terms of compatibility or incompatibility. For certain chosen subject matters, the compatibility of Islamic (legal) principles with the values embedded in legal systems regarded as characteristic of the Modern Age is tested by sets of questions: is democracy possible in Islam? Does Islam recognize human rights, and are those rights equivalent to a more universal conception? Does Islam recognize or condone more extreme acts of violence, and does it justify violence differently? Such questions, and many more, presuppose the existence of an ensemble of rules or principles which, like any other set of rules and principles, purports to regulate social behavior. This ensemble is generically referred to as Islamic Law. However, one set of questions is usually left unanswered: is Islamic Law a legal system? If it is a legal system, what are its specific characteristics? How does it work? Where does it apply? This paper's argument is that the relationship between Islamic Law and domestic and international law can only be understood if looked upon as a relationship between distinct legal systems or legal orders.
Abstract:
This work's main objective is to build a proposal to accelerate the supply of daycare places in the city of São Paulo. Developed from a Terms of Reference organized by the Municipal Secretariat of Education of São Paulo (SME), its object is the public daycare policy of that municipality, which serves about 190 thousand children but faces a substantial deficit of approximately 127 thousand places against a universe of 736 thousand children between 0 and 3 years of age. In light of the theoretical and empirical literature on public management and policy, we contextualize the problem and indicate alternatives for expansion. We begin by reviewing the historical context and the legal framework of daycare provision. Based on interviews with specialists and a survey of secondary data, the current design and implementation of the daycare policy were assessed. From this diagnosis, we identify the main bottlenecks to be overcome: (i) accelerating the expansion of the number of daycare places; (ii) equalizing the distribution of the expansion according to socioeconomic vulnerability; (iii) raising the quality of service provided by the entities and reducing the inequality in quality standards between the directly managed and contracted modalities; (iv) improving SME's managerial and planning capacity; and (v) building an integrated vision of early-childhood policy for the municipality of São Paulo. The proposal therefore suggests investments to expand SME's managerial capacity, including its potential for coordination with other entities and actors, changes to the current design of the daycare policy, and tools for planning, monitoring and evaluating the policy. For illustrative purposes, a simulation of a strategic plan for the daycare policy was prepared, covering two sets of alternative measures for expanding places and their associated budgetary requirements.
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems, with specific attention to the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprehends a layered software environment that aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements that were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so this also holds for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on an inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows late synchronization between view and semantics when no network connection is available between them.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. Connecting to remote tools and services through a look-up protocol also completely abstracts the network location of such resources, allowing resources to be added and removed at runtime.
- The implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model, a possibility explored in the context of an online educational and training platform.
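A minimal sketch of the described view/semantics inversion of control follows; the class and method names are illustrative, not the Cave2 API (which is implemented in Java).

```python
# Views forward user interactions as event objects; the semantic model decides
# whether the state change is legal and, if so, notifies every registered view.
from dataclasses import dataclass

@dataclass
class DesignEvent:
    kind: str        # e.g. "rename", "connect" (hypothetical event kinds)
    payload: dict

class SemanticModel:
    def __init__(self):
        self.state, self.views = {}, []

    def attach(self, view):
        self.views.append(view)

    def handle(self, event: DesignEvent) -> bool:
        if not self._is_legal(event):       # semantics validates first
            return False
        self.state.update(event.payload)    # commit the state change...
        for v in self.views:                # ...then propagate to all views
            v.refresh(event)
        return True

    def _is_legal(self, event):
        return event.kind in {"rename", "connect"}

class View:
    def __init__(self, name, model):
        self.name, self.model = name, model
        model.attach(self)

    def on_user_input(self, event):         # the view never mutates state itself
        self.model.handle(event)

    def refresh(self, event):
        print(f"{self.name}: updated after {event.kind}")

model = SemanticModel()
schematic, layout = View("schematic", model), View("layout", model)
schematic.on_user_input(DesignEvent("rename", {"cell": "top"}))  # both views refresh
```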
Abstract:
In this dissertation, we investigate the effect of foreign capital participation on Brazilian companies' performance. To carry out this analysis, we constructed two sets of models based on EBITDA margin and return on equity. Panel data analysis is used to examine the relationship between foreign capital ownership and Brazilian firms' performance. We construct a cross-section time-series sample of companies listed on the BOVESPA index from 2006 to 2010. The empirical results led us to validate two hypotheses. First, foreign capital participation improves companies' performance up to a certain level of participation. Second, joint control or a strategic partnership between a Brazilian company and a foreign investor yields high operating performance.
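A sketch of what such a panel specification might look like, using the linearmodels package; the variable names, the quadratic term capturing the "up to a certain level" effect, and the simulated data are all illustrative assumptions, not the dissertation's actual model.

```python
# Firm fixed-effects panel regression of EBITDA margin on foreign ownership
# and its square; a negative quadratic coefficient would indicate an
# inverted-U relationship. All data below are simulated.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(2)
firms, years = range(60), range(2006, 2011)
idx = pd.MultiIndex.from_product([firms, years], names=["firm", "year"])
df = pd.DataFrame(index=idx)
df["foreign_share"] = rng.uniform(0, 1, len(df))
df["foreign_share_sq"] = df["foreign_share"] ** 2
df["ebitda_margin"] = (0.3 * df["foreign_share"] - 0.25 * df["foreign_share_sq"]
                       + rng.normal(0, 0.05, len(df)))

mod = PanelOLS.from_formula(
    "ebitda_margin ~ foreign_share + foreign_share_sq + EntityEffects", data=df)
print(mod.fit(cov_type="clustered", cluster_entity=True))
```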
Abstract:
An important trend in the luxury market is brand extension into a new market segment through so-called vertical extension, which can be upward or downward. In other words, the organization starts to operate in a new segment within the same product category, but targeting a different audience than its original brand. In this process, the company begins activity in a new segment with a different level of luxury. Distribution is a fundamental aspect of the marketing mix, and the importance of the internet as a distribution channel for this industry has grown markedly in recent years. It is therefore necessary to understand how luxury brands manage their online distribution strategies when they develop brand-extension processes and penetrate new segments. With the objective of analyzing the distribution strategy of the luxury industry, an exploratory study was developed focusing on personal luxury goods (in categories such as couture, watches & jewelry, leather goods and shoes). A significant sample of original brands and their extensions was analyzed to build a comparative model between two variables: the level of differentiation between the distribution channels of the original brand and its extensions; and the distance between the brands themselves with respect to their positioning. This study contributes to the understanding of the market's distribution dynamics and to the comprehension of the behavior of the companies operating in it, depending on the type of extensions they develop and the way those extensions are conducted.
Abstract:
This paper analyzes the placement in the private sector of a subset of Brazilian public-sector employees. This group left public employment in the mid-1990s through a voluntary severance program. The paper contrasts their earnings before and after quitting the public sector, and compares both sets of wages to public- and private-sector earnings for similar workers. We find that participants in this voluntary severance program suffered a significant reduction in average earnings and an increase in earnings dispersion. We test, by means of counterfactual simulations, whether the reduction in average earnings and the increase in earnings dispersion are the expected outcomes once one controls for observed characteristics. Several methods of controlling for observed characteristics (parametric and non-parametric) are used for robustness. The results indicate that this group of workers was paid at levels below what would be expected given their embodied observable characteristics.
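A minimal sketch of the parametric variant of such a counterfactual simulation; the Mincer-style wage equation, variable names, and simulated data are illustrative assumptions rather than the paper's specification.

```python
# Fit a wage equation on comparable private-sector workers, predict what
# program participants "should" earn given their observed characteristics,
# and compare mean and dispersion with their actual wages. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
private = pd.DataFrame({
    "log_wage": rng.normal(7.5, 0.5, 1000),
    "schooling": rng.integers(4, 17, 1000),
    "experience": rng.integers(0, 35, 1000),
})
wage_eq = smf.ols("log_wage ~ schooling + experience + I(experience**2)",
                  data=private).fit()

participants = pd.DataFrame({
    "schooling": rng.integers(4, 17, 200),
    "experience": rng.integers(0, 35, 200),
    "log_wage": rng.normal(7.2, 0.7, 200),   # observed post-quit wages
})
counterfactual = wage_eq.predict(participants)
print("mean gap:", participants["log_wage"].mean() - counterfactual.mean())
print("dispersion (actual vs counterfactual):",
      participants["log_wage"].std(), counterfactual.std())
```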
Abstract:
This master's thesis, for the degree of Master of Science in International Management (MPGI), aims to answer the research question of how institutional voids affect the entry decision-making process of foreign venture capital firms coming to Brazil. This is a timely matter, since in recent years there has been a sudden eruption of foreign VC involvement in Brazil. Based on the actionable framework by Khanna and Palepu (2010), we conducted quantitative as well as qualitative research with two sets of interview partners in a two-phase analysis. We interviewed experts from VC firms: foreign VC firms already based in Brazil and prospective VC firms that are looking to come to Brazil. We started with the former, derived lessons learned, and analyzed how they affect the latter in reaching a decision. As expected, we found that depending on the industry that ventures are in, institutional voids can pose either an opportunity or a threat, and hence attract or push away potential VC firms entering Brazil. Opportunities exist especially in exploiting institutional voids, for example through ventures that improve marketplace efficiency. Threats arise in investments in, for instance, hard infrastructure, where the economic, political and judicial systems, as well as corruption and bureaucracy, play demanding roles.
Abstract:
A new paradigm is remaking the World: evolutionary innovations on all fronts, new information technologies, huge mobility of capital, use of risky financial tools, globalization of production, new emerging powers and the impact of consumer concerns on governmental policies. These phenomena are shaping the World and forcing the advent of a new World Order in the Multilateral Monetary, Financial, and Trading System. The effects of this new paradigm are also transforming global governance. The political and economic orders established after the Second World War and centered on the multilateral model of the UN, IMF, World Bank and GATT, led by the developed countries, are facing significant challenges. The rise of China and other emerging countries has shifted the old model toward a polycentric World, where the governance of these organizations is threatened by emerging countries demanding a bigger role in the decision-making boards of these international bodies. As a consequence, multilateralism is being confronted by polycentrism. Negotiations for a more representative voting process and the pressure for new rules to cope with the new demands are paralyzing important decisions. This scenario is seriously affecting not only the Monetary and Financial Systems but also the Multilateral Trading System. International trade is facing significant challenges: a serious deadlock in concluding the last round of multilateral negotiations at the WTO; the fragmentation of trade rules through the multiplication of preferential and mega agreements; the arrival of a new model of global production and trade led by global value chains, which is threatening the old trade order; and the imposition of new sets of regulations by private bodies, some commanded by transnationals to support global value chains and others by non-governmental organizations reflecting the concerns of consumers in the North and their precautionary attitude toward the sustainability of products made around the World. The lack of any multilateral order in this new regulation is creating a cacophony of rules and fomenting a new regulatory war of the Global North against the Global South. The objective of this paper is to explore how these challenges are affecting the Trading System and how it can evolve to manage these new trends.
Abstract:
The financial crisis and Great Recession have been followed by a jobs-shortage crisis that most forecasts predict will persist for years under current policies. This paper argues for a wage-led recovery and growth program, which is the only way to remedy the deep causes of the crisis and escape the jobs crisis. Such a program is the polar opposite of the current policy orthodoxy, showing how much is at stake. Winning the argument for wage-led recovery will require winning the war of ideas about economics that has its roots in Keynes's challenge to classical macroeconomics in the 1920s and 1930s. That will involve showing how the financial crisis and Great Recession were the ultimate result of three decades of neoliberal policy, which produced wage stagnation by severing the link between wages and productivity growth and made asset-price inflation and debt the engine of demand growth in place of wages; showing how wage-led policy resolves the current problem of global demand shortage without pricing out labor; and developing a detailed set of policy proposals that flow from these understandings. The essence of a wage-led policy approach is to rebuild the link between wages and productivity growth, combined with expansionary macroeconomic policy that fills the current demand shortfall so as to push the economy onto a recovery path. Both sets of measures are necessary. Expansionary macro policy (i.e., fiscal stimulus and easy monetary policy) without rebuilding the wage mechanism will not produce sustainable recovery and may end in fiscal crisis. Rebuilding the wage mechanism without expansionary macro policy is likely to leave the economy stuck in the orbit of stagnation.