912 results for Bias-Variance Trade-off
Abstract:
This paper discusses the target localization problem in wireless visual sensor networks. Additive noise and measurement errors affect the accuracy of target localization when the visual nodes are equipped with low-resolution cameras. With the goal of improving localization accuracy without prior knowledge of the target, each node extracts multiple feature points from images to represent the target at the sensor-node level. A statistical method is presented to match the most correlated feature-point pair so that the position information from different sensor nodes can be merged at the base station. In addition, for the case in which more than one target exists in the field of interest, a scheme for locating multiple targets is provided. Simulation results show that the proposed method performs well in improving the accuracy of locating a single target or multiple targets. The results also show that the proposed method achieves a better trade-off between camera-node usage and localization accuracy.
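For illustration only, here is a minimal sketch of what a correlation-based matching step could look like, assuming each node reports its feature points as descriptor vectors; the function name and the use of Pearson correlation are assumptions, not the authors' actual estimator:

```python
import numpy as np

def most_correlated_pair(features_a, features_b):
    """Return the indices of the feature-point pair (one descriptor from
    each node) with the highest Pearson correlation. Hypothetical helper,
    not the statistical method described in the paper."""
    best, best_corr = None, -np.inf
    for i, fa in enumerate(features_a):
        for j, fb in enumerate(features_b):
            corr = np.corrcoef(fa, fb)[0, 1]
            if corr > best_corr:
                best, best_corr = (i, j), corr
    return best, best_corr

# Example with made-up descriptor vectors from two camera nodes.
node_a = np.random.rand(5, 8)   # 5 feature points, 8-dimensional descriptors
node_b = np.random.rand(6, 8)
print(most_correlated_pair(node_a, node_b))
```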
Abstract:
It is sometimes thought that the choice between Molinism and open theism involves a trade-off in values: Molinism asserts that God has providential power but allows God indirectly to manipulate that in virtue of which human beings are to be judged, while open theism grants human beings more power over that in virtue of which they are to be judged, but at the price of giving up providence. I argue here that this picture is misconstrued: Molinism gives human agents more power over that in virtue of which they may be judged than open theism does. Since open theism confines the possible avenues for evaluating agents to their behavior in the actual world, open theism is incompatible with any solution to the problem of moral luck which appeals to counterfactual behavior, and so (I argue) is impugned by that problem. Molinists, by contrast, have a promising solution to it.
Abstract:
The various theories on capital structure have attracted interest and motivated numerous studies on the subject without, however, reaching a consensus. Another apparently underexplored topic concerns the life cycle of firms and how it may influence capital structure. This study aimed to verify which determinants are most relevant to firms' indebtedness and whether these determinants change depending on the firm's life-cycle stage, drawing on the Trade-Off, Pecking Order, and Agency theories. To achieve this objective, fixed-effects panel analysis was used, with a sample composed of publicly traded Brazilian companies with secondary data available in Economática® for the period from 2005 to 2013, using the BM&FBOVESPA sectors. The main result is the same behavior across the overall, high-growth, and low-growth samples for book leverage with respect to the determinant Profitability, which shows a negative relationship, and for the determinants Growth Opportunity and Size, which show a positive relationship. For the high- and low-growth groups, some determinants yielded different results: Uniqueness was significant in both groups, positive for low growth and negative for high growth, while Collateral Value of Assets and Non-Debt Tax Shield were significant only in the low-growth group. For market-value leverage, significance was observed for Non-Debt Tax Shield and Uniqueness. This result reinforces the argument that the life cycle influences capital structure.
Abstract:
An organization's culture is important so that its employees share the same goals and values. However, in order to maintain a competitive strategy and act responsibly in the community where it operates, a company needs to innovate and adapt. Diversity management presents itself as a valid way to face these new challenges and demands. Nevertheless, there are many obstacles to successful diversity management. This study aims to understand how diversity can affect an organization's culture and, in addition, how culture can affect the diversity-management strategy. For this purpose, an in-depth case study with six interviews was carried out. The interviews were supported by a semi-structured script, and the data obtained were examined through content analysis. The research suggests that organizational culture and diversity management are directly connected and can positively influence each other, but they must keep their actions balanced so as not to harm the organization.
Abstract:
Many problems in human society reflect the inability of selfish parties to cooperate. The "Iterated Prisoner's Dilemma" has been used widely as a model for the evolution of cooperation in societies. Axelrod's computer tournaments and the extensive simulations of evolution by Nowak and Sigmund and others have shown that natural selection can favor cooperative strategies in the Prisoner's Dilemma. Rigorous empirical tests, however, lag behind the progress made by theorists. Clear predictions differ depending on the players' capacity to remember previous rounds of the game. To test whether humans use the kind of cooperative strategies predicted, we asked students to play the iterated Prisoner's Dilemma game either continuously or interrupted after each round by a secondary memory task (i.e., playing the game "Memory") that constrained the students' working-memory capacity. When playing without interruption, most students used "Pavlovian" strategies, as predicted for greater memory capacity, and the rest used "generous tit-for-tat" strategies. The proportion of generous tit-for-tat strategies increased when games of Memory interfered with the subjects' working memory, as predicted. Students who continued to use complex Pavlovian strategies were less successful in the Memory game, but more successful in the Prisoner's Dilemma, which indicates a trade-off in memory capacity between the two tasks. Our results suggest that the set of strategies predicted by game theorists approximates human reality.
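As a rough illustration of the two memory-one strategies named above, the sketch below simulates a short iterated Prisoner's Dilemma match; the payoff values and the forgiveness probability of generous tit-for-tat are standard textbook choices, not parameters taken from this study:

```python
import random

# Standard Prisoner's Dilemma payoffs (illustrative values): T > R > P > S.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def pavlov(my_last, opp_last):
    """Win-stay, lose-shift: repeat the last move after a 'win'
    (payoff R or T), switch after a 'loss' (payoff P or S)."""
    if my_last is None:
        return "C"
    my_payoff = PAYOFF[(my_last, opp_last)][0]
    return my_last if my_payoff >= 3 else ("C" if my_last == "D" else "D")

def generous_tft(my_last, opp_last, forgiveness=1/3):
    """Tit-for-tat that forgives a defection with some probability."""
    if opp_last is None or opp_last == "C":
        return "C"
    return "C" if random.random() < forgiveness else "D"

def play(strategy_a, strategy_b, rounds=10):
    a_last = b_last = None
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(a_last, b_last), strategy_b(b_last, a_last)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        a_last, b_last = a, b
    return score_a, score_b

print(play(pavlov, generous_tft))
```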
Abstract:
Objective: To determine how patients with lung cancer value the trade-off between the survival benefit of chemotherapy and its toxicities.
Abstract:
The visual world is presented to the brain through patterns of action potentials in the population of optic nerve fibers. Single-neuron recordings show that each retinal ganglion cell has a spatially restricted receptive field, a limited integration time, and a characteristic spectral sensitivity. Collectively, these response properties define the visual message conveyed by that neuron's action potentials. Since the size of the optic nerve is strictly constrained, one expects the retina to generate a highly efficient representation of the visual scene. By contrast, the receptive fields of nearby ganglion cells often overlap, suggesting great redundancy among the retinal output signals. Recent multineuron recordings may help resolve this paradox. They reveal concerted firing patterns among ganglion cells, in which small groups of nearby neurons fire synchronously with delays of only a few milliseconds. As there are many more such firing patterns than ganglion cells, such a distributed code might allow the retina to compress a large number of distinct visual messages into a small number of optic nerve fibers. This paper will review the evidence for a distributed coding scheme in the retinal output. The performance limits of such codes are analyzed with simple examples, illustrating that they allow a powerful trade-off between spatial and temporal resolution.
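The claim that there are many more concerted firing patterns than ganglion cells is combinatorial at heart; a toy calculation with an assumed cell count (not a figure from the paper) makes the point:

```python
from math import comb

# Hypothetical numbers for illustration only: a patch of retina with
# n ganglion cells, where "symbols" are defined either by single-cell
# spikes or by synchronous firing of pairs of nearby cells.
n = 100                        # assumed number of ganglion cells in the patch
single_cell_symbols = n        # one symbol per cell
pairwise_symbols = comb(n, 2)  # one symbol per synchronous pair

print(single_cell_symbols)     # 100
print(pairwise_symbols)        # 4950 -- far more patterns than cells, which is
                               # the basis of the claimed compression
```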
Abstract:
The São Paulo metropolis is the largest and most important urban agglomeration in Brazil and is among the ten largest urban areas in the world. However, the way spatial accessibility occurs imposes a burden on the population and on economic activity. This work aims to contribute to the discussion of how to improve accessibility in the São Paulo Metropolitan Region by studying the characteristics and impacts of urban spatial structures, critically analyzing the spatial structure of the metropolis, and offering suggestions for improvement in order to provide more sustainable mobility. The methodological procedures include a literature review on the subject and a characterization of the spatial structure of the São Paulo Metropolitan Region, considering the allocation of population, the allocation of jobs, and travel patterns for the individual, collective, and non-motorized modes. We present an account of the recent evolution, using data from the origin-destination surveys carried out by the Metrô in 1997 and 2007 and from the 2012 mobility survey, as well as a more detailed characterization based on the 2007 survey data. Cities develop on the basis of the trade-off between proximity and mobility: in order to maximize the possibilities of interaction, people and firms tend to locate where the travel required to carry out those interactions entails lower financial cost, less time lost, and less discomfort. This process shapes the spatial allocation of activities, which partially defines transport habits. Urban spatial structure can be characterized by its scale (compact or dispersed pattern), density arrangement (dispersed or clustered pattern), and activity arrangement (monocentric or polycentric pattern). More compact spatial structures exhibit shorter travel distances, reducing the environmental impact of trips and making non-motorized and collective transport viable, and lead to more efficient land use, lower infrastructure costs, and greater equity in access to transport. Clustered polycentric structures, in turn, are associated with easier access to land. There is a debate about whether polycentric structures can bring jobs and residences closer together in a generalized way. The São Paulo Metropolitan Region presents a monocentric pattern at the metropolitan scale, with strong commuting flows from the periphery to the expanded center of the capital. During the analysis period, a relocation of the population toward more central areas of the city and an even stronger centralization of jobs were observed, worsening commuting flows. There is a clear modal split by income: the higher classes mostly use automobiles, while the lower classes mostly use collective and non-motorized transport. Looking ahead, the new master plan has the merit of moving toward urban development oriented by sustainable transport, but the maximum density levels allowed are still similar to those of the previous plan and the width of the densification corridors is restricted. We believe it would be advantageous to increase densification in areas close to jobs, to create densification hubs in areas farther from jobs but close to high-speed collective transport infrastructure, and to discourage densification in areas with low accessibility. Integrated transport management is also needed, providing infrastructure for non-motorized and intermodal trips, along with management of the negative impacts of densification.
Abstract:
As the user base of the Internet has grown tremendously, the need for secure services has increased accordingly. Most secure protocols, in digital business and other fields, use a combination of symmetric and asymmetric cryptography, random generators, and hash functions in order to achieve confidentiality, integrity, and authentication. Our proposal is an integral security kernel based on a powerful mathematical scheme from which all of these cryptographic facilities can be derived. The kernel requires very few resources and has the flexibility to trade off speed, memory, or security; therefore, it can be efficiently implemented in a wide spectrum of platforms and applications, whether software, hardware, or low-cost devices. Additionally, the primitives are comparable in security and speed to well-known standards.
Abstract:
Prototype Selection (PS) algorithms allow faster Nearest Neighbor classification by keeping only the most profitable prototypes of the training set. In turn, these schemes typically lower classification accuracy. In this work, a new strategy for multi-label classification tasks is proposed to address this accuracy drop without the need to use the whole training set. Given a new instance, the PS algorithm is used as a fast recommender system that retrieves the most likely classes. The actual classification is then performed considering only the prototypes from the initial training set belonging to the suggested classes. Results show that this strategy provides a large set of trade-off solutions that fill the gap between PS-based classification efficiency and conventional kNN accuracy. Furthermore, this scheme is not only able to, at best, reach the performance of conventional kNN with barely a third of the distances computed, but it also outperforms the latter in noisy scenarios, proving to be a much more robust approach.
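A minimal sketch of the two-stage idea, simplified here to the single-label case and written against a generic prototype set; the function names and the use of Euclidean distance are assumptions rather than the authors' implementation:

```python
import numpy as np

def suggest_classes(x, prototypes, proto_labels, k=3):
    """Stage 1: the reduced prototype set (the output of a Prototype
    Selection algorithm) acts as a fast recommender of candidate classes."""
    d = np.linalg.norm(prototypes - x, axis=1)
    return set(proto_labels[np.argsort(d)[:k]])

def two_stage_knn(x, X_train, y_train, prototypes, proto_labels, k=3):
    """Stage 2: conventional kNN, restricted to training samples whose
    label is among the classes suggested in stage 1."""
    candidates = suggest_classes(x, prototypes, proto_labels, k)
    mask = np.isin(y_train, list(candidates))
    X_sub, y_sub = X_train[mask], y_train[mask]
    d = np.linalg.norm(X_sub - x, axis=1)
    labels, counts = np.unique(y_sub[np.argsort(d)[:k]], return_counts=True)
    return labels[np.argmax(counts)]

# Toy usage with random data: labels 0-4, 20 prototypes kept out of 200 samples.
rng = np.random.default_rng(0)
X, y = rng.random((200, 10)), rng.integers(0, 5, 200)
proto_idx = rng.choice(200, 20, replace=False)
print(two_stage_knn(X[0], X, y, X[proto_idx], y[proto_idx]))
```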
Abstract:
Hardware/software partitioning is a key stage in the co-design process of embedded systems. In this stage it is decided which components will be implemented as hardware co-processors and which will be implemented on a general-purpose processor. The decision is made through design space exploration, evaluating a set of candidate solutions to establish which of them achieves the best balance among all the design metrics. To explore the solution space, most proposals use metaheuristic algorithms, most notably Genetic Algorithms and Simulated Annealing. In many cases, this choice is not based on comparative analyses involving several algorithms on the same problem. This work presents the application of the Stochastic Hill Climbing and Stochastic Hill Climbing with Restart algorithms to the hardware/software partitioning problem. To validate the use of these algorithms, their application to a case study is presented, specifically the hardware/software partitioning of a JPEG encoder. In all the experiments, both algorithms reach solutions comparable to those obtained by the most frequently used algorithms.
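A minimal sketch of Stochastic Hill Climbing with Restart applied to a binary hardware/software mapping; the cost function below is a placeholder weighting of assumed per-component area and time figures, not the metrics used in the cited case study:

```python
import random

def stochastic_hill_climbing(cost, n_components, iters=1000):
    """Each solution is a bit vector: 1 = implement the component as a
    hardware co-processor, 0 = keep it on the general-purpose processor."""
    current = [random.randint(0, 1) for _ in range(n_components)]
    best_cost = cost(current)
    for _ in range(iters):
        neighbor = current[:]
        neighbor[random.randrange(n_components)] ^= 1   # flip one mapping
        c = cost(neighbor)
        if c <= best_cost:          # accept equal or better neighbors
            current, best_cost = neighbor, c
    return current, best_cost

def with_restarts(cost, n_components, restarts=10, iters=1000):
    """Stochastic Hill Climbing with Restart: rerun from random starting
    points and keep the best solution found over all runs."""
    runs = [stochastic_hill_climbing(cost, n_components, iters)
            for _ in range(restarts)]
    return min(runs, key=lambda r: r[1])

# Placeholder cost: weighted sum of hardware area (for components mapped to
# HW) and software execution time (for components left in SW); numbers made up.
hw_area = [5, 3, 8, 2, 6]
sw_time = [9, 4, 7, 3, 8]
cost = lambda s: 0.5 * sum(a for a, b in zip(hw_area, s) if b) + \
       0.5 * sum(t for t, b in zip(sw_time, s) if not b)
print(with_restarts(cost, n_components=5))
```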
Abstract:
This thesis investigates the design of optimal tax systems in dynamic environments. The first essay characterizes the optimal tax system where wages depend on stochastic shocks and work experience. In addition to redistributive and efficiency motives, the taxation of inexperienced workers depends on a second-best requirement that encourages work experience, a social insurance motive, and incentive effects. Calibrations using U.S. data yield expected optimal marginal income tax rates that are higher for experienced workers than for most inexperienced workers. They confirm that the average marginal income tax rate increases (decreases) with age when shocks and work experience are substitutes (complements). Finally, more variability in experienced workers' earnings prospects leads to increasing tax rates, since income taxation acts as a social insurance mechanism. In the second essay, the properties of an optimal tax system are investigated in a dynamic private-information economy where labor market frictions create unemployment that destroys workers' human capital. A two-skill-type model is considered where wages and employment are endogenous. I find that the optimal tax system distorts the first-period wages of all workers below their efficient levels, which leads to more employment. The standard no-distortion-at-the-top result no longer holds due to the combination of private information and the destruction of human capital. I show this result analytically under the Maximin social welfare function and confirm it numerically for a general social welfare function. I also investigate the use of a training program and job-creation subsidies. The final essay analyzes the optimal linear tax system when there is a population of individuals whose perceptions of savings are linked to their disposable income and their family background through family cultural transmission. Aside from the standard equity/efficiency trade-off, taxes account for the endogeneity of perceptions through two channels. First, taxing labor decreases income, which decreases the perception of savings over time. Second, taxation of savings corrects for workers' misperceptions and thus their savings and labor decisions. Numerical simulations confirm that behavioral issues push labor income taxes upward to finance saving subsidies. Government transfers to individuals are also reduced to finance those same subsidies.
Abstract:
This paper posits that the Nordic countries were able to ensure good standards of equality for their citizens while at the same time maintaining decent levels of economic growth. This can be attributed to the Nordic countries' more holistic approach toward social spending and their focus on raising the skill levels of their workforces. Thus, the notion that there must be a trade-off between economic performance and a more aggressive welfare regime should be examined more thoroughly. The debate for policy makers should perhaps be framed in terms of where the balance between growth and equity should lie, rather than as a trade-off. Firstly, the paper will elaborate on what exactly the "Nordic model" is, based on a broad literature review. Next, the paper will unpack the key characteristics of the Nordic model and analyse whether expansive welfare provided through state support indeed erodes work ethic and impacts the economic competitiveness of countries. Next, the paper will explain how the balance between economic and social objectives is maintained in some of the Nordic countries. Lastly, the paper discusses whether the same balance can be achieved in Singapore.
Abstract:
This paper assesses the effectiveness of the Meroni doctrine in the light of the recent judgment in the ESMA case. The first part explains in detail the problem of delegation of powers in the EU from the perspective of principal-agent theory and complements it with an analysis of the trade-off between different levels of independence and accountability of agencies. A simple economic model is developed to illustrate the relationship between the independence and accountability of an agency. It shows that it is the accountability mechanism that induces the agent to act, rather than the extent of his independence. The paper also explains the inter-temporal interactions between the principal and the agent on the basis of the incentives in place for the different players. The second part is devoted to an analysis of the functioning of ESMA in the context of its delegated powers. After presenting the main aspects of the regulatory framework establishing ESMA, the paper continues with an analysis and interpretation of ESMA's discretionary powers. The rather rigid position of the Court of Justice in relation to the Meroni doctrine seems unsuitable for the delegation of complex regulatory tasks. This is particularly evident in the case of financial markets. Finally, the judgment does not examine in any detail whether and how the principals, i.e. the EU and the Member States, are best able to evaluate the quality of ESMA decisions and regulations and whether there are different but more effective accountability mechanisms.
Abstract:
Following a seminar on the CAP post-2013 held by Egmont (with the cooperation of the Polish Presidency) on the 25th of November 2011, Egmont commissioned the present policy brief. Three major policy issues were addressed on this occasion, namely: how to make the CAP more equitable, green, and market-oriented. The trade-off between these policy issues will require policy choices that are worthy of analysis.