832 results for Guarantees
Abstract:
The mechanisms and techniques of the real-time domain are used when a system, whether embedded or large-scale, must exhibit characteristics that ensure its quality of service. Real-time systems are thus defined as systems subject to strict temporal constraints, which must offer high levels of reliability so as to guarantee, in every instance, the timely operation of the system. Owing to the growing complexity of embedded systems, distributed architectures are frequently employed, in which each module is normally responsible for a single function. In such cases a communication medium is needed so that the modules can interact and fulfil the desired functionality. Because of its high bandwidth and low cost, Ethernet has been the subject of study aimed at turning it into a communication medium with the quality of service characteristic of real-time systems. In response to this need, the HaRTES Switch was developed at the University of Aveiro; it can manage its resources dynamically so as to provide real-time guarantees to the network in which it is deployed. However, for a network architecture to offer quality-of-service guarantees to its nodes, it must provide flow specification, correct traffic forwarding, resource reservation, admission control and packet scheduling. Unfortunately, although the HaRTES Switch has all these features, it does not support standard protocols. This document presents the work carried out to integrate the SRP protocol into the HaRTES Switch.
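As a hedged illustration of the admission-control step mentioned above (not the actual HaRTES mechanism or the SRP message format), the sketch below admits a new periodic stream only while the total link utilization stays under a configurable bound; the stream parameters, link rate and bound are assumed values.

```python
# Minimal sketch: utilization-based admission control for periodic real-time
# streams sharing one link. All parameter values are hypothetical.

from dataclasses import dataclass

@dataclass
class Stream:
    name: str
    payload_bytes: int   # bytes transmitted per period
    period_us: float     # activation period in microseconds

LINK_RATE_BPS = 100_000_000  # assumed 100 Mbit/s Ethernet link

def utilization(stream: Stream) -> float:
    """Fraction of link bandwidth the stream consumes."""
    tx_time_us = stream.payload_bytes * 8 / LINK_RATE_BPS * 1e6
    return tx_time_us / stream.period_us

def admit(existing: list[Stream], candidate: Stream, bound: float = 0.75) -> bool:
    """Admit the new stream only if total utilization stays below the bound."""
    total = sum(utilization(s) for s in existing) + utilization(candidate)
    return total <= bound

admitted: list[Stream] = []
for s in [Stream("control", 200, 1_000), Stream("video", 1_500, 250)]:
    if admit(admitted, s):
        admitted.append(s)
print([s.name for s in admitted])
```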
Abstract:
This thesis deals with tensor completion for the solution of multidimensional inverse problems. We study the problem of reconstructing an approximately low-rank tensor from a small number of noisy linear measurements. New recovery guarantees, numerical algorithms, non-uniform sampling strategies, and parameter selection algorithms are developed. We derive a fixed-point continuation algorithm for tensor completion and prove its convergence. A restricted isometry property (RIP) based tensor recovery guarantee is proved. Probabilistic recovery guarantees are obtained for sub-Gaussian measurement operators and for measurements obtained by non-uniform sampling from a Parseval tight frame. We show how tensor completion can be used to solve multidimensional inverse problems arising in NMR relaxometry. Algorithms are developed for regularization parameter selection, including accelerated k-fold cross-validation and generalized cross-validation. These methods are validated on experimental and simulated data. We also derive condition number estimates for nonnegative least squares problems. Tensor recovery promises to significantly accelerate N-dimensional NMR relaxometry and related experiments, enabling previously impractical experiments. Our methods could also be applied to other inverse problems arising in machine learning, image processing, signal processing, computer vision, and other fields.
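As an illustration of the kind of iteration the thesis studies, the sketch below applies a fixed-point (singular value shrinkage) scheme to the matrix special case of low-rank completion; the step size, threshold and test data are assumptions, not the thesis's algorithm or parameters.

```python
# Illustrative sketch only: iterative singular-value shrinkage for low-rank
# *matrix* completion, the 2-D special case of the tensor problem above.

import numpy as np

def svt(X, tau):
    """Soft-threshold the singular values of X by tau."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(M, mask, tau=1.0, step=1.0, iters=500):
    """Recover a low-rank matrix from the entries where mask is True."""
    X = np.zeros_like(M)
    for _ in range(iters):
        grad = mask * (X - M)          # gradient of the data-fit term
        X = svt(X - step * grad, tau)  # shrinkage step on the singular values
    return X

rng = np.random.default_rng(0)
truth = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))  # rank 2
mask = rng.random(truth.shape) < 0.5
estimate = complete(truth * mask, mask)
print(np.linalg.norm(estimate - truth) / np.linalg.norm(truth))
```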
Abstract:
The financial crisis of 2007-2008 led to extraordinary government intervention in firms and markets. The scope and depth of government action rivaled that of the Great Depression. Many traded markets experienced dramatic declines in liquidity, leading to the persistence of conditions normally assumed to be promptly removed by the actions of profit-seeking arbitrageurs. These extreme events motivate the three essays in this work. The first essay seeks, and fails to find, evidence of investor behavior consistent with the broad 'Too Big To Fail' policies enacted during the crisis by government agents. Only in limited circumstances, where government guarantees such as deposit insurance or U.S. Treasury lending lines already existed, did investors impart a premium to the debt security prices of firms under stress. The second essay introduces the Inflation Indexed Swap Basis (IIS Basis) to examine the large differences between cash and derivative markets based upon future U.S. inflation as measured by the Consumer Price Index (CPI). It reports the consistently positive value of this measure, as well as the very large positive values it reached in the fourth quarter of 2008 after Lehman Brothers went bankrupt. It concludes that the IIS Basis continues to exist due to limitations in market liquidity and hedging alternatives. The third essay explores the methodology of performing debt-based event studies utilizing credit default swaps (CDS). It provides practical implementation advice to researchers to address limited source data and/or small target firm sample sizes.
Abstract:
Procedural conformity (conformidad procesal) is one of the most important manifestations of the principle of opportunity. In constant interplay with the principle of legality, it tends to speed up and shorten procedural formalities without prejudice to the essential guarantees of criminal proceedings. Owing to its advantages, it has become established both in the North American model, through plea bargaining, and in the continental European model.
Abstract:
Personal information is increasingly gathered and used to provide services tailored to user preferences, but the datasets used to provide such functionality can represent serious privacy threats if not appropriately protected. Work in privacy-preserving data publishing has targeted privacy guarantees that protect against record re-identification, by making records indistinguishable, or against sensitive attribute value disclosure, by introducing diversity or noise into the sensitive values. However, most approaches fail in the high-dimensional case, and those that do not fail introduce a utility cost incompatible with tailored recommendation scenarios. This paper aims at a sensible trade-off between privacy and the benefits of tailored recommendations, in the context of privacy-preserving data publishing. We empirically demonstrate that significant privacy improvements can be achieved at a utility cost compatible with tailored recommendation scenarios, using a simple partition-based sanitization method.
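For intuition about partition-based sanitization, here is a minimal sketch in the spirit of Mondrian-style k-anonymization: it recursively splits records on quasi-identifier medians while each partition keeps at least k records, then publishes each quasi-identifier as the partition's value range. The attributes, data and k are hypothetical, and this is not necessarily the specific method evaluated in the paper.

```python
# Minimal partition-based sanitization sketch (Mondrian-style generalization).

def partition(records, attrs, k):
    """records: list of dicts; attrs: quasi-identifier keys; k: minimum group size."""
    for attr in attrs:
        values = sorted(r[attr] for r in records)
        median = values[len(values) // 2]
        left = [r for r in records if r[attr] < median]
        right = [r for r in records if r[attr] >= median]
        if len(left) >= k and len(right) >= k:
            return partition(left, attrs, k) + partition(right, attrs, k)
    return [records]  # cannot split further without breaking the k constraint

def generalize(groups, attrs):
    """Replace each quasi-identifier value with its group's (min, max) range."""
    out = []
    for group in groups:
        summary = {a: (min(r[a] for r in group), max(r[a] for r in group))
                   for a in attrs}
        out.extend({**r, **summary} for r in group)
    return out

data = [{"age": a, "zip": z, "item": i}
        for a, z, i in [(25, 1001, "A"), (27, 1003, "B"),
                        (31, 1010, "A"), (34, 1012, "C"),
                        (41, 1020, "B"), (45, 1025, "A")]]
print(generalize(partition(data, ["age", "zip"], k=2), ["age", "zip"]))
```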
Abstract:
The model of autonomy developed by the Aland Isles can provide a number of interesting solutions applicable in other territories. Territorial autonomy as a means of ensuring the political and economic rights of a minority involves facing up to the challenges of European integration and globalization. The Aland Isles appear to have coped with this challenge successfully. Firstly, they were able to present and promote their own interests efficiently during the accession negotiations. Secondly, they maintained (and additionally strengthened, by having it included in the acquis communautaire) their separate, autonomous status and the guarantees of identity protection, by virtue of limiting the rights of persons without domicile rights to purchase land and run businesses. Thirdly, they managed to obtain a special status excluding them from the process of indirect tax harmonization, thus securing considerable economic benefits. Fourthly, both Finland and the European Union confirmed their autonomy, demilitarization and neutrality, allowing the Isles to retain their former status under the new circumstances. Fifthly, they obtained representation in the Committee of the Regions and a defined position on European matters in Finland. The skillful application of the existing solutions and the winning of an advantageous set of derogations and exceptions strengthened the position of the Isles both with respect to Finland and to their international surroundings. The Isles' economic, cultural and political protection was augmented. Alongside their participation in international organizations, such as the Nordic Board, the Aland Isles have remained active and visible in the international arena.
Abstract:
This dissertation investigates the connection between spectral analysis and frame theory. When considering the spectral properties of a frame, we present a few novel results relating to the spectral decomposition. We first show that scalable frames have the property that the inner product of the scaling coefficients and the eigenvectors must equal the inverse eigenvalues. From this, we prove a similar result when an approximate scaling is obtained. We then focus on the optimization problems inherent to scalable frames, first showing that there is an equivalence between scaling a frame and optimization problems with a non-restrictive objective function. Various objective functions are considered, and an analysis of the solution type is presented. With linear objectives we can encourage sparse scalings, and with barrier objective functions we force dense solutions. We further consider frames in high dimensions and derive various solution techniques. From here, we restrict ourselves to particular frame classes, to add more specificity to the results. Using frames generated from distributions allows for the placement of probabilistic bounds on scalability. For discrete distributions (Bernoulli and Rademacher), we bound the probability of encountering an ONB, and for continuous symmetric distributions (Uniform and Gaussian), we show that symmetry is retained in the transformed domain. We also prove several hyperplane-separation results. With the theory developed, we discuss graph applications of the scalability framework. We make a connection with graph conditioning, and show the infeasibility of the problem in the general case. After a modification, we show that any complete graph can be conditioned. We then present a modification of standard PCA (robust PCA) developed by Candès, and give some background on Electron Energy-Loss Spectroscopy (EELS). We design a novel scheme for processing EELS data through robust PCA and least-squares regression, and test this scheme on biological samples. Finally, we take the idea of robust PCA and apply the technique of kernel PCA to perform robust manifold learning. We derive the problem and present an algorithm for its solution. We also discuss the differences from RPCA that make theoretical guarantees difficult.
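A small numerical sketch of the central objects (the frame operator, its eigendecomposition, and a tightness check for a scaled frame) may help fix ideas; the example frame and weights are illustrative choices, not results from the dissertation.

```python
# Sketch: frame operator S = F F^T, its eigenvalues, and a check of whether a
# given set of scaling weights makes the scaled frame tight (S a multiple of I).

import numpy as np

def frame_operator(F, weights=None):
    """Columns of F are the frame vectors; optional per-vector scalings w_i."""
    if weights is not None:
        F = F * weights  # scale each column f_i by w_i
    return F @ F.T

def is_tight(F, weights=None, tol=1e-8):
    S = frame_operator(F, weights)
    target = np.eye(F.shape[0]) * np.trace(S) / F.shape[0]
    return np.allclose(S, target, atol=tol)

# A Mercedes-Benz frame in R^2: three unit vectors at 120-degree spacing.
angles = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
F = np.vstack([np.cos(angles), np.sin(angles)])

eigvals, eigvecs = np.linalg.eigh(frame_operator(F))
print("eigenvalues of S:", eigvals)                       # both 1.5: tight frame
print("tight without scaling:", is_tight(F))               # True
print("tight with uneven weights:", is_tight(F, np.array([1.0, 1.0, 0.5])))  # False
```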
Abstract:
Guaraíras lagoon, located in Tibau do Sul on the eastern littoral of Rio Grande do Norte (Brazil), presents a permanent connection to the sea, which guarantees the occurrence of a rich biodiversity that includes the autochthonous shrimp species Litopenaeus schmitti, Farfantepenaeus subtilis and Farfantepenaeus brasiliensis. In spite of being subject to strong human intervention in the last decade, mainly related to the installation of shrimp (Litopenaeus vannamei) farms, the lagoon is still scarcely studied. The present study aims at characterizing the populations of the three autochthonous penaeid shrimp species inhabiting Guaraíras, taking into consideration their abundance and seasonal distribution in the inflow channel of the Primar System of Organic Aquaculture (Tibau do Sul, Rio Grande do Norte, Brazil). Twelve monthly samples were carried out from May 2005 to April 2006 with the aid of a circular cast net in the inflow channel, which is supplied daily with water from Guaraíras. Sampling months were grouped into trimesters according to the total pluviosity, thus comprising four trimesters. Water salinity was monitored twice a week and temperature values were registered on a daily basis at noon during the study period. The daily pluviosity data for the municipality of Tibau do Sul were supplied by Empresa de Pesquisa Agropecuária do Rio Grande do Norte (EMPARN). Collected shrimp were identified, weighed, measured and sexed. L. schmitti specimens (0.2 g to 17.8 g) were distributed into weight classes at 1.3 g intervals. From the eighth sampling month (December 2005) onwards, males were classified into three categories, in accordance with the development of their petasma: (a) rudimentary petasma, (b) partially formed petasma, and (c) completely formed petasma. Among the ecological variables, rainfall showed the greatest dispersion (s.d. = 187.74). Rainfall and abundance of L. schmitti were negatively correlated (r = -0.85), whereas abundance and water salinity were positively correlated (r = 0.63). Among the 1,144 collected individuals, 1,127 were L. schmitti, 13 were F. subtilis and 4 were F. brasiliensis, corresponding to 98.51%, 1.14% and 0.35% of the total, respectively. L. schmitti occurred in 100% of the samples. In contrast, the presence of F. subtilis and F. brasiliensis was restricted to 33% and 17% of the collected samples, respectively. The present study confirmed the occurrence of L. schmitti, F. brasiliensis and F. subtilis in Guaraíras. However, this lagoon seems to be primarily inhabited by juvenile Litopenaeus schmitti. The population of L. schmitti analysed showed a seasonal pattern of distribution. In general, in the months of high salinity and absence of rain, the number of individuals was higher than in the wet months. Further studies on the reproductive biology and ecology of L. schmitti, F. brasiliensis and F. subtilis may elucidate questions regarding the abundance, period, and phase of occurrence of these shrimp genera in Guaraíras. Finally, the risks associated with the establishment of L. vannamei in the lagoon provide a novel outlet for studies in this biotope.
Abstract:
Human operators are unique in their decision-making capability, judgment and nondeterminism. Their sense of judgment, unpredictable decision procedures, and susceptibility to environmental elements can cause them to erroneously execute a given task description while operating a computer system. Usually, a computer system is protected against some erroneous human behaviors by having the necessary safeguard mechanisms in place. But some erroneous human operator behaviors can lead to severe or even fatal consequences, especially in safety-critical systems. A generalized methodology for modeling and analyzing the interactions between computer systems and human operators, in which the operators are allowed to deviate from their prescribed behaviors, provides a formal understanding of the robustness of a computer system against possible aberrant behaviors by its human operators. We provide several methodologies to assist in modeling and analyzing human behaviors exhibited while operating computer systems. Every human operator is usually given a specific recommended set of guidelines for operating a system. We first present a process-algebraic methodology for modeling and verifying recommended human task execution behavior. We show how one can perform runtime monitoring of a computer system being operated by a human operator to check for violations of temporal safety properties. We consider the concept of a protection envelope, giving a wider class of behaviors than those strictly prescribed by a human task that can be tolerated by a system. We then provide a framework for determining whether a computer system can maintain its guarantees if the human operators operate within their protection envelopes. This framework also helps to determine the robustness of the computer system under weakening of the protection envelopes. In this regard, we present a tool called Tutela that assists in implementing the framework. We then examine the ability of a system to remain safe under broad classes of variations of the prescribed human task. We develop a framework for addressing two issues. The first issue is: given a human task specification and a protection envelope, will the protection envelope properties still hold under standard erroneous executions of that task by the human operators? In other words, how robust is the protection envelope? The second issue is: in the absence of a protection envelope, can we approximate a protection envelope encompassing those standard erroneous human behaviors that can be safely endured by the system? We present an extension of Tutela that implements this framework. The two frameworks mentioned above use Concurrent Game Structures (CGS) as models for both computer systems and their human operators. However, this formalism has some shortcomings for our uses. We add incomplete-information concepts to CGSs to achieve better modularity for the players. We introduce nondeterminism in both the transition system and the strategies of players, and in the modeling of human operators and computer systems. Nondeterministic action strategies for players make incomplete-information Nondeterministic CGS (iNCGS) a more precise formalism for modeling human behaviors exhibited while operating a computer system. We show how we can reason about a human behavior satisfying a guarantee by providing a semantics of Alternating Time Temporal Logic based on iNCGS player strategies.
In a nutshell, this dissertation provides a formal methodology for modeling and analyzing system robustness against both expected and erroneous human operator behaviors.
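A minimal sketch of the kind of runtime safety monitor discussed above, checking an operator's event trace against a temporal safety property; the event names and the property are hypothetical illustrations, not taken from Tutela.

```python
# Sketch: runtime monitor for the safety property "an 'execute' action must
# never occur unless a 'confirm' has happened since the last 'reset'".

from typing import Iterable

def monitor(trace: Iterable[str]) -> bool:
    """Return True if the trace satisfies the property, False on a violation."""
    confirmed = False
    for event in trace:
        if event == "confirm":
            confirmed = True
        elif event == "reset":
            confirmed = False
        elif event == "execute" and not confirmed:
            return False  # safety violation detected at runtime
    return True

print(monitor(["confirm", "execute", "reset", "confirm", "execute"]))  # True
print(monitor(["confirm", "reset", "execute"]))                        # False
```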
Abstract:
The present work is part of the activity carried out by the Research Group IT 546-10 of the Basque university system.
Abstract:
The Paul de Arzila Nature Reserve (Decree-Law no. 219/88 of 27 June) has been part of the Council of Europe's European Network of Biogenetic Reserves since 1990. The Reserve enjoys a privileged status, so planning for the area in question is subject to the rules of the Council of Europe, which guarantee its biological balance and, consequently, the conservation of the Reserve's genetic diversity. It is therefore necessary to define the natural chemical and geological baselines not only of the Paul de Arzila Nature Reserve itself but also of its surroundings. The Paul de Arzila area was first characterised with respect to physiography, relief, geology, tectonics, soil units and land-use capability, natural resources, pollution sources and their impacts, and socio-economic setting. The geochemical characterisation of Paul de Arzila was then established from the results of chemical analyses performed on samples of soils, stream sediments and waters. In the long run, this project will make it possible to bring the issues of environmental conservation and of the genetic diversity of the existing Reserves to the local and national public through the implementation of preservation and environmental-awareness projects.
Abstract:
Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in such a way that the participants do not see each other's data; they only see the final output. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes repeatedly, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker -- the game will be divided into rounds of local decision-making (e.g. bidding) and joint interaction (e.g. dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language, while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications or lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run the MPC programs, leaving the potential for security holes that can compromise the privacy of the parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC Domain Specific Language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card-dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs (as far as we know, Wys* is the first language to provide verification capabilities for MPC programs); (b) it provides a partially verified toolchain to run MPC programs; and (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, thereby making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs, while providing privacy guarantees similar to those of the monolithic versions.
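The sketch below illustrates only the basic MPC idea referred to above (an additive secret-sharing of a joint sum over several parties); it is not Wysteria or Wys* code, whose concrete syntax is not reproduced here.

```python
# Additive secret sharing: a joint sum is computed without any party
# revealing its private input. Arithmetic is modulo a public prime.

import secrets

PRIME = 2**61 - 1

def share(value: int, n_parties: int) -> list[int]:
    """Split a private value into n additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_sum(private_inputs: list[int]) -> int:
    n = len(private_inputs)
    # Each party i shares its input; party j receives all_shares[i][j].
    all_shares = [share(x, n) for x in private_inputs]
    # Each party locally adds the shares it holds; the partial sums are then combined.
    partial = [sum(all_shares[i][j] for i in range(n)) % PRIME for j in range(n)]
    return sum(partial) % PRIME

print(secure_sum([12, 30, 7]))  # 49, with no individual input revealed
```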
Abstract:
This dissertation provides a novel theory of securitization based on intermediaries minimizing the moral hazard that insiders can misuse assets held on-balance sheet. The model predicts how intermediaries finance different assets. Under deposit funding, the moral hazard is greatest for low-risk assets that yield sizable returns in bad states of nature; under securitization, it is greatest for high-risk assets that require high guarantees and large reserves. Intermediaries thus securitize low-risk assets. In an extension, I identify a novel channel through which government bailouts exacerbate the moral hazard and reduce total investment irrespective of the funding mode. This adverse effect is stronger under deposit funding, implying that intermediaries finance more risky assets off-balance sheet. The dissertation discusses the implications of different forms of guarantees. With explicit guarantees, banks securitize assets with either low information-intensity or low risk. By contrast, with implicit guarantees, banks only securitize assets with high information-intensity and low risk. Two extensions to the benchmark static and dynamic models are discussed. First, an extension to the static model studies the optimality of tranching versus securitization with guarantees. Tranching eliminates agency costs but worsens adverse selection, while securitization with guarantees does the opposite. When the quality of underlying assets in a certain security market is sufficiently heterogeneous, and when the highest quality assets are perceived to be sufficiently safe, securitization with guarantees dominates tranching. Second, in an extension to the dynamic setting, the moral hazard of misusing assets held on-balance sheet naturally gives rise to the moral hazard of weak ex-post monitoring in securitization. The use of guarantees reduces the dependence of banks' ex-post payoffs on monitoring efforts, thereby weakening monitoring incentives. The incentive to monitor under securitization with implicit guarantees is the weakest among all funding modes, as implicit guarantees allow banks to renege on their monitoring promises without being declared bankrupt and punished.
Abstract:
Given the growing complexity of the relationship between the tax authority and the taxpayer, of the economic questions underlying it, of the weight that taxation carries in them, of its relation with the law, whether in the State's exercise of its powers of authority in tax matters or in the safeguarding of the exercise of taxpayers' guarantees, and of the relations this whole subject generates upstream and downstream and the way in which they interact, the themes of simplification, quality and the associated costs have gradually been attracting particular attention from governments, practitioners and scholars of the subject. Given the vast universe of situations that could be framed and considered in this work, our objective is to focus on the essential: the situations arising from the simplification measures that have been implemented within the Tax Administration as a result of the practice and procedures adopted by the Direcção-Geral dos Impostos (DGCI), in the context of policy measures outlined at government level whose objectives have taken shape as the intended reduction of compliance costs and administration costs. Consequently, what is at issue is the matter of the so-called costs of context, set against the stated objectives of improving the quality of the service provided to the taxpaying citizen by the DGCI and of strengthening fiscal competitiveness. It also seems interesting to address a further point within this theme, arising from the risk associated with the implementation of the above-mentioned measures, from the point of view both of the administered subject and of the administration itself, and from the notion that this risk carries in the field of tax auditing and the costs that result from it.
Abstract:
Secure computation involves multiple parties computing a common function while keeping their inputs private, and is a growing field of cryptography due to its potential for maintaining privacy guarantees in real-world applications. However, current secure computation protocols are not yet efficient enough to be used in practice. We argue that this is due to much of the research effort being focused on generality rather than specificity. Namely, current research tends to focus on constructing and improving protocols for the strongest notions of security or for an arbitrary number of parties. However, in real-world deployments, these security notions are often too strong, or the number of parties running a protocol would be smaller. In this thesis we take several steps towards bridging the efficiency gap of secure computation by focusing on constructing efficient protocols for specific real-world settings and security models. In particular, we make the following four contributions:
- We show an efficient (when amortized over multiple runs) maliciously secure two-party secure computation (2PC) protocol in the multiple-execution setting, where the same function is computed multiple times by the same pair of parties.
- We improve the efficiency of 2PC protocols in the publicly verifiable covert security model, where a party can cheat with some probability, but if it gets caught then the honest party obtains a certificate proving that the given party cheated.
- We show how to optimize existing 2PC protocols when the function to be computed includes predicate checks on its inputs.
- We demonstrate an efficient maliciously secure protocol in the three-party setting.
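To illustrate the covert-security notion mentioned in the second contribution, the small calculation below gives the deterrence factor of a textbook cut-and-choose construction in which the garbler prepares lam circuits and all but one are opened and checked; this construction and its parameter are illustrative assumptions, not the protocol developed in the thesis.

```python
# If one of lam circuits is garbled incorrectly, the cheat goes undetected only
# when that circuit happens to be the single evaluated one, so the probability
# of being caught (the deterrence factor) is epsilon = 1 - 1/lam.

from fractions import Fraction

def deterrence(lam: int) -> Fraction:
    """Probability that a single bad circuit is caught during the open phase."""
    return 1 - Fraction(1, lam)

for lam in (2, 4, 8, 16):
    print(lam, deterrence(lam))  # 1/2, 3/4, 7/8, 15/16
```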