960 results for "Probabilidade de default"
Abstract:
For the past fifty years, interest in issues beyond pure philology has been a watchword in comparative literary studies. Comparative studies, which by default employ a variety of methods, run the major risk, as the experience of American comparative literature shows, of descending into dangerous 'everythingism' or losing their identity. However, they perform well when literature remains one of the segments of comparison. In such instances, they prove efficacious in exploring the 'correspondences of arts' and the problems of identity and multiculturalism, and contribute to research into the transfer of ideas. Hence, they delve into phenomena which exist on the borderlines of literature, fine arts and other fields of the humanities, employing strategies of interpretation typical of each of those fields. In the process there emerges a "borderline methodology" whose distinctive feature is heterogeneity in the conduct of research. This, in turn, requires the scholar to be both ingenious and creative in selecting topics, and to possess competence in both literary studies and the related field.
Abstract:
Postgraduate project/dissertation submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.
Abstract:
Project work submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Speech Therapy, specialization in Adult Language.
Abstract:
Postgraduate project/dissertation submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.
Abstract:
The Internet has brought unparalleled opportunities for expanding availability of research by bringing down economic and physical barriers to sharing. The digitally networked environment promises to democratize access, carry knowledge beyond traditional research niches, accelerate discovery, encourage new and interdisciplinary approaches to ever more complex research challenges, and enable new computational research strategies. However, despite these opportunities for increasing access to knowledge, the prices of scholarly journals have risen sharply over the past two decades, often forcing libraries to cancel subscriptions. Today even the wealthiest institutions cannot afford to sustain all of the journals needed by their faculties and students. To take advantage of the opportunities created by the Internet and to further their mission of creating, preserving, and disseminating knowledge, many academic institutions are taking steps to capture the benefits of more open research sharing. Colleges and universities have built digital repositories to preserve and distribute faculty scholarly articles and other research outputs. Many individual authors have taken steps to retain the rights they need, under copyright law, to allow their work to be made freely available on the Internet and in their institution's repository. And, faculties at some institutions have adopted resolutions endorsing more open access to scholarly articles. Most recently, on February 12, 2008, the Faculty of Arts and Sciences (FAS) at Harvard University took a landmark step. The faculty voted to adopt a policy requiring that faculty authors send an electronic copy of their scholarly articles to the university's digital repository and that faculty authors automatically grant copyright permission to the university to archive and to distribute these articles unless a faculty member has waived the policy for a particular article. Essentially, the faculty voted to make open access to the results of their published journal articles the default policy for the Faculty of Arts and Sciences of Harvard University. As of March 2008, a proposal is also under consideration in the University of California system by which faculty authors would commit routinely to grant copyright permission to the university to make copies of the faculty's scholarly work openly accessible over the Internet. Inspired by the example set by the Harvard faculty, this White Paper is addressed to the faculty and administrators of academic institutions who support equitable access to scholarly research and knowledge, and who believe that the institution can play an important role as steward of the scholarly literature produced by its faculty. This paper discusses both the motivation and the process for establishing a binding institutional policy that automatically grants a copyright license from each faculty member to permit deposit of his or her peer-reviewed scholarly articles in institutional repositories, from which the works become available for others to read and cite.
Abstract:
Recent measurement-based studies reveal that most Internet connections are short in terms of the amount of traffic they carry (mice), while a small fraction of the connections carry a large portion of the traffic (elephants). A careful study of the TCP protocol shows that, without help from an Active Queue Management (AQM) policy, short connections tend to lose to long connections in their competition for bandwidth. This is because short connections do not gain detailed knowledge of the network state, and therefore they are doomed to be less competitive due to the conservative nature of the TCP congestion control algorithm. Inspired by the Differentiated Services (Diffserv) architecture, we propose to give preferential treatment to short connections inside the bottleneck queue, so that short connections experience a lower packet drop rate than long connections. This is done by employing the RIO (RED with In and Out) queue management policy, which uses different drop functions for different classes of traffic. Our simulation results show that: (1) in a highly loaded network, preferential treatment is necessary to provide short TCP connections with better response time and fairness without hurting the performance of long TCP connections; (2) the proposed scheme still delivers packets in a FIFO manner at each link, thus maintaining the statistical multiplexing gain and not misordering packets; (3) choosing a smaller default initial timeout value for TCP can help enhance the performance of short TCP flows, though not as effectively as our scheme and at the risk of congestion collapse; (4) in the worst case, our proposal works as well as a regular RED scheme in terms of response time and goodput.
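To make the queueing mechanism concrete, the following Python sketch implements RIO-style preferential dropping, with 'In' (short-flow) packets judged against a gentler RED drop curve than 'Out' (long-flow) packets; the threshold and probability values are hypothetical placeholders, not parameters from the paper.

    import random

    def red_drop_probability(avg_queue, min_th, max_th, max_p):
        """Standard RED drop probability as a function of average queue size."""
        if avg_queue < min_th:
            return 0.0
        if avg_queue >= max_th:
            return 1.0
        return max_p * (avg_queue - min_th) / (max_th - min_th)

    def rio_admit(is_short_flow, avg_in, avg_total):
        """RIO (RED with In and Out): two drop curves, one per traffic class.

        avg_in    -- average queue occupancy of 'In' (short-flow) packets
        avg_total -- average occupancy of the whole queue
        Thresholds below are hypothetical, for illustration only.
        """
        if is_short_flow:
            # 'In' packets are dropped only under heavy congestion.
            p = red_drop_probability(avg_in, min_th=40, max_th=70, max_p=0.02)
        else:
            # 'Out' packets are dropped earlier and more aggressively.
            p = red_drop_probability(avg_total, min_th=10, max_th=40, max_p=0.10)
        return random.random() >= p  # True: enqueue the packet; False: drop it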
Abstract:
Overlay networks have been used to add and enhance functionality for end-users without requiring modifications to the Internet's core mechanisms. Overlay networks have been used for a variety of popular applications including routing, file sharing, content distribution, and server deployment. Previous work has focused on devising practical neighbor selection heuristics under the assumption that users conform to a specific wiring protocol. This is not a valid assumption in highly decentralized systems like overlay networks. Overlay users may act selfishly and deviate from the default wiring protocols by utilizing knowledge they have about the network when selecting neighbors to improve the performance they receive from the overlay. This thesis goes against the conventional thinking that overlay users conform to a specific protocol. The contributions of this thesis are threefold. It provides a systematic evaluation of the design space of selfish neighbor selection strategies in real overlays, evaluates the performance of overlay networks that consist of users that select their neighbors selfishly, and examines the implications of selfish neighbor and server selection for overlay protocol design and service provisioning respectively. This thesis develops a game-theoretic framework that provides a unified approach to modeling Selfish Neighbor Selection (SNS) wiring procedures on behalf of selfish users. The model is general, and takes into consideration costs reflecting network latency and user preference profiles, the inherent directionality in overlay maintenance protocols, and connectivity constraints imposed on the system designer. Within this framework, the notion of a user's "best response" wiring strategy is formalized as a k-median problem on asymmetric distance and is used to obtain overlay structures in which no node can re-wire to improve the performance it receives from the overlay. Evaluation results presented in this thesis indicate that selfish users can reap substantial performance benefits when connecting to overlay networks composed of non-selfish users. In addition, in overlays that are dominated by selfish users, the resulting stable wirings are optimized to such a great extent that even non-selfish newcomers can extract near-optimal performance through naïve wiring strategies. To capitalize on the performance advantages of optimal neighbor selection strategies and the emergent global wirings that result, this thesis presents EGOIST: an SNS-inspired overlay network creation and maintenance routing system. Through an extensive measurement study on the deployed prototype, results presented in this thesis show that EGOIST's neighbor selection primitives outperform existing heuristics on a variety of performance metrics, including delay, available bandwidth, and node utilization. Moreover, these results demonstrate that EGOIST is competitive with an optimal but unscalable full-mesh approach, remains highly effective under significant churn, is robust to cheating, and incurs minimal overheads. This thesis also studies selfish neighbor selection strategies for swarming applications. The main focus is on n-way broadcast applications where each of n overlay users wants to push its own distinct file to all other destinations as well as download their respective data files. Results presented in this thesis demonstrate that the performance of our swarming protocol for n-way broadcast on top of overlays of selfish users is far superior to the performance on top of existing overlays.
In the context of service provisioning, this thesis examines the use of distributed approaches that enable a provider to determine the number and location of servers for optimal delivery of content or services to its selfish end-users. To leverage recent advances in virtualization technologies, this thesis develops and evaluates a distributed protocol that migrates servers based on end-user demand and only local topological knowledge. Results under a range of network topologies and workloads suggest that the performance of the distributed deployment is comparable to that of the optimal but unscalable centralized deployment.
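As an illustration of the "best response" formulation, the Python sketch below brute-forces the k-median-style wiring choice for one node over an asymmetric distance matrix. The data structures (w, D) and the exhaustive search are illustrative assumptions; real instances would need the approximation machinery the thesis develops, since the problem is NP-hard in general.

    from itertools import combinations

    def best_response_wiring(i, nodes, w, D, k):
        """Brute-force best response for node i. w[i][j] is the direct link
        cost from i to j; D[j][t] is the current overlay shortest-path cost
        from j to t (computed without i's own links). Node i picks the k
        neighbors minimizing the total cost of reaching every other node
        through its best chosen neighbor."""
        candidates = [j for j in nodes if j != i]
        targets = [t for t in nodes if t != i]
        best_cost, best_set = float("inf"), None
        for S in combinations(candidates, k):
            cost = sum(min(w[i][j] + D[j][t] for j in S) for t in targets)
            if cost < best_cost:
                best_cost, best_set = cost, set(S)
        return best_set, best_cost

A wiring is stable in the sense described above when no node can lower its own cost by recomputing this best response against the current overlay.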
Abstract:
Memories in Adaptive Resonance Theory (ART) networks are based on matched patterns that focus attention on those portions of bottom-up inputs that match active top-down expectations. While this learning strategy has proved successful for both brain models and applications, computational examples show that attention to early critical features may later distort memory representations during online fast learning. For supervised learning, biased ARTMAP (bARTMAP) solves the problem of over-emphasis on early critical features by directing attention away from previously attended features after the system makes a predictive error. Small-scale, hand-computed analog and binary examples illustrate key model dynamics. Two-dimensional simulation examples demonstrate the evolution of bARTMAP memories as they are learned online. Benchmark simulations show that featural biasing also improves performance on large-scale examples. One example, which predicts movie genres and is based, in part, on the Netflix Prize database, was developed for this project. Both first principles and consistent performance improvements on all simulation studies suggest that featural biasing should be incorporated by default in all ARTMAP systems. Benchmark datasets and bARTMAP code are available from the CNS Technology Lab Website: http://techlab.bu.edu/bART/.
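As a caricature of the biasing idea, the Python toy below suppresses features that drove a wrong prediction before the next fuzzy ART choice computation. It is not the published bARTMAP model; the update rule, scaling factor, and variable names are illustrative assumptions only.

    import numpy as np

    def fuzzy_choice(x, w, alpha=0.001):
        """Fuzzy ART choice value for input x against category weights w."""
        return np.minimum(x, w).sum() / (alpha + w.sum())

    def biased_input(x, bias):
        """Attenuate features that previous, erroneous matches over-attended."""
        return np.maximum(x - bias, 0.0)

    def update_bias(bias, x, w_wrong, lam=0.5):
        """After a predictive error, raise the bias on the features that the
        wrongly chosen category matched, steering attention elsewhere."""
        attended = np.minimum(x, w_wrong)
        return bias + lam * attended

    # Toy usage: after an error against category w_wrong, the re-presented
    # input shifts attention toward previously ignored features.
    x = np.array([0.9, 0.8, 0.1])
    w_wrong = np.array([0.9, 0.7, 0.0])
    bias = update_bias(np.zeros(3), x, w_wrong)
    print(fuzzy_choice(biased_input(x, bias), w_wrong))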
Abstract:
We numerically investigate a novel 40 Gbps OOK to AMI all-optical modulation format converter employing an SOA-based Mach-Zehnder interferometer. We demonstrate operation with a 2^7−1 PRBS and explain the phase modulation's relationship with patterning.
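For reference, a 2^7−1 PRBS can be generated with a 7-bit linear-feedback shift register. The sketch below assumes the conventional PRBS7 polynomial x^7 + x^6 + 1, since the abstract states only the sequence length.

    def prbs7(seed=0x7F):
        """Generate one full period (127 bits) of a PRBS7 sequence using the
        polynomial x^7 + x^6 + 1 (an assumed, conventional choice)."""
        state = seed & 0x7F
        assert state != 0, "an all-zero seed locks the LFSR"
        bits = []
        for _ in range(127):
            new_bit = ((state >> 6) ^ (state >> 5)) & 1  # taps at bits 7 and 6
            bits.append(new_bit)
            state = ((state << 1) | new_bit) & 0x7F
        return bits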
Abstract:
We describe a 42.6 Gbit/s all-optical pattern recognition system which uses semiconductor optical amplifiers (SOAs). A circuit with three SOA-based logic gates is used to identify the presence of specific port numbers in an optical packet header.
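In Boolean terms, header recognition of this kind reduces to bitwise comparison. The Python sketch below shows the logical function such a gate cascade computes, a match flag that is true only when every header bit equals the target port-number bit, without modeling the optics; the 5-bit field width in the example is an assumption.

    def header_matches(header_bits, target_bits):
        """Logical equivalent of optical pattern recognition on a packet
        header: XOR each header bit with the target bit (0 means 'equal'),
        then AND the complemented results into a single match flag."""
        assert len(header_bits) == len(target_bits)
        return all((h ^ t) == 0 for h, t in zip(header_bits, target_bits))

    # Example: check for port 21 (10101 in binary) in a 5-bit header field.
    print(header_matches([1, 0, 1, 0, 1], [1, 0, 1, 0, 1]))  # True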
Abstract:
Skeleton is a high-speed Winter Olympic sport performed on the same twisting, downhill ice tracks used for Bobsleigh & Luge. The single rider sprints and pushes their sled for 20-30 m on a level start section before loading and descending a twisting course of over 1 km, at speeds of up to 140 km/h, experiencing up to 5 g. In competition, the top athletes can be within a fraction of a second of each other. The initial short pushing period is believed to be critical to overall performance, but it is not well understood. A collaborative project between the University of Bath, UK Sport and Tyndall National Institute is instrumenting skeleton athletes, training equipment and test tracks with Tyndall's Wireless Inertial Measurement Unit technology in order to investigate and improve understanding of this phase of a skeleton run. It is hoped this will lead to improved training regimes and better performance for such elite, Olympic-level athletes. This work presents an initial look at the implemented system and the data recorded.
Abstract:
Obesity has been defined as a consequence of energy imbalance, where energy intake exceeds energy expenditure and results in a build-up of adipose tissue. However, this scientific definition masks the complicated social meanings associated with the condition. This research investigated the construction of meaning around obesity at various levels of inquiry to inform how obesity is portrayed and understood in Ireland. A multi-paradigmatic approach was adopted, drawing on theory and methods from psychology and sociology and an analytical framework combining the Common Sense Model and framing theory was employed. In order to examine the exo-level meanings of obesity, content analysis was performed on two media data sets (n=479, n=346) and a thematic analysis was also performed on the multiple newspaper sample (n=346). At the micro-level, obesity discourses were investigated via the thematic analysis of comments sampled from an online message board. Finally, an online survey assessed individual-level beliefs and understandings of obesity. The media analysis revealed that individual blame for obesity was pervasive and the behavioural frame was dominant. A significant increase in attention to obesity over time was observed, manifestations of weight stigma were common, and there was an emotive discourse of blame directed towards the parents of obese children. The micro-level analysis provided insight into the weight-based stigma in society and a clear set of negative ‘default’ judgements accompanied the obese label. The survey analysis confirmed that the behavioural frame was the dominant means of understanding obesity. One of the strengths of this thesis is the link created between framing and the Common Sense Model in the development of an analytical framework for application in the examination of health/illness representations. This approach helped to ascertain the extent of the pervasive biomedical and individual blame discourse on obesity, which establishes the basis for the stigmatisation of obese persons.
Abstract:
Nearly one billion smart mobile devices are now used for a growing number of tasks, such as browsing the web and accessing online services. In many communities, such devices are becoming the platform of choice for tasks traditionally carried out on a personal computer. However, despite the advances, these devices still lack resources compared to their traditional desktop counterparts. Mobile cloud computing is seen as a new paradigm that can address the resource shortcomings of these devices with the plentiful computing resources of the cloud. This can enable the mobile device to be used for a large range of new applications hosted in the cloud that are too resource-demanding to run locally. Bringing these two technologies together presents various difficulties. In this paper, we examine the advantages of the mobile cloud and the new approaches to applications it enables. We present our own solution for creating a positive user experience for such applications and describe how it enables them.
Abstract:
BACKGROUND: The superior colliculus (SC) has been shown to play a crucial role in the initiation and coordination of eye- and head-movements. The knowledge about the function of this structure is mainly based on single-unit recordings in animals with relatively few neuroimaging studies investigating eye-movement related brain activity in humans. METHODOLOGY/PRINCIPAL FINDINGS: The present study employed high-field (7 Tesla) functional magnetic resonance imaging (fMRI) to investigate SC responses during endogenously cued saccades in humans. In response to centrally presented instructional cues, subjects either performed saccades away from (centrifugal) or towards (centripetal) the center of straight gaze or maintained fixation at the center position. Compared to central fixation, the execution of saccades elicited hemodynamic activity within a network of cortical and subcortical areas that included the SC, lateral geniculate nucleus (LGN), occipital cortex, striatum, and the pulvinar. CONCLUSIONS/SIGNIFICANCE: Activity in the SC was enhanced contralateral to the direction of the saccade (i.e., greater activity in the right as compared to left SC during leftward saccades and vice versa) during both centrifugal and centripetal saccades, thereby demonstrating that the contralateral predominance for saccade execution that has been shown to exist in animals is also present in the human SC. In addition, centrifugal saccades elicited greater activity in the SC than did centripetal saccades, while also being accompanied by an enhanced deactivation within the prefrontal default-mode network. This pattern of brain activity might reflect the reduced processing effort required to move the eyes toward as compared to away from the center of straight gaze, a position that might serve as a spatial baseline in which the retinotopic and craniotopic reference frames are aligned.
Abstract:
Gaussian factor models have proven widely useful for parsimoniously characterizing dependence in multivariate data. There is a rich literature on their extension to mixed categorical and continuous variables, using latent Gaussian variables or generalized latent trait models accommodating measurements in the exponential family. However, when generalizing to non-Gaussian measured variables, the latent variables typically influence both the dependence structure and the form of the marginal distributions, complicating interpretation and introducing artifacts. To address this problem, we propose a novel class of Bayesian Gaussian copula factor models which decouple the latent factors from the marginal distributions. A semiparametric specification for the marginals based on the extended rank likelihood yields straightforward implementation and substantial computational gains. We provide new theoretical and empirical justifications for using this likelihood in Bayesian inference. We propose new default priors for the factor loadings and develop efficient parameter-expanded Gibbs sampling for posterior computation. The methods are evaluated through simulations and applied to a dataset in political science. The models in this paper are implemented in the R package bfa.
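To illustrate the decoupling the abstract describes, here is a small generative sketch in Python (not the bfa implementation): the latent Gaussian factor structure fixes the dependence, while arbitrary monotone transforms set the marginals. The loadings scale, marginal choices, and cut-points are placeholder assumptions.

    import numpy as np
    from scipy.stats import norm

    def sample_copula_factor(n=500, p=4, k=2, seed=0):
        """Generative sketch of a Gaussian copula factor model: the factors
        control dependence; marginals are chosen independently of them."""
        rng = np.random.default_rng(seed)
        Lambda = rng.normal(scale=0.8, size=(p, k))   # factor loadings (placeholder)
        eta = rng.normal(size=(n, k))                 # latent factors
        z = eta @ Lambda.T + rng.normal(size=(n, p))  # latent Gaussian variables
        sd = np.sqrt((Lambda ** 2).sum(axis=1) + 1.0)
        u = norm.cdf(z / sd)                          # uniform marginals, copula intact
        y_cont = -np.log1p(-u[:, 0])                  # e.g. exponential marginal
        y_ord = np.digitize(u[:, 1], [0.3, 0.7])      # e.g. 3-level ordinal marginal
        return u, y_cont, y_ord

Because the transforms applied to u are monotone, changing them alters only the marginal distributions while leaving the copula, and hence the factor-driven dependence, untouched.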