12 results for SUBJECTIVE TIME
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Recent literature has proved that many classical pricing models (Black and Scholes, Heston, etc.) and risk measures (VaR, CVaR, etc.) may lead to “pathological meaningless situations”, since traders can build sequences of portfolios whose risk level tends to −∞ and whose expected return tends to +∞, i.e., (risk = −∞, return = +∞). Such a sequence of strategies may be called a “good deal”. This paper focuses on the risk measures VaR and CVaR and analyzes this caveat in a discrete-time complete pricing model. Under quite general conditions the explicit expression of a good deal is given, and its sensitivity with respect to possible measurement errors is provided as well. We point out that a critical property is the absence of short sales. In such a case we first construct a “shadow riskless asset” (SRA) without short sales, and then the good deal is given by borrowing more and more money so as to invest in the SRA. It is also shown that the SRA is interesting in itself, even if there are short selling restrictions.
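For reference, under one common sign convention (an assumption here, since the abstract does not fix notation), the two risk measures and the good-deal property can be written as
\[
\mathrm{VaR}_\alpha(X) = \inf\{\ell \in \mathbb{R} : \mathbb{P}(-X > \ell) \le \alpha\},
\qquad
\mathrm{CVaR}_\alpha(X) = \mathbb{E}\!\left[-X \,\middle|\, -X \ge \mathrm{VaR}_\alpha(X)\right],
\]
so that a “good deal” is a sequence of attainable payoffs \((X_n)\) with
\[
\rho(X_n) \to -\infty \quad\text{and}\quad \mathbb{E}[X_n] \to +\infty,
\qquad \rho \in \{\mathrm{VaR}_\alpha, \mathrm{CVaR}_\alpha\}.
\]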
Abstract:
The population growth of a Staphylococcus aureus culture, an active colloidal system of spherical cells, was followed by rheological measurements under steady-state and oscillatory shear flows. We observed a rich viscoelastic behavior as a consequence of the bacterial activity, namely of their multiplication and density-dependent aggregation properties. In the early stages of growth (lag and exponential phases), the viscosity increases by about a factor of 20, presenting several drops and full recoveries, which leads us to evoke the existence of a percolation phenomenon. Remarkably, as the bacteria reach their late phase of development, in which the population stabilizes, the viscosity returns close to its initial value. Most probably, this is caused by a change in the bacteria's physiological activity and, in particular, by the decrease of their adhesion properties. The viscous and elastic moduli exhibit power-law behaviors compatible with the "soft glassy materials" model, with exponents that depend on the bacteria's growth stage. DOI: 10.1103/PhysRevE.87.030701.
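For context (the formula below is a standard statement of the soft glassy rheology model, not given in the abstract): in that model, both dynamic moduli follow power laws in the oscillation frequency,
\[
G'(\omega) \sim G''(\omega) \sim \omega^{\,x-1}, \qquad 1 < x < 2,
\]
where \(x\) is the model's effective noise temperature; growth-stage-dependent exponents would then correspond to different values of \(x\).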
Abstract:
This work proposes a strategy for building and launching a new business model for the practice of Public Relations in Portugal, aimed at micro and small enterprises. Between the in-house service and classical consultancy there is a gap not covered in Portugal: a shared in-house service. This paper presents a project for a Public Relations service for those companies that cannot afford to take a Communication Officer onto their own staff.
Abstract:
Conference - 16th International Symposium on Wireless Personal Multimedia Communications (WPMC), Jun 24-27, 2013
Abstract:
In contrast with the important legacy of the Portuguese organ masters of the 16th and 17th centuries, Portuguese organ music after 1700 seems almost nonexistent (excluding rare examples, such as the four organ sonatas by Carlos Seixas). Whether due to the destruction caused by the great Lisbon earthquake of 1755 or to other causes, the absence of sources is surprising, considering the testimonies of musical activity during that period. This article deals with a source that has remained relatively ignored to this day: manuscript CLI/1-4 nº 7 of the Biblioteca do Palácio Ducal de Vila Viçosa (Versos / Sobre o Canto Chão / Para Orgão / De Fr. Jeronimo da M.dre de DS.). This collection of twenty organ verses by Jerónimo da Madre de Deus is by far the largest Portuguese organ work from the first half of the 18th century known to date. Clearly conceived for the organ, these short pieces bear witness to the transformation of keyboard writing in Portugal during the reign of D. João V (namely through the absorption of Italian influences) and provide precious information about the type of instrument on which they were played.
Abstract:
This paper proposes an analysis of "Inanimate Alice", by Kate Pullinger, Chris Joseph and Ian Harper, as an example of a transmedia narrative that triggers a new reading experience while proposing a literary alterity between reading and performance. Narrative experiences that elect visual plasticity, interchanging games and tactility as drivers of the creative process are not new. Yet narrative experiences created in the gap between reality and fiction have found in the digital realm the perfect environment for multiple hybrid experiences. Bearing in mind Walter Benjamin's concepts of Erlebnis and Erfahrung, a critical analysis of this digital fiction tries to illustrate how literary art finds its space and time in a metamorphosed continuum activated only by the "patient reader". The multimedia hybrids that this digital literary work may contain challenge readers to interpret different signals and poetic structures that most readers might not be accustomed to; however, even amid cognitive dissonance, meaning is found and reading happens only if time, space and attention are available. All possible transmedia literacies can only respond to this experience of online reading if they are able to focus and draw attention not to a simple new behaviour or a single new practice, but to a demanding state of affairs that assembles different objective and subjective value forms.
Abstract:
Scientific dissertation submitted for the degree of Master in Civil Engineering
Abstract:
The discovery of X-rays was undoubtedly one of the greatest stimuli for improving the efficiency of healthcare services. The ability to view, non-invasively, the inside of the human body has greatly facilitated the work of professionals in the diagnosis of diseases. An exclusive focus on image quality (IQ), without understanding how images are obtained, negatively affects efficiency in diagnostic radiology. The equilibrium between benefits and risks is often forgotten. It is necessary to adopt optimization strategies that maximize the benefit (image quality) and minimize the risk (dose to the patient) in radiological facilities. In radiology, the implementation of optimization strategies requires an understanding of the image acquisition process. When a radiographer adopts a certain value of a parameter (tube potential [kVp], tube current-exposure time product [mAs] or additional filtration), it is essential to know its meaning and the impact of its variation on dose and image quality. Without this, any optimization strategy will fail. Worldwide, data show that the use of X-rays has become increasingly frequent. In Cabo Verde, we note an effort by healthcare institutions (e.g. the Ministry of Health) to equip radiological facilities, and the recent installation of a telemedicine system requires the purchase of new radiological equipment. In addition, the transition from screen-film to digital systems is characterized by a rise in patient exposure. Given that this transition is slower in less developed countries, as is the case of Cabo Verde, the need to adopt optimization strategies becomes increasingly pressing. This study was conducted as an attempt to answer that need. Although this work concerns the objective evaluation of image quality, whereas in medical practice the evaluation is usually subjective (visual evaluation of images by the radiographer/radiologist), studies have reported a correlation between these two types of evaluation (objective and subjective) [5-7], which supports conducting such studies. The purpose of this study is to evaluate the effect of the exposure parameters (kVp and mAs), when using additional copper (Cu) filtration, on dose and image quality in a Computed Radiography system.
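Optimization studies of this kind commonly rank (kVp, mAs, filtration) combinations with a figure of merit that weighs objective image quality against dose, e.g. FOM = CNR²/dose. The sketch below is a minimal illustration under that assumption; the function names and data are hypothetical, not taken from this study.

```python
import numpy as np

def cnr(roi_signal, roi_background):
    """Contrast-to-noise ratio from two image regions (pixel arrays)."""
    contrast = roi_signal.mean() - roi_background.mean()
    noise = roi_background.std()
    return contrast / noise

def figure_of_merit(roi_signal, roi_background, dose_uGy):
    """FOM = CNR^2 / dose: higher means more image quality per unit dose."""
    return cnr(roi_signal, roi_background) ** 2 / dose_uGy

# Hypothetical example: compare two exposure settings on phantom images.
rng = np.random.default_rng(0)
img_a = {"sig": rng.normal(120, 5, 1000), "bkg": rng.normal(100, 5, 1000)}
img_b = {"sig": rng.normal(118, 6, 1000), "bkg": rng.normal(100, 6, 1000)}

fom_a = figure_of_merit(img_a["sig"], img_a["bkg"], dose_uGy=8.0)   # e.g. with added Cu
fom_b = figure_of_merit(img_b["sig"], img_b["bkg"], dose_uGy=12.0)  # e.g. without Cu
print(f"FOM with Cu filtration:    {fom_a:.2f}")
print(f"FOM without Cu filtration: {fom_b:.2f}")
```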
Abstract:
Behavioral biometrics is one of the areas of growing interest within the biosignal research community. A recent trend in the field is ECG-based biometrics, where electrocardiographic (ECG) signals are used as input to the biometric system. Previous work has shown this to be a promising trait, with the potential to serve as a good complement to other existing, more established modalities, owing to its intrinsic characteristics. In this paper, we propose a system for ECG biometrics centered on signals acquired at the subject's hand. Our work builds on a previously developed custom, non-intrusive sensing apparatus for data acquisition at the hands, and involved the pre-processing of the ECG signals and the evaluation of two classification approaches targeted at real-time or near real-time applications. Preliminary results show that this system achieves competitive results both for authentication and identification, and further validate the potential of ECG signals as a complementary modality in the toolbox of the biometric system designer.
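A minimal sketch of a generic pipeline of this kind (band-pass filtering, R-peak segmentation, nearest-template matching), assuming scipy and a 1 kHz sampling rate; this is an illustration, not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 1000  # sampling rate in Hz (assumed)

def preprocess(ecg):
    """Band-pass filter to suppress baseline wander and high-frequency noise."""
    b, a = butter(4, [1.0, 40.0], btype="band", fs=FS)
    return filtfilt(b, a, ecg)

def heartbeats(ecg, half_window=300):
    """Detect R-peaks and cut a fixed-length segment around each one."""
    peaks, _ = find_peaks(ecg, distance=int(0.4 * FS),
                          height=ecg.mean() + 2 * ecg.std())
    segs = [ecg[p - half_window:p + half_window]
            for p in peaks if half_window <= p < len(ecg) - half_window]
    return np.array(segs)

def identify(unknown_beats, templates):
    """Nearest-template (1-NN, Euclidean) identification over enrolled subjects."""
    mean_beat = unknown_beats.mean(axis=0)
    dists = {subj: np.linalg.norm(mean_beat - tmpl)
             for subj, tmpl in templates.items()}
    return min(dists, key=dists.get)

# Enrollment would store each subject's mean heartbeat as a template:
# templates[subj_id] = heartbeats(preprocess(signal)).mean(axis=0)
```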
Abstract:
Brain dopamine transporter imaging by Single Photon Emission Computed Tomography (SPECT) with 123I-FP-CIT (DaTScanTM) has become an important tool in the diagnosis and evaluation of Parkinson syndromes. This diagnostic method allows the visualization of a portion of the striatum, where the healthy pattern resembles two symmetric commas, enabling the evaluation of the presynaptic dopamine system, in which dopamine transporters are responsible for the release of dopamine into the synaptic cleft and for its reuptake into the nigrostriatal nerve terminals, where it is stored or degraded. In the daily assessment of DaTScanTM studies, it is common to rely only on visual assessment for diagnosis. However, this process is complex and subjective, as it depends on the observer's experience, and it is associated with high intra- and inter-observer variability. Studies have shown that semiquantification can improve the diagnosis of Parkinson syndromes. Semiquantification requires image segmentation methods based on regions of interest (ROIs): ROIs are drawn over specific (striatum) and nonspecific (background) uptake areas, and specific binding ratios are then calculated. The low adherence to semiquantification in the diagnosis of Parkinson syndromes is related not only to the time it requires, but also to the need for a database of reference values adapted to the population concerned, as well as to each department's examination protocol. Studies have concluded that this process increases the reproducibility of semiquantification. The aim of this investigation was to create and validate a database of healthy controls for dopamine transporters with DaTScanTM, named DBRV. The created database has been adapted to the Nuclear Medicine Department's protocol and to the population of the Infanta Cristina Hospital in Badajoz, Spain.
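The specific binding ratios mentioned above are conventionally computed from the mean counts per voxel in the striatal and background ROIs (the standard convention; the department's exact protocol may differ):
\[
\mathrm{SBR} = \frac{\bar{C}_{\mathrm{striatum}} - \bar{C}_{\mathrm{background}}}{\bar{C}_{\mathrm{background}}}.
\]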
Abstract:
Starting with the explanation of metanarrative as a sort of self-reflexive storytelling (as defended by Kenneth Weaver Hope in his unpublished PhD thesis), I propose to talk about enunciative practices that stress the telling more than the told. In line with some metafictional practices applied to cinema, such as the 'mindfuck' film (Jonathan Eig, 2003), the 'psychological puzzle film' (Elliot Panek, 2003) and the 'mind-game film' (Thomas Elsaesser, 2009), I will address the manipulations that a narrative film endures in order to produce a more fruitful and complex experience for the viewer. I will concentrate in particular on the misrepresentation of time as a way to produce a labyrinthine work of fiction in which the linear description of events is replaced by a game of time disclosure. The viewer is thus called upon to reconstruct the order of the various situations portrayed, in a process that I call 'temporal mapping'. However, as the viewer attempts to do this, the film, because of the intricate nature of the plot and the uncertain status of the characters, ironically resists the attempt. A sort of teasing takes place between the film and its spectator: an invitation to decode that is half-denied until the end, where the puzzle is finally solved. I will use three of Alejandro Iñárritu's films to better convey my point: Amores perros (2000), 21 Grams (2003) and Babel (2006). I will consider Iñárritu's methods of producing non-linear storytelling as a way to stress the importance of time and its validity as one of the elements that make up a metanarrative experience in film. I will focus especially on 21 Grams, which I consider to be a paragon of the labyrinth.
Abstract:
Hyperspectral instruments have been incorporated in satellite missions, providing large amounts of high-spectral-resolution data of the Earth's surface. These data can be used in remote sensing applications that often require a real-time or near-real-time response. To avoid delays between hyperspectral image acquisition and its interpretation, the latter usually performed at a ground station, onboard systems have emerged to process the data, reducing the volume of information to transfer from the satellite to the ground station. For this purpose, compact reconfigurable hardware modules, such as field-programmable gate arrays (FPGAs), are widely used. This paper proposes an FPGA-based architecture for hyperspectral unmixing. The method is based on vertex component analysis (VCA) and works without a dimensionality-reduction preprocessing step. The architecture has been designed for a low-cost Xilinx Zynq board with a Zynq-7020 system-on-chip, whose FPGA programmable logic is based on the Artix-7, and tested using real hyperspectral data. Experimental results indicate that the proposed implementation can achieve real-time processing while maintaining the method's accuracy, which indicates the potential of the proposed platform for implementing high-performance, low-cost embedded systems, opening perspectives for onboard hyperspectral image processing.
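For context, the core of the VCA iteration can be sketched as follows. This is a simplified NumPy illustration of the general algorithm, not the paper's FPGA architecture; it also omits VCA's SNR-dependent preprocessing.

```python
import numpy as np

def vca_endmembers(Y, p, seed=0):
    """Simplified VCA loop. Y is (bands, pixels); p is the number of endmembers.
    Each iteration projects a random direction onto the orthogonal complement
    of the endmembers found so far and picks the pixel with the largest
    absolute projection as the next endmember."""
    rng = np.random.default_rng(seed)
    bands, _ = Y.shape
    E = np.zeros((bands, p))
    for i in range(p):
        w = rng.standard_normal(bands)
        if i > 0:
            A = E[:, :i]
            w = w - A @ (np.linalg.pinv(A) @ w)  # remove span of found endmembers
        f = w / np.linalg.norm(w)
        proj = f @ Y                             # projection of every pixel
        E[:, i] = Y[:, np.argmax(np.abs(proj))]
    return E

# Abundances per pixel can then be estimated, e.g. by least squares:
# A_hat, *_ = np.linalg.lstsq(E, Y, rcond=None)
```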