978 results for reference models


Relevance: 60.00%

Abstract:

The Brazilian market has recently regulated life insurance and pension products that guarantee a minimum return, adjusted by a price index and increased by participation in the return of a specific investment fund. This thesis contributes to the growing demand for the economic and financial valuation of these products, a demand motivated not only by international regulatory requirements but also by the need for a realistic perception of the financial risks involved. We therefore propose a pricing model whose purpose is to allow the calibration, under a defined equilibrium condition, of a contract's guarantee parameters: the minimum rate and the participation rate. These are determined as a function not only of the contract maturity but also of expectations about market variables such as interest rates, asset volatility and the price index. The proposed model behaved similarly to a reference set of models, which suggests that, within the stated limitations, it is suitable for pricing the products in question.

Relevance: 60.00%

Abstract:

Recently regulated Brazilian life and pension products offer a benefit structure composed of a minimum guaranteed annual rate, inflation adjustment according to a price index, and participation in the performance of an investment fund. We present a valuation model for these products. We establish a fair condition relating minimum guarantees and participation rates, and explore its behavior across maturities, interest rates, fund and price index volatilities, and their correlation. Besides consistency with reference models, we found that the effect of the fund volatility is conditioned on the price index volatility level and on the correlation between them.
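To illustrate the fair condition described above, the sketch below calibrates a participation rate by Monte Carlo simulation so that the expected discounted contract value equals the premium. The annual crediting rule, the lognormal fund dynamics, the flat risk-free rate and the omission of the price-index adjustment are simplifying assumptions made for illustration; they are not the paper's model.

```python
import numpy as np

def fair_participation(premium=1.0, g=0.02, T=10, r=0.06,
                       sigma_fund=0.15, n_paths=100_000, seed=0):
    """Find the participation rate alpha that makes the contract fair,
    i.e. expected discounted payoff == premium (illustrative assumptions:
    yearly crediting of max(g, alpha * fund return), lognormal fund,
    flat rate r, no price-index term)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_paths, T))
    # risk-neutral annual fund returns
    fund_ret = np.exp((r - 0.5 * sigma_fund**2) + sigma_fund * z) - 1.0

    def contract_value(alpha):
        credited = np.maximum(g, alpha * fund_ret)      # yearly crediting rule
        account = premium * np.prod(1.0 + credited, axis=1)
        return np.exp(-r * T) * account.mean()          # discounted expectation

    lo, hi = 0.0, 1.0                                   # bisection on alpha
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if contract_value(mid) < premium else (lo, mid)
    return 0.5 * (lo + hi)

print(fair_participation())   # fair alpha under these illustrative assumptions
```

Under these assumptions the fair participation rate decreases as the minimum guarantee g increases, which is the kind of trade-off between guarantees and participation that the paper explores.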

Relevance: 60.00%

Abstract:

This study compared the morphology and physical properties of the enamel structure of bovine, bubaline and human teeth. The tissue was analyzed by scanning electron microscopy, mineral composition, microhardness, and enamel surface roughness in 41 bubaline incisors (Bos taurus indicus), 41 bovine incisors (Pelorovis antiques) and 30 permanent human incisors. The results showed that the enamel ultrastructure of the studied species is markedly similar to that found in human samples. In bovine and bubaline enamel, the chemical elements with the highest concentrations were O, Ca and P, precisely those that form the hydroxyapatite crystals, Ca10(PO4)6(OH)2. There was no statistically significant difference in Knoop microhardness among the three species. However, the surface roughness of bubaline enamel (2.16 µm ±0.23) was significantly higher than that of human (0.36 µm ±0.05) and bovine (0.41 µm ±0.07) teeth. It is concluded that, based on these analyses and tests, bovine and bubaline enamel presents a morphology, ultrastructural architecture, microhardness and mineral composition similar to those of human dental tissue, making these species suitable reference models for research.

Relevance: 60.00%

Abstract:

The efficient generation of digital surface models (DSM) from optical images has been explored for many years, and the results depend on the project characteristics (image resolution, overlap between images, among others), on the image matching techniques, and on the computing capabilities available for image processing. The points generated by image matching have a direct impact on the quality of the DSM and, consequently, on the need for the costly editing step. This work aims at experimentally assessing a technique for DSM generation by matching multiple images (two or more) simultaneously using the vertical line locus (VLL) method. The experiments were performed with six images of the urban area of Presidente Prudente/SP, with a ground sample distance (GSD) of approximately 7 cm. DSMs of a small area containing homogeneous texture, repetitive patterns, moving objects, shadows and trees were generated to assess the quality of the developed procedure. The resulting DSM was compared with a point cloud acquired by LASER (Light Amplification by Stimulated Emission of Radiation) scanning as well as with a DSM generated by the Leica Photogrammetric Suite (LPS) software. The results showed that the DSM generated by the implemented technique has a geometric quality compatible with the reference models.
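A minimal sketch of the vertical line locus idea follows: for a single ground cell, candidate heights along the vertical line are projected into every image, and the height with the best multi-image similarity is kept. The `project` callback (standing in for the collinearity equations with each image's orientation parameters), the patch size and the NCC-based score are assumptions for illustration, not the implemented system.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized image patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else -1.0

def vll_height(x, y, z_candidates, images, project, half_win=5):
    """Vertical line locus at one ground cell (x, y): try each candidate
    height, project the 3D point into every image via the user-supplied
    `project(img_index, x, y, z) -> (row, col)` callback (assumed to apply
    the collinearity equations), extract patches and keep the height with
    the best average similarity. All names here are illustrative."""
    best_z, best_score = None, -np.inf
    for z in z_candidates:
        patches = []
        for i, img in enumerate(images):
            r, c = project(i, x, y, z)
            r, c = int(round(r)), int(round(c))
            if (half_win <= r < img.shape[0] - half_win and
                    half_win <= c < img.shape[1] - half_win):
                patches.append(img[r - half_win:r + half_win + 1,
                                   c - half_win:c + half_win + 1])
        if len(patches) < 2:
            continue                      # point not visible in enough images
        # average similarity of every other patch against the first one
        score = np.mean([ncc(patches[0], p) for p in patches[1:]])
        if score > best_score:
            best_z, best_score = z, score
    return best_z, best_score
```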

Relevance: 60.00%

Abstract:

Graduate Program in Information Science - FFC

Relevance: 60.00%

Abstract:

Effects of roads on wildlife and its habitat have been measured using metrics such as the nearest road distance, road density, and effective mesh size. In this work we introduce two new indices: (1) the Integral Road Effect (IRE), which measures the summed effect of the points of a road on a fixed point in the forest; and (2) the Average Value of the Infinitesimal Road Effect (AVIRE), which measures the average of the effects of roads at this point. IRE is formally defined as the line integral of a special function (the infinitesimal road effect) along the curves that model the roads, whereas AVIRE is the quotient of IRE by the length of the roads. Combining tools of the ArcGIS software with a numerical algorithm, we calculated these and other road and habitat cover indices at a sample of points in a human-modified landscape in the Brazilian Atlantic Forest, where data on the abundance of two groups of small mammals (forest specialists and habitat generalists) were collected in the field. We then compared, through the Akaike Information Criterion (AIC), a set of candidate regression models to explain the variation in small mammal abundance, including models with our two new road indices (AVIRE and IRE), models with other road effect indices (nearest road distance, mesh size, and road density), and reference models (containing only habitat indices, or only the intercept without the effect of any variable). Compared to the other road effect indices, AVIRE showed the best performance in explaining the abundance of forest specialist species, whereas the nearest road distance performed best for generalist species. AVIRE and habitat together were included in the best model for both small mammal groups; that is, higher abundance of specialist and generalist small mammals occurred where the average road effect was lower (less AVIRE) and there was more habitat. Moreover, unlike the other road effect indices except mesh size, AVIRE was not significantly correlated with the habitat cover of specialists and generalists, which allows the effect of roads to be separated from the effect of habitat on small mammal communities. We suggest that the proposed indices and GIS procedures could also be useful to describe other spatial ecological phenomena, such as edge effects in habitat fragments. (C) 2012 Elsevier B.V. All rights reserved.
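With notation assumed only for illustration (p the evaluation point in the forest, R the union of curves modelling the roads, f the infinitesimal road effect, and L(R) the total road length), the two indices introduced above can be written as

```latex
\mathrm{IRE}(p) = \int_{R} f(p, s)\, \mathrm{d}s,
\qquad
\mathrm{AVIRE}(p) = \frac{\mathrm{IRE}(p)}{L(R)}
                  = \frac{1}{L(R)} \int_{R} f(p, s)\, \mathrm{d}s .
```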

Relevance: 60.00%

Abstract:

We investigate the strong magnetic and gravity anomalies of the Goias Alkaline Province (GAP), a region of Late Cretaceous alkaline magmatism along the northern border of the Parana Basin, Brazil. The alkaline complexes (eight of which are present in outcrops, two others inferred from magnetic signals) are characterized by a series of small intrusions forming almost circular magnetic and gravimetric anomalies varying from -4000 to +6000 nT and from -10 to +40 mGal, respectively. We used the Aneuler method and the analytical signal amplitude to obtain depth and geometry for the mapped sources from the magnetic anomaly data. These results were used as the reference models in the 3D gravity inversion. The 3D inversion results show that the alkaline intrusions have depths of 10-12 km. The intrusions in the northern GAP follow two alignments and have different sizes. In the magnetic anomaly map, the dominant lineaments correlate strongly with the extensional regimes associated with the rise of the alkaline magmatism. The emplacement of these intrusions marks mechanical discontinuities and zones of weakness in the upper crust. According to the 3D inversion results, those intrusions are located within the upper crust (from the surface to 18 km depth) and have spheres as the preferred geometry. Such spherical shapes are more consistent with magmatic chambers than with plug intrusions. The Registro do Araguaia anomaly (approximately 15 by 25 km) has a particular magnetic signature indicating that its top is deeper than 1500 m. North of this circular anomaly are lineaments with structural indices indicating contacts on their edges and dikes/sills in their interiors. The results of the 3D inversion of the magnetic and gravity data suggest that Registro do Araguaia is the largest body in the area, reaching 18 km depth and indicating a circular layered structure. (C) 2011 Elsevier Ltd. All rights reserved.
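Because the inversion favours spherical sources, a simple forward model consistent with that geometry is the vertical gravity anomaly of a buried sphere; the sketch below evaluates it along a profile. The depth, radius and density contrast used here are illustrative values, not results from the study.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_gz(x, depth, radius, delta_rho):
    """Vertical gravity anomaly (in mGal) of a buried sphere along a profile
    at horizontal distances x (m) from the point above its centre; classic
    forward model for a spherical source, with illustrative parameters."""
    mass = (4.0 / 3.0) * np.pi * radius**3 * delta_rho   # anomalous mass, kg
    gz = G * mass * depth / (x**2 + depth**2) ** 1.5     # m/s^2
    return gz * 1e5                                      # 1 m/s^2 = 1e5 mGal

x = np.linspace(-20e3, 20e3, 201)                        # profile distance, m
print(sphere_gz(x, depth=10e3, radius=3e3, delta_rho=200.0).max())
```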

Relevance: 60.00%

Abstract:

Several practical obstacles in data handling and evaluation complicate the use of quantitative localized magnetic resonance spectroscopy (qMRS) in routine clinical MR examinations. To overcome these obstacles, a clinically feasible MR pulse sequence protocol based on standard available MR pulse sequences for qMRS has been implemented, along with functionalities newly added to the free software package jMRUI-v5.0, to make qMRS attractive for clinical routine. This enables (a) easy and fast DICOM data transfer between the MR console and the qMRS computer, (b) visualization of combined MR spectroscopy and imaging, (c) creation and network transfer of spectroscopy reports in DICOM format, (d) integration of advanced water reference models for absolute quantification, and (e) setup of databases containing normal metabolite concentrations of healthy subjects. To demonstrate the qMRS workflow using these implementations, databases of normal metabolite concentrations in different regions of brain tissue were created using spectroscopic data acquired from 55 normal subjects (age range 6-61 years) on 1.5 T and 3 T MR systems, and the workflow is illustrated in one clinical case of a typical brain tumor (primitive neuroectodermal tumor). The MR pulse sequence protocol and the newly implemented software functionalities facilitate the incorporation of qMRS and the reference to normal metabolite concentration values in daily clinical routine. Magn Reson Med, 2013. © 2012 Wiley Periodicals, Inc.
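For context, a commonly used internal-water-reference expression for absolute quantification has the generic form below, where S denotes fitted signal areas, n the number of contributing protons, c_w the tissue water concentration and f_corr relaxation and partial-volume corrections; the advanced water reference models added to jMRUI-v5.0 may include further corrections, so this is only the general shape of the calculation.

```latex
% S: fitted signal areas, n: contributing protons per molecule,
% c_w: tissue water concentration, f_corr: relaxation/partial-volume corrections
c_{\mathrm{met}} = \frac{S_{\mathrm{met}}}{S_{\mathrm{w}}}
                   \cdot \frac{n_{\mathrm{w}}}{n_{\mathrm{met}}}
                   \cdot c_{\mathrm{w}} \cdot f_{\mathrm{corr}}
```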

Relevance: 60.00%

Abstract:

Currently, observations of space debris are primarily performed with ground-based sensors. These sensors have a detection limit of a few centimetres diameter for objects in Low Earth Orbit (LEO) and of about two decimetres diameter for objects in Geostationary Orbit (GEO). The few space-based debris observations stem mainly from in-situ measurements and from the analysis of returned spacecraft surfaces, both of which provide information about mostly sub-millimetre-sized debris particles. As a consequence, the population of centimetre- and millimetre-sized debris objects remains poorly understood. The development, validation and improvement of debris reference models drive the need for measurements covering the whole diameter range. In 2003 the European Space Agency (ESA) initiated a study entitled “Space-Based Optical Observation of Space Debris”. The first tasks of the study were to define user requirements and to develop an observation strategy for a space-based instrument capable of observing uncatalogued millimetre-sized debris objects. Only passive optical observations were considered, focussing on mission concepts for the LEO and GEO regions, respectively. Starting from the requirements and the observation strategy, an instrument system architecture and an associated operations concept were elaborated. The instrument system architecture covers the telescope, camera and onboard processing electronics. The proposed telescope is a folded Schmidt design, characterised by a 20 cm aperture and a large field of view of 6°. The camera design is based on the use of either a frame-transfer charge coupled device (CCD) or a cooled hybrid sensor with fast read-out; a four-megapixel sensor is foreseen. For the onboard processing, a scalable architecture has been selected. Performance simulations have been executed for the system as designed, focussing on the orbit determination of observed debris particles and on the analysis of the object detection algorithms. In this paper we present some of the main results of the study. A short overview of the user requirements and observation strategy is given, the architectural design of the instrument is discussed, and the main trade-offs are outlined. An insight into the results of the performance simulations is provided.

Relevance: 60.00%

Abstract:

The electroencephalogram (EEG) signal is one of the most widely used signals in biomedicine due to the rich information it carries about human tasks. This study describes a new approach based on (i) building reference models from a set of time series through the analysis of the events they contain, which is suitable for domains where the relevant information is concentrated in specific regions of the time series, known as events, with each event characterized by a set of attributes; and (ii) applying the discrete wavelet transform to the EEG data in order to extract temporal information in the form of changes in the frequency domain over time, that is, to extract non-stationary signals embedded in the noisy background of the human brain. The performance of the model was evaluated in terms of training performance and classification accuracy, and the results confirmed that the proposed scheme has potential for classifying EEG signals.
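A minimal sketch of the wavelet step, assuming the PyWavelets package and a generic per-sub-band statistical feature set (the abstract does not list the exact attributes used to characterize events):

```python
import numpy as np
import pywt

def dwt_features(eeg_segment, wavelet="db4", level=4):
    """Decompose one EEG segment with a discrete wavelet transform and
    summarise each sub-band by simple statistics (mean, spread, energy);
    an illustrative feature set, not the study's exact attributes."""
    coeffs = pywt.wavedec(eeg_segment, wavelet, level=level)
    features = []
    for band in coeffs:                      # [cA_level, cD_level, ..., cD_1]
        features += [np.mean(band), np.std(band),
                     np.sum(band ** 2) / len(band)]
    return np.array(features)

# usage with a synthetic 1-second segment sampled at 256 Hz
segment = np.random.randn(256)
print(dwt_features(segment).shape)
```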

Relevance: 60.00%

Abstract:

The Semantics Difficulty Model (SDM) is a model that measures the difficulty of introducing semantics technology into a company. SDM manages three descriptions of stages, which we refer to as "snapshots": a company semantic snapshot, a data snapshot and a semantic application snapshot. Understanding a priori the complexity of introducing semantics into a company is important because it allows the organization to take early decisions, thus saving time and money, mitigating risks and improving innovation, time to market and productivity. SDM works by measuring the Euclidean distance between each initial snapshot and its reference model (the company semantic snapshot reference model, the data snapshot reference model, and the semantic application snapshot reference model). The difficulty level is "not at all difficult" when the distance is small and becomes "extremely difficult" when the distance is large. SDM has been tested experimentally with 2000 simulated companies with varied arrangements and several initial stages. The output is measured by five linguistic values: "not at all difficult", "slightly difficult", "averagely difficult", "very difficult" and "extremely difficult". As the preliminary results of our SDM simulation model indicate, transforming a search application into one that integrates data from different sources with semantics is "slightly difficult", in contrast with data and opinion extraction applications, for which it is "very difficult".
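A small sketch of the distance computation described above, assuming the three snapshots are already encoded as numeric vectors; the encodings and the cut points that map a distance to the five linguistic values are invented for illustration and are not taken from SDM itself.

```python
import numpy as np

def sdm_difficulty(company, data, application,
                   ref_company, ref_data, ref_application):
    """Sum the Euclidean distances between each initial snapshot and its
    reference model, then map the total to one of SDM's five linguistic
    difficulty values. Thresholds below are illustrative assumptions."""
    d = (np.linalg.norm(np.subtract(company, ref_company)) +
         np.linalg.norm(np.subtract(data, ref_data)) +
         np.linalg.norm(np.subtract(application, ref_application)))
    labels = ["not at all difficult", "slightly difficult",
              "averagely difficult", "very difficult", "extremely difficult"]
    thresholds = [0.5, 1.0, 2.0, 4.0]        # assumed cut points, not SDM's
    return labels[int(np.searchsorted(thresholds, d))], d

# usage with toy snapshot encodings
print(sdm_difficulty([1, 0, 1], [0, 1], [1, 1],
                     [1, 1, 1], [1, 1], [1, 0]))
```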

Relevance: 60.00%

Abstract:

This final-year project deals with the topic of Knowledge Discovery in numerical time series, addressing their analysis from the viewpoint of the semantics of the series. Most of the research conducted to date in the field of time series analysis recommends analysing the values of the series numerically. This provides good results but prevents the conclusions from being formulated in a way that allows justification and interpretation of the results. Thus, the purpose of this project is to create an application that allows the analysis of time series from a qualitative point of view rather than a quantitative one. This way, all the relevant elements of the time series will be gathered for future studies. The design of a mechanism to extract the information of interest from the time series is the first step towards achieving the proposed objective. To do this, all the key behaviours in the domain are formalized; these become the symbols shown in the application's output. The designed and implemented method thus transforms a numerical time series into a symbolic sequence that captures the semantics of the original series and is more intuitive and easier to interpret. Once a mechanism for transforming the numerical series into symbolic sequences is available, analysis tasks can be posed on those symbolic sequences. Although this project does not cover such post-analysis, it proposes different directions for future work, for instance measuring the similarity between two symbolic sequences as a starting point for comparison, or creating reference models for further analysis of time series.
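A minimal sketch of the kind of numeric-to-symbolic transformation described above, assuming a simple trend-based alphabet (R = rise, F = fall, S = stable); the project's own symbol set and event definitions are not given in the abstract.

```python
import numpy as np

def to_symbols(series, window=5, slope_threshold=0.1):
    """Turn a numeric time series into a symbolic sequence by labelling the
    trend of consecutive non-overlapping windows; an illustrative alphabet,
    not the project's formalized domain behaviours."""
    symbols = []
    for start in range(0, len(series) - window, window):
        chunk = series[start:start + window]
        slope = np.polyfit(np.arange(window), chunk, 1)[0]  # fitted trend
        if slope > slope_threshold:
            symbols.append("R")
        elif slope < -slope_threshold:
            symbols.append("F")
        else:
            symbols.append("S")
    return "".join(symbols)

# usage on a synthetic oscillating series
print(to_symbols(np.sin(np.linspace(0, 6 * np.pi, 120))))
```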

Relevance: 60.00%

Abstract:

The solutions to cope with the new challenges that societies face nowadays involve providing smarter everyday systems. To achieve this, technology has to evolve and leverage automatic interactions among physical systems, with less human intervention. Technological paradigms like the Internet of Things (IoT) and Cyber-Physical Systems (CPS) provide reference models, architectures, approaches and tools that support cross-domain solutions. Thus, CPS-based solutions will be applied in different application domains, such as e-Health, Smart Grid and Smart Transportation, to assure the expected response of a complex system that relies on the smooth interaction and cooperation of diverse networked physical systems. Wireless Sensor Networks (WSN) are a well-known wireless technology that forms part of larger CPS. A WSN aims at monitoring a physical system or object (e.g., the environmental condition of a cargo container) and relaying the data to the targeted processing element. Reliable communication and restrained energy consumption are expected features of a WSN. This paper shows the results obtained in a real WSN deployment, based on SunSPOT nodes, which carries out a fuzzy-based control strategy to improve energy consumption while keeping communication reliability and computational resource usage within bounds.

Relevance: 60.00%

Abstract:

In this article we address the meaning of the adoptive family through discourse analysis of the autobiographical accounts of Spanish adoptive mothers and fathers. In a context lacking an adoptive culture, adoptive families publish narratives in order to be regarded as "normal" while, in the absence of reference models, they define their own family model, blurring the established family archetype. Using the biographical method, we apply a twofold sociological exercise of (1) ideological deconstruction of the hegemonic family model based on (2) the construction of the meaning that adoptive mothers and fathers give to their family. Postmodern family theories and post-structuralist feminist theories frame the critical discourse analysis, conducted with a gender perspective, through which these singular personal documents are studied.