985 results for Pipelined calculation unit
Abstract:
The objective of this thesis is to establish an appropriate methodology for determining the risk-adjusted premium that a financial institution (IFI) must contribute, which must be set between 0 and 3.5 per mil per year, the range defined by the Ley de Creación de la Red de Seguridad Financiera (Law Creating the Financial Safety Net). To develop the methodology, the study is divided into four chapters. The first reviews the most relevant historical events that occurred between 1995 and 1998 in the country's economic, political, and financial spheres, in order to understand the causes and identify the possible risk warnings behind the liquidation of several financial institutions, as well as the regulations issued on the subject. In the second chapter, the financial indicators developed are grouped according to the CAMEL method and their impact on the financial health of the IFI is determined; a qualitative rating, defined as financial vulnerability, is then established. The purpose of the third chapter is to categorize an IFI according to its level of financial vulnerability; the dependent variables selected are the risk rating submitted by the rating agencies to the Superintendencia de Bancos y Seguros and the rating produced by the author of this thesis, while the independent variables are the CAMEL indicators and macroeconomic variables. The fourth chapter defines the risk-adjusted premium, obtained by applying the probability estimated with the logistic model and the ratio of the number of depositors to the total population.
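The premium rule in the fourth chapter lends itself to a small sketch. The Python below is purely illustrative, not the thesis' actual model: the coefficients, indicator variables, and the linear scaling of probability into the legal band are all assumptions made here for demonstration.

```python
import math

def risk_adjusted_premium(indicators, coefficients, cap=3.5):
    """Illustrative only: score an institution with a logistic model and
    scale the resulting vulnerability probability into the legal premium
    band of 0 to 3.5 per mil per year."""
    z = sum(b * x for b, x in zip(coefficients, indicators))
    p = 1.0 / (1.0 + math.exp(-z))  # logistic probability of vulnerability
    return p * cap  # premium in per mil; always within [0, cap]
```

Because the logistic function is bounded in (0, 1), the resulting premium can never leave the range fixed by the law, whatever the indicator values.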
Abstract:
This thesis analyzes the current tax legislation governing the calculation of corporate income tax in Ecuador, and aims to show its negative effects with respect to the application of countless deductions and exemptions in determining the tax, as well as the complexity its calculation entails. As a result of the analysis, a proposal is presented, by way of an academic exercise, that seeks to reduce the negative effects of the current calculation system, to simplify the administration of the tax, and to save resources for both the Tax Administration and taxpayers, along with an increase in collection that will in turn help meet the needs of the community.
Abstract:
This study analyzes the volatility calculation in the structural liquidity regulation of the Ecuadorian financial system, taking into account that the regulation, issued more than a decade ago, has undergone no substantive changes, even though the liquidity risk to which financial institutions are exposed, and in particular Ecuador's private banks, the subject of this study, is the risk with the greatest sensitivity and social impact. Within this context, the theoretical framework of volatility was reviewed in order to identify the value-at-risk methodologies in use and the calculation approach applied in neighboring countries such as Peru and Colombia, and different time horizons and return-calculation intervals were simulated, the latter with the purpose of justifying a higher liquidity requirement as protection against the threats to which private banks are exposed: withdrawals by depositors and adverse shocks from the economic environment. Additionally, the frequency distributions of the liquid-asset and funding-source series for January 2007 to September 2014 are analyzed to determine whether they are consistent with the central limit theorem and to rule out conditional heteroskedasticity, which required applying ARIMA and GARCH models.
The study concludes with a quantification of expected losses, proposed through an adjustment to the volatility calculation estimated via Monte Carlo simulation, the incorporation of an additional risk identified in the liquid-assets line, and aspects that the regulation should incorporate in order to establish risk thresholds and indicators that allow prudential monitoring of the liquidity components of private banks and build safeguards for times of downturn.
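As a rough illustration of the Monte Carlo step described above (not the author's actual model: the bootstrap resampling scheme, horizon, and confidence level here are assumptions), a simulation over historical variations of funding sources might look like:

```python
import random

def mc_liquidity_volatility(variations, horizon=7, n_sims=5000,
                            confidence=0.99, seed=1):
    """Bootstrap Monte Carlo sketch: resample historical percentage
    variations of funding sources over a horizon and read the loss at
    the chosen confidence level as the volatility requirement."""
    rng = random.Random(seed)
    outcomes = sorted(
        sum(rng.choice(variations) for _ in range(horizon))
        for _ in range(n_sims)
    )
    idx = int((1.0 - confidence) * n_sims)  # left-tail quantile
    return max(0.0, -outcomes[idx])         # report losses as positive
```

Lengthening the horizon or raising the confidence level fattens the simulated loss tail, which is how such an exercise can justify a higher liquidity requirement.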
Abstract:
We used ground surveys to identify breeding habitat for Whimbrel (Numenius phaeopus) in the outer Mackenzie Delta, Northwest Territories, and to test the value of high-resolution IKONOS imagery for mapping additional breeding habitat in the Delta. During ground surveys, we found Whimbrel nests (n = 28) in extensive areas of wet-sedge low-centered polygon (LCP) habitat on two islands in the Delta (Taglu and Fish islands) in 2006 and 2007. Supervised classification using spectral analysis of IKONOS imagery successfully identified additional areas of wet-sedge habitat in the region. However, ground surveys to test this classification found that many areas of wet-sedge habitat had dense shrubs, no standing water, and/or lacked polygon structure, and did not support breeding Whimbrel. Visual examination of the IKONOS imagery was necessary to determine which areas exhibited LCP structure. Much lower densities of nesting Whimbrel were also found in upland habitats near wetlands. We used habitat maps developed from a combination of methods to perform scenario analyses estimating the potential effects of the Mackenzie Gas Project on Whimbrel habitat. Assuming effectively complete habitat loss within 20 m, 50 m, or 250 m of any infrastructure or pipeline, the currently proposed pipeline development would result in loss of 8%, 12%, or 30% of existing Whimbrel habitat, respectively. If subsidence were to occur, most Whimbrel habitat could become unsuitable. If the facility is developed, follow-up surveys will be required to test these models.
Abstract:
Stable isotope labeling combined with MS is a powerful method for measuring relative protein abundances, for instance, by differential metabolic labeling of some or all amino acids with 14N and 15N in cell culture or hydroponic media. These and most other types of quantitative proteomics experiments using high-throughput technologies, such as LC-MS/MS, generate large amounts of raw MS data. This data needs to be processed efficiently and automatically, from the mass spectrometer to statistically evaluated protein identifications and abundance ratios. This paper describes in detail an approach to the automated analysis of uniformly 14N/15N-labeled proteins using MASCOT peptide identification in conjunction with the trans-proteomic pipeline (TPP) and a few scripts to integrate the analysis workflow. Two large proteomic datasets from uniformly labeled Arabidopsis thaliana were used to illustrate the analysis pipeline. The pipeline can be fully automated and uses only common or freely available software.
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments, and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long-oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation, and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of the mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising, and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
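An inter-array SD of roughly 0.5 log2 units suggests a simple detection rule of the kind the abstract alludes to. The sketch below is an assumption, not the paper's actual threshold procedure: the k-sigma cutoff and the notion of a background level are introduced here for illustration only.

```python
import math

def is_expressed(raw_signal, background_log2, interarray_sd=0.5, k=2.0):
    """Illustrative call (hypothetical rule): treat a gene as expressed
    when its log2 signal exceeds a background level by k inter-array
    standard deviations."""
    return math.log2(raw_signal) > background_log2 + k * interarray_sd
```

With the default k = 2 and SD = 0.5, a gene must sit a full log2 unit (a two-fold signal) above background to be called expressed.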
Abstract:
We describe infinitely scalable pipeline machines with perfect parallelism, in the sense that every instruction of an inline program is executed, on successive data, on every clock tick. Programs with shared data effectively execute in less than a clock tick. We show that pipeline machines are faster than single or multi-core, von Neumann machines for sufficiently many program runs of a sufficiently time consuming program. Our pipeline machines exploit the totality of transreal arithmetic and the known waiting time of statically compiled programs to deliver the interesting property that they need no hardware or software exception handling.
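The speed claim follows from standard pipelining arithmetic: once the pipeline is full, one result completes per clock tick, so for enough runs the fill latency is amortized away. A minimal sketch of that comparison (the stage and run counts are hypothetical, and this models only the generic timing argument, not the transreal machine itself):

```python
def pipeline_ticks(n_stages, n_runs):
    """Ticks to finish n_runs on an n_stages pipeline: fill latency of
    n_stages for the first run, then one completion per tick."""
    return n_stages + n_runs - 1

def sequential_ticks(n_stages, n_runs):
    """A non-pipelined machine pays the full latency for every run."""
    return n_stages * n_runs
```

For large run counts the pipelined total approaches one result per tick, which is why the pipeline wins "for sufficiently many program runs of a sufficiently time consuming program".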
Abstract:
Homology-driven proteomics is a major tool to characterize proteomes of organisms with unsequenced genomes. This paper addresses practical aspects of automated homology-driven protein identifications by LC-MS/MS on a hybrid LTQ Orbitrap mass spectrometer. All essential software elements supporting the presented pipeline are either hosted at the publicly accessible web server, or are available for free download.
Abstract:
Humanized Care for the Low-Birth-Weight Newborn (Kangaroo Mother Care) aims to promote early skin-to-skin contact between mother and child, strengthening the bond between them. Given the importance of this method, a qualitative study was carried out with mothers who practiced Kangaroo Mother Care with their low-birth-weight newborns in an Intensive Care Unit, with the objective of describing the experience and the difficulties of applying the method during hospitalization. The data were analyzed according to Grounded Theory, and seven categories emerged: Experiencing Suffering; Being supported and guided to closer contact with the child; Devoting herself to the child; Feeling like a mother; Following the baby's progress; Perceiving the child's affectionate reciprocity; Strengthening other affective bonds. These categories reveal aspects of the mother's experience with the newborn and show that Kangaroo Mother Care contributes not only to forming the bond between the mother-child dyad, but also to bonds with those close to her and, moreover, to improving the clinical condition of the low-birth-weight newborn.