994 results for ATE estimator
Abstract:
Master's degree in Accounting and Financial Analysis
Abstract:
Consolidation consists of scheduling multiple virtual machines onto fewer servers in order to improve resource utilization and to reduce the operational costs due to power consumption. However, virtualization technologies do not offer performance isolation, causing application slowdown. In this work, we propose a performance-enforcing mechanism composed of a slowdown estimator and an interference- and power-aware scheduling algorithm. The slowdown estimator determines, based on noisy slowdown data samples obtained from state-of-the-art slowdown meters, whether tasks will complete within their deadlines, invoking the scheduling algorithm if needed. When invoked, the scheduling algorithm builds performance- and power-aware virtual clusters to successfully execute the tasks. We conduct simulations injecting synthetic jobs whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our strategy can be efficiently integrated with state-of-the-art slowdown meters to fulfil contracted SLAs in real-world environments, while reducing operational costs by about 12%.
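A minimal sketch of how such a deadline check could work, assuming noisy slowdown samples and a nominal execution rate; the function, its parameters, and the safety margin are illustrative assumptions, not the mechanism proposed in the paper:

```python
import statistics

def will_meet_deadline(slowdown_samples, nominal_rate, remaining_work,
                       elapsed, deadline, safety_margin=0.1):
    """Hedged sketch: decide whether a task is likely to finish on time.

    slowdown_samples: noisy slowdown estimates (1.0 = no interference);
    nominal_rate: work units per second without interference;
    remaining_work: work units still to execute;
    elapsed / deadline: seconds since submission and the total time budget.
    All parameter names and the safety margin are illustrative assumptions.
    """
    # A simple median smooths the noisy samples and rejects outliers.
    slowdown = statistics.median(slowdown_samples)
    # Interference reduces the effective execution rate.
    effective_rate = nominal_rate / max(slowdown, 1.0)
    projected_finish = elapsed + remaining_work / effective_rate
    # Rescheduling would be triggered when the projection misses the margined deadline.
    return projected_finish <= deadline * (1.0 - safety_margin)

# Example: 60 work units left, nominal 1 unit/s, median slowdown 1.3, 100 s elapsed.
print(will_meet_deadline([1.2, 1.4, 1.3], 1.0, 60, 100, 200))
```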
Abstract:
Project work submitted to obtain the degree of Master in Informatics and Computer Engineering
Abstract:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial resolution element and at a given spectral band is a mixture of components originating from the constituent substances, termed endmembers, located at that resolution element. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of the observed data that yields statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises IFA performance, as in the ICA case. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are computationally complex: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often, the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. The newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications—namely, signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using the minimum description length (MDL)-based algorithm [55].
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need to have pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
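As a worked illustration of the linear mixing model discussed above, where an observed pixel r is modeled as r = Ma + n with nonnegative abundances a that sum to one, the following sketch unmixes one pixel by nonnegative least squares with a soft sum-to-one constraint. This is a generic constrained least-squares variant, not the Dirichlet/EM method proposed in this chapter, and the function and variable names are illustrative:

```python
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(r, M, sum_to_one_weight=1e3):
    """Sketch of linear unmixing under the model r = M a + n.

    r: (L,) observed pixel spectrum; M: (L, p) endmember signatures.
    Nonnegativity comes from NNLS; the sum-to-one constraint is enforced
    softly by appending a heavily weighted row of ones (a common trick,
    not the chapter's Dirichlet/EM approach).
    """
    L, p = M.shape
    M_aug = np.vstack([M, sum_to_one_weight * np.ones((1, p))])
    r_aug = np.concatenate([r, [sum_to_one_weight]])
    a, _ = nnls(M_aug, r_aug)
    return a

# Toy example with 3 synthetic endmembers over 50 bands.
rng = np.random.default_rng(0)
M = np.abs(rng.normal(size=(50, 3)))
a_true = np.array([0.6, 0.3, 0.1])
r = M @ a_true + 0.01 * rng.normal(size=50)
print(unmix_pixel(r, M).round(2))   # close to [0.6, 0.3, 0.1]
```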
Abstract:
Forensic document analysis is one of the areas of Forensic Sciences, responsible for verifying the authenticity of documents. Documents can be of different types, with currency and handwriting being the forensic evidence that most frequently motivates the analysis. Combining new technologies with this analysis process allows a better evaluation of this evidence and makes the process faster. This thesis is based on the forensic analysis of two types of documents: euro banknotes and handwritten forms. In this work, we aimed to develop image processing and analysis techniques for evidence of these types, with a view to extracting measurements that allow their authenticity to be assessed. The banknote images were acquired by spectral imaging, with four acquisition modalities defined: transmitted visible light, reflected visible light, ultraviolet A and ultraviolet C. For each of these acquisition modalities, two protocols were also defined: front and back. The images of the handwritten documents were acquired by scanning them with the automatic document feeder of a multifunction device. For the banknote images, several image processing and analysis algorithms were developed, specific to this type of evidence. These algorithms allow the segmentation of the region of interest of the image, the segmentation of the sub-regions containing the security marks to be evaluated, as well as the extraction of some features. Regarding the images of the handwritten documents, segmentation algorithms were also developed to obtain all the sub-regions of interest of the forms, so that the various elements can be analyzed. For this type of evidence, an analysis algorithm was also developed for the elements corresponding to the writing of a numeric sequence, which yields the images corresponding to the individual characters. The work carried out and the results obtained allowed the definition of image acquisition protocols for these types of evidence. The automatic segmentation and analysis algorithms developed throughout this work can be valuable aids in the process of analyzing the authenticity of documents, which until now has been done manually. The results of the studies performed on the various pieces of evidence are also presented, namely the performance of the different algorithms analyzed, as well as some of the difficulties encountered during the process. A discussion of the adopted methodology and of the results is also presented, together with proposals for future work, namely feature extraction and the implementation of classifiers capable of assessing the authenticity of documents.
Abstract:
Submitted in partial fulfillment of the requirements for the degree of PhD in Mathematics, in the speciality of Statistics, at the Faculdade de Ciências e Tecnologia
Abstract:
Radio link quality estimation is essential for protocols and mechanisms such as routing, mobility management and localization, particularly for low-power wireless networks such as wireless sensor networks. Commodity Link Quality Estimators (LQEs), e.g., PRR, RNP, ETX, four-bit and RSSI, can only provide a partial characterization of links as they ignore several link properties such as channel quality and stability. In this paper, we propose F-LQE (Fuzzy Link Quality Estimator), a holistic metric that estimates link quality on the basis of four link quality properties—packet delivery, asymmetry, stability, and channel quality—that are expressed and combined using fuzzy logic. We demonstrate through an extensive experimental analysis that F-LQE is more reliable than existing estimators (e.g., PRR, WMEWMA, ETX, RNP, and four-bit), as it provides a finer-grained link classification. It is also more stable, as it has a lower coefficient of variation of link estimates. Importantly, we evaluate the impact of F-LQE on the performance of tree routing, specifically CTP (the Collection Tree Protocol). For this purpose, we adapted F-LQE to build a new routing metric for CTP, which we dubbed F-LQE/RM. Extensive experimental results obtained with widely used state-of-the-art testbeds show that F-LQE/RM significantly improves CTP routing performance over four-bit (the default LQE of CTP) and ETX (another popular LQE). F-LQE/RM improves end-to-end packet delivery by up to 16%, reduces the number of packet retransmissions by up to 32%, reduces the hop count by up to 4%, and improves topology stability by up to 47%.
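A minimal sketch of the kind of fuzzy combination described above, assuming the four link properties are normalized to [0, 1] with 1 meaning best; the membership breakpoints and the weight blending the fuzzy AND (min) with the average are illustrative assumptions rather than the exact parameters of F-LQE:

```python
def membership_high(x, lo, hi):
    """Piecewise-linear membership of x in the fuzzy set 'high quality'."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def fuzzy_link_quality(prr, asymmetry, stability, channel_quality):
    """Hedged sketch of a fuzzy combination of four link properties.

    Inputs are assumed normalized to [0, 1]; breakpoints and the AND/average
    blend are illustrative, not the parameters used by F-LQE in the paper.
    """
    mu = [
        membership_high(prr, 0.5, 0.9),
        membership_high(1.0 - asymmetry, 0.5, 0.9),
        membership_high(stability, 0.4, 0.8),
        membership_high(channel_quality, 0.4, 0.8),
    ]
    # Blend the pessimistic AND (min) with the mean, a common fuzzy-logic
    # compromise between the worst property and the overall picture.
    beta = 0.6
    return beta * min(mu) + (1 - beta) * sum(mu) / len(mu)

print(round(fuzzy_link_quality(0.85, 0.1, 0.7, 0.6), 2))
```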
Abstract:
In this paper we introduce a formation control loop that maximizes the performance of the cooperative perception of a tracked target by a team of mobile robots, while maintaining the team in formation with a dynamically adjustable geometry that is a function of the quality of the target perception by the team. In the formation control loop, the controller module is a distributed nonlinear model predictive controller and the estimator module fuses local estimates of the target state, obtained by a particle filter at each robot. The two modules and their integration are described in detail, including a real-time database associated with a wireless communication protocol that facilitates the exchange of state data while reducing collisions among team members. Simulation and real robot results for indoor and outdoor teams of different robots are presented. The results highlight how our method successfully enables a team of homogeneous robots to minimize the total uncertainty of the cooperative estimate of the tracked target while complying with performance criteria such as keeping a pre-set distance between the teammates and the target, and avoiding collisions with teammates and/or surrounding obstacles.
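A minimal sketch of one possible fusion step, assuming each robot summarizes its local particle filter by a Gaussian mean and covariance; information-form (inverse-covariance) weighting is used here purely for illustration and is not necessarily the fusion rule adopted by the authors:

```python
import numpy as np

def fuse_local_estimates(means, covariances):
    """Hedged sketch: fuse teammates' local target estimates, assuming each
    robot summarizes its particle filter by a Gaussian (mean, covariance).
    This generic rule ignores cross-correlation between local estimates."""
    info = sum(np.linalg.inv(C) for C in covariances)
    info_mean = sum(np.linalg.inv(C) @ m for m, C in zip(means, covariances))
    fused_cov = np.linalg.inv(info)
    return fused_cov @ info_mean, fused_cov

# Two robots estimating a 2-D target position with different confidence.
m1, C1 = np.array([1.0, 2.0]), np.diag([0.5, 0.5])
m2, C2 = np.array([1.2, 1.8]), np.diag([2.0, 2.0])
mean, cov = fuse_local_estimates([m1, m2], [C1, C2])
print(mean.round(2), np.diag(cov).round(2))
```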
Abstract:
We investigate whether the positive relation between accounting accruals and information asymmetry documented for U.S. stock markets also holds for European markets, considered both as a whole and at the country level. This research is relevant because this relation is likely to be affected by differences in the accounting standards used by companies for financial reporting, in the traditional reliance on the banking system or capital markets for firm financing, and in legal systems and cultural environments. We find that in European stock markets discretionary accruals are positively related with the Corwin and Schultz high-low spread estimator used as a proxy for information asymmetry. Our results suggest that the earnings management component of accruals outweighs the informational component, but the significance of the relation varies across countries. Further, such an association tends to be stronger for firms with the highest levels of positive discretionary accruals. Consistent with the evidence provided by the authors, our results also suggest that the high-low spread estimator is more efficient than the closing bid-ask spread when analysing the impact of information quality on information asymmetry.
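For reference, the Corwin and Schultz high-low estimator mentioned above infers the effective spread from daily high and low prices over pairs of consecutive trading days; the following is a minimal sketch of the formula as commonly stated in the literature (illustrative code, not the authors' implementation):

```python
import numpy as np

def corwin_schultz_spread(high, low):
    """Sketch of the Corwin-Schultz high-low spread estimator for one pair of
    consecutive trading days; `high` and `low` are length-2 sequences of daily
    highs and lows. Negative estimates are set to zero, as is common."""
    k = 3 - 2 * np.sqrt(2)
    # beta: sum of squared log high-low ratios over the two days.
    beta = np.log(high[0] / low[0]) ** 2 + np.log(high[1] / low[1]) ** 2
    # gamma: squared log ratio of the two-day high to the two-day low.
    gamma = np.log(max(high) / min(low)) ** 2
    alpha = (np.sqrt(2 * beta) - np.sqrt(beta)) / k - np.sqrt(gamma / k)
    spread = 2 * (np.exp(alpha) - 1) / (1 + np.exp(alpha))
    return max(spread, 0.0)

# Toy two-day example.
print(round(corwin_schultz_spread([10.4, 10.6], [10.0, 10.2]), 4))
```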
Abstract:
Introduction: Hemangiomas are the most frequent neoplasm in children, occurring in 10-12%, in most cases with a favorable course. The proliferative phase occurs in the first 4-6 months; hemangiomas then involute in 50% of cases by the age of 5 years. Large hemangiomas that interfere with the function of other organs are frequently associated with complications, namely ulceration (10-15%), bacterial superinfection or bleeding. Case report: A 6-month-old female infant with a large hemangioma occupying the entire shoulder, regularly treated with laser during the previous two months, was admitted for ulceration and skin infection. Leukocytes 12,300/μL, neutrophils 38.9%, platelets 616,000/μL and CRP 6.7 mg/dL. She was empirically treated with ceftazidime, flucloxacillin and gentamicin; culture of the exudate later isolated methicillin-sensitive Staphylococcus aureus and Pseudomonas aeruginosa. Also of note was severe iron-deficiency anemia, with hemoglobin 5.2 g/dL, hematocrit 15.8% and serum iron 20 μg/dL, requiring transfusion of packed red blood cells and subsequently iron therapy. Abdominal ultrasound revealed a small hepatic hemangioma and transfontanellar ultrasound was unremarkable. After an electrocardiogram, propranolol was started at an initial dose of 0.15 mg/kg/day, gradually increased to 1.5 mg/kg/day, with clinical improvement, reduction in the size and color of the hemangioma, and no side effects recorded. She currently remains on propranolol and oral iron, with a latest hemoglobin value of 10.6 g/dL. Conclusion: Hemangioma therapy includes laser treatment, embolization or surgical excision. In this case, conventional treatment did not work. Propranolol, as a new therapeutic alternative, has been assuming growing importance in the clinical improvement of these situations. Complementary examinations to monitor possible side effects are mandatory, and the use of gradually increasing doses improves the safety profile of this therapy.
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as the optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed due to the interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources an even more difficult and challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill the contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
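A minimal sketch of what one interference- and power-aware placement step could look like, assuming callbacks that estimate interference and incremental power; the host fields, callbacks, and scoring rule are illustrative assumptions, not the algorithm evaluated in the paper:

```python
def choose_host(vm_demand, hosts, estimate_interference, power_model):
    """Hedged sketch of one interference- and power-aware placement step.

    hosts: list of dicts with at least 'free_cpu' and 'load';
    estimate_interference(host, demand): expected performance deviation of
    co-hosted workloads; power_model(host, demand): incremental power draw.
    All names, fields and the scoring rule are illustrative assumptions.
    """
    best, best_score = None, float("inf")
    for host in hosts:
        if host["free_cpu"] < vm_demand:
            continue  # the VM does not fit on this host
        # Lower score = less expected interference and smaller power increase.
        score = estimate_interference(host, vm_demand) + 0.5 * power_model(host, vm_demand)
        if score < best_score:
            best, best_score = host, score
    return best

# Toy usage with made-up interference and power callbacks.
hosts = [{"free_cpu": 4, "load": 0.7}, {"free_cpu": 8, "load": 0.3}]
pick = choose_host(2, hosts,
                   estimate_interference=lambda h, d: h["load"] * d,
                   power_model=lambda h, d: 0.1 * d)
print(pick)
```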
Abstract:
The dynamism and capacity of a medical team to stay up to date are reflected in the scientific activity of a health unit. That activity is therefore a means of evaluating its performance. Objective: Our objective was to evaluate the medical scientific activity carried out in the neonatal intensive care unit (NICU) of Hospital de Dona Estefânia from its opening in April 1983 until the celebration of its 20th anniversary in April 2003. Material and Methods: Data were collected from the curricula of team members, the unit's archives, the yearbook published between 1993 and 2001, and the memory of some members. The following items were included: chairing and moderating conferences, lectures and round tables; conferences, lectures and free communications; published papers; participation in national and international multicenter studies; prospective studies developed by the unit itself; papers indexed in Medline and citations; collaboration in doctoral and master's theses; participation in scientific, editorial or redaction boards of scientific journals; and, finally, activity within the governing bodies of scientific societies. Presentations at unit or department meetings or other strictly in-hospital meetings, as well as lectures given within undergraduate teaching, were excluded. The number of physicians per year was calculated based on the number of years during which each member was part of the team, the total number of members who have belonged to it, and the unit's 20 years. Results: The average number of physicians in the NICU was 9 per year. A total of 123 session moderations were counted - 98 national and 25 international (average of 6 per year; 0.7 per physician per year); 487 conferences, lectures and free communications - 368 national and 119 international (average of 25 interventions per year; 2.7 interventions per physician per year); and 221 publications (average of 11 publications per year; 1.2 papers per physician per year). The years in which the maternity ward was closed had the smallest number of free communications. The unit participated in 20 national prospective studies, in 14 multicenter studies, of which 5 were international, and in 5 master's or doctoral theses. Eleven papers are indexed in Medline, 21 citations were found, and 23 papers received awards. There were 10 participations in governing bodies of scientific societies, 1 of them international; 15 participations in redaction and editorial boards of scientific journals, 3 of them international; and the organization of 64 scientific meetings, 5 of them international. Discussion: With no benchmark for comparison, it is difficult to say whether the NICU's activity was acceptable. Despite the effort that we know was made and the concern that has always guided the heads of the unit, the review seems to add up to little work, namely with regard to publications. The NICU must improve and must drive the improvement of the hospital. Some proposals are therefore made: residents' internships in qualified departments with which scientific exchange is established; publication in indexed journals; participation in national and international multicenter studies; greater use of research grants. Promoting research involves setting objectives for defined periods, by professional group and by area of interest, evaluating the achievement of those objectives, and the active involvement of the institution.
To this end, the role of the recently created Departamento de Investigação em Pediatria will be fundamental.
Abstract:
Master's dissertation in History of Modern Art, Departamento de História da Arte
Abstract:
Durability of Building Materials and Components (Vasco Peixoto de Freitas, J.M.P.Q. Delgado, eds.), Building Pathology and Rehabilitation, vol. 3, VIII, 105-126. ISBN: 978-3-642-37474-6 (Print), 978-3-642-37475-3 (Online). Springer-Verlag Berlin Heidelberg. DOI: 10.1007/978-3-642-37475-3_5
Abstract:
Dissertation submitted to obtain the degree of Doctor in Statistics and Risk Management, speciality in Statistics