993 results for Large Extra Dimensions


Relevance:

20.00%

Publisher:

Abstract:

Measuring the quality of a b-learning environment is critical to determining the success of a b-learning course. Several initiatives on benchmarking and quality in e-learning have been conducted recently. Despite these efforts to define and examine quality issues in online courses, a definitive instrument for evaluating quality remains one of the key challenges for blended learning, since it combines traditional and online instruction methods. For this paper, six frameworks for quality assessment of technology-enhanced learning were examined and compared with regard to their similarities and differences. These frameworks share the same global objective: the quality of e-learning environments and products. They present different perspectives but also many common issues; some are more specific and related to the course, while others are more global and related to institutional aspects. In this work we collected and arranged all the quality criteria identified in order to obtain a more complete framework and to determine whether it fits our b-learning environment. We also included elements drawn from our own b-learning research and more than 10 years of experience. As a result, we have created a new quality reference with a set of dimensions and criteria that should be taken into account when analyzing, designing, developing, implementing, and evaluating a b-learning environment. Beyond these perspectives on what to do when developing a b-learning environment, we have also included pedagogical guidance on how to do it so that the learning succeeds. The information, concepts, and procedures presented here support teachers and instructors who intend to validate the quality of their blended learning courses.


Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum, extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing are enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scale at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers a mixed pixel to be a linear combination of endmember signatures weighted by the corresponding abundance fractions.
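The linear mixing model just described can be sketched numerically: each observed pixel is an endmember signature matrix times a vector of nonnegative abundance fractions that sum to one, plus noise. The dimensions, spectra, and noise level below are arbitrary stand-ins, not values from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 50 spectral bands, 3 endmembers, 100 pixels.
bands, p, pixels = 50, 3, 100

# Endmember signature matrix M (bands x p) -- random stand-ins for
# real reflectance spectra.
M = rng.uniform(0.0, 1.0, size=(bands, p))

# Abundance fractions: nonnegative and summing to one per pixel
# (exactly the constraints that break ICA's independence assumption).
A = rng.dirichlet(np.ones(p), size=pixels).T        # p x pixels

noise = 0.001 * rng.standard_normal((bands, pixels))
Y = M @ A + noise                                   # observed mixed pixels

# Each noiseless observation lies in the simplex spanned by M's columns.
assert np.allclose(A.sum(axis=0), 1.0)
print(Y.shape)  # (50, 100)
```

The sum-to-one constraint on the rows of `A` is why the abundance fractions are statistically dependent, as the text notes when discussing ICA.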
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix, which minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
The MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum-volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum-volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of purest pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
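A minimal sketch of the PPI scoring idea described above: project every spectral vector onto random skewers and count how often each pixel is an extreme. The skewer count, demo data, and seeds are hypothetical, and the MNF preprocessing step is omitted.

```python
import numpy as np

def ppi_scores(Y, n_skewers=1000, seed=0):
    """Toy PPI: project pixels (columns of Y) onto random skewers and
    count how often each pixel is the min or max of a projection."""
    rng = np.random.default_rng(seed)
    bands, pixels = Y.shape
    skewers = rng.standard_normal((n_skewers, bands))
    proj = skewers @ Y                       # n_skewers x pixels
    scores = np.zeros(pixels, dtype=int)
    np.add.at(scores, proj.argmax(axis=1), 1)  # extreme in + direction
    np.add.at(scores, proj.argmin(axis=1), 1)  # extreme in - direction
    return scores

# Demo: 3 pure pixels (identity spectra) plus 50 strictly interior mixtures.
E = np.eye(3)
rng = np.random.default_rng(2)
Y = np.hstack([E, E @ rng.dirichlet(np.ones(3), size=50).T])
scores = ppi_scores(Y, n_skewers=200, seed=3)
print(scores[:3], int(scores[3:].sum()))  # pure pixels take every extreme
```

Because a linear projection over a simplex attains its extremes at the vertices, only the pure pixels accumulate counts in this toy example.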
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when their spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm, vertex component analysis (VCA), to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices; the latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
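The iterative orthogonal-projection loop described above can be sketched as follows. This is not the published VCA implementation, only a toy version of the idea: project the data onto a direction orthogonal to the endmembers found so far and keep the pixel with the extreme projection.

```python
import numpy as np

def extract_endmembers(Y, p, seed=0):
    """Toy VCA-style loop: each iteration projects the data onto a random
    direction made orthogonal to the endmembers already found, and the
    pixel with the extreme projection becomes the next endmember."""
    rng = np.random.default_rng(seed)
    bands, pixels = Y.shape
    idx = []
    E = np.zeros((bands, 0))                 # endmembers found so far
    for _ in range(p):
        w = rng.standard_normal(bands)
        if E.shape[1] > 0:
            Q, _ = np.linalg.qr(E)           # orthonormal basis of span(E)
            w = w - Q @ (Q.T @ w)            # remove component in span(E)
        proj = np.abs(w @ Y)
        i = int(proj.argmax())               # extreme of the projection
        idx.append(i)
        E = np.column_stack([E, Y[:, i]])
    return idx, E

# Demo: 3 pure pixels (identity spectra) plus 50 strictly interior mixtures.
V = np.eye(3)
rng = np.random.default_rng(4)
Y = np.hstack([V, V @ rng.dirichlet(np.ones(3), size=50).T])
idx, M = extract_endmembers(Y, p=3, seed=5)
print(sorted(idx))  # the three pure pixels: [0, 1, 2]
```

Since the extreme of a projection over a simplex is attained at a vertex, the loop recovers the pure pixels here, consistent with VCA's pure-pixel assumption.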


In Portugal there are many commercial and industrial spaces in which the thermal cooling needs greatly exceed the heating needs, owing to internal gains from equipment and building lighting as well as from occupants. The installation of conventional air-conditioning systems in large commercial and industrial spaces is generally associated with the transport of large air flow rates and, consequently, with high primary energy consumption as well as high investment, maintenance, and operating costs. The evaporative cooler is a highly energy-efficient climate-control solution whose operating principle reduces the primary energy consumption of buildings. The methodology used was based on the creation of a software tool for simulating the operation of an evaporative cooler prototype. The dynamic variables involved, the heat and mass transfer processes, and the energy balances occurring in the evaporative cooler were modelled mathematically. The software tool developed allows the prototype evaporative cooler to be sized, determining its technical characteristics (thermal power, air flow rate, energy efficiency, energy consumption, and water consumption) according to the type of building and the outdoor air conditions. Three evaporative cooler sizings were selected, representative of real conditions across a low, medium, and high range of air flow rates. The simulation results show that the cooling power (5.6 kW, 16.0 kW, and 32.8 kW) and the water consumption (8 l/h, 23.9 l/h, and 48.96 l/h) increase with the cooler's air flow rate of 5,000 m³/h, 15,000 m³/h, and 30,000 m³/h, respectively. The heat-exchange effectiveness of these evaporative coolers was 69%, 66%, and 67%, respectively.
It was found that changing the climate zone from V1 to V2 implied a 39% increase in cooling power and a 20% increase in water consumption, and that changing the climate zone from V2 to V3 implied a 39% increase in cooling power and a 39% increase in water consumption. The evaporative cooler's electricity consumption is 40% to 80% lower than that of conventional cooling systems, and this effect becomes more pronounced as the summer climate zone becomes more severe.
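For readers unfamiliar with evaporative cooler sizing, the standard direct evaporative cooling relations that a tool like the one described could use are sketched below; the temperatures, effectiveness value, and air properties are illustrative placeholders, not figures from the dissertation.

```python
# Hedged sketch of textbook direct evaporative cooling relations.
# All numbers are illustrative, not the dissertation's simulation results.

def supply_temperature(t_dry, t_wet, effectiveness):
    """Outlet dry-bulb temperature of a direct evaporative cooler, from
    effectiveness = (t_dry - t_out) / (t_dry - t_wet)."""
    return t_dry - effectiveness * (t_dry - t_wet)

def cooling_power_kw(airflow_m3h, t_dry, t_out, rho=1.2, cp=1.006):
    """Sensible cooling power = m_dot * cp * dT, with air density rho
    [kg/m3] and specific heat cp [kJ/(kg K)]."""
    m_dot = airflow_m3h / 3600.0 * rho        # mass flow rate, kg/s
    return m_dot * cp * (t_dry - t_out)       # kW

t_out = supply_temperature(t_dry=32.0, t_wet=20.0, effectiveness=0.67)
print(round(t_out, 2))                               # 23.96
print(round(cooling_power_kw(5000, 32.0, t_out), 1))
```

The dissertation's reported powers depend on its own building loads and climate data, so they will differ from this illustrative calculation.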


ABSTRACT: Malignant tumours found in the nose are very frequent in all known series.
Clinical diagnosis is simple and confirmation of the diagnosis by biopsy is accessible and safe. The most advisable therapies are surgery and radiotherapy. Despite everything, patients continue to wait until the tumour is in an advanced stage before asking for therapy, although they know the diagnosis and have free access to specialised services. This situation can probably be explained by the slow development rate of the tumours combined with the age of the patients. Upon inquiry, it was found that a significant number of patients are more afraid of the therapy than of the disease itself. Other parameters have been analysed in order to obtain useful information about the management of this problem. The majority of patients seek adequate treatment when the lesions involve two nasal subunits. This allows the programming of surgical therapy with relative ease, as the lesions may be removed and reconstructed with acceptable final aesthetic results. Large tumours involving several subunits are frequent, but they rarely call for total rhinectomy. On the contrary, tumours more frequently involve half of the nose and its neighbouring structures, for example, the maxilla, the orbit, and the upper lip, even reaching as far as the base of the skull. Control of the disease in these stages is very difficult. In cases in which the disease is believed to be under control, reconstructive surgery in conjunction with other forms of rehabilitation still leaves a great deal of dissatisfaction. In our activity we try to follow the criteria adopted by the best centres, applying the classic techniques complemented with recent refinements. Reflecting on the treatment of tumours of the nose has led us to a series of questions to which we have not yet found the answers. Acting in accordance with the principles that define the state of the art, we have not yet obtained results that satisfy either the patients or the surgeons.
We are looking for new technical and scientific data which would allow us to leave this vicious cycle, in which the patient defers seeking assistance for fear that therapy could leave them disfigured. With the principal aim of obtaining a good result in the minimum number of operations, we attach importance to certain details in the practice of flaps with a well-defined vascular pattern. Since sequelae at donor sites are an unavoidable concern, we apply refinements intended to reduce them. The forehead has been considered an excellent donor site for major nasal reconstruction, but its sequelae were, by today's standards, unacceptable. We studied the behaviour of the forehead tissues after raising the flap and closing the wound using the intraoperative expansion technique. We determined the presence of Vascular Endothelial Growth Factor in the flaps and in the donor site, considering that its presence could explain the behaviour of the forehead tissues submitted to this technique. The quality of the reconstruction was studied in 45 patients who underwent surgical exeresis and major nasal reconstruction, as was quality of life in relation to the disease and the therapy. It was not possible to directly relate the quality of the reconstruction to the patients' quality of life, although some data suggestive of more adequate management may be of interest. One might eventually conclude that permanent exposure of a reconstruction with aesthetic and functional quality would be the best method to change the classic idea, still widespread although outdated, that nasal reconstruction merely transforms a horrible defect into a ridiculous one.


The intensive use of distributed generation based on renewable resources increases the complexity of power system management, particularly short-term scheduling. Demand response, storage units, and electric and plug-in hybrid vehicles also pose new challenges to short-term scheduling. However, these distributed energy resources can contribute significantly to making short-term scheduling more efficient and effective, improving power system reliability. This paper proposes a short-term scheduling methodology based on two distinct time horizons: hour-ahead scheduling and real-time scheduling, from the point of view of an aggregator agent. In each scheduling process, it is necessary to update the generation and consumption operation and the status of the storage units and electric vehicles. Besides the new operating conditions, more accurate forecasts of wind generation and consumption become available, resulting from short-term and very short-term forecasting methods. In this paper, the aggregator's main goal is to maximize its profits while fulfilling the contracts established with the aggregated and external players.
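A toy sketch of the two-horizon idea (hour-ahead schedule, then real-time correction using a storage unit as updated forecasts arrive); the dispatch rule, names, and numbers are hypothetical simplifications, not the paper's methodology.

```python
# Illustrative only: a toy aggregator that builds an hour-ahead schedule
# from forecasts, then corrects it in real time as better very-short-term
# forecasts arrive, using a storage unit as a buffer.

def hour_ahead_schedule(demand_forecast, wind_forecast):
    """Dispatchable generation covers whatever wind is not expected to meet."""
    return max(0.0, demand_forecast - wind_forecast)

def real_time_dispatch(schedule, demand, wind, soc, capacity):
    """Correct the hour-ahead schedule with the storage unit: discharge on
    deficit, charge on surplus, within the state of charge (soc)."""
    imbalance = demand - wind - schedule      # >0: deficit, <0: surplus
    if imbalance > 0:
        discharge = min(imbalance, soc)
        soc -= discharge
        unserved = imbalance - discharge
    else:
        charge = min(-imbalance, capacity - soc)
        soc += charge
        unserved = 0.0
    return soc, unserved

# Hour-ahead stage: forecasts say demand 100, wind 30 (arbitrary units).
g = hour_ahead_schedule(100.0, 30.0)          # schedule 70 of generation
# Real-time stage: wind came in lower (20); storage held 15 of 40 capacity.
soc, unserved = real_time_dispatch(g, 100.0, 20.0, soc=15.0, capacity=40.0)
print(g, soc, unserved)   # 70.0 5.0 0.0
```

The point of the sketch is the structure: the real-time stage reuses the hour-ahead decision but reacts to the forecast error with the flexible resources.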


The WWW is a huge, open, heterogeneous system; however, its content is mainly human-oriented. The Semantic Web needs to ensure that data is readable and "understandable" by intelligent software agents, through the use of explicit and formal semantics. Ontologies constitute a privileged artifact for capturing the semantics of WWW data. The temporal and spatial dimensions are transversal to most knowledge domains and are therefore fundamental to the reasoning processes of software agents. Representing the temporal and spatial evolution of concepts and their relations in OWL (the W3C standard for ontologies) is not straightforward. Although several strategies have been proposed to tackle this problem, there is still no formal, standard approach. The main goal of this work is the development of methods and tools to support the engineering of temporal and spatial aspects in intelligent systems through the use of OWL ontologies. An existing ontology engineering method, Fonte, was used as the framework for this work. As its main contributions, Fonte was re-engineered to: i) support the spatial dimension; ii) work with OWL ontologies; and iii) support the application of Ontology Design Patterns. Finally, the capabilities of the proposed approach were demonstrated by engineering time and space in a demo ontology about football.
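As a concrete illustration of why time is awkward in OWL, whose properties are binary, the sketch below encodes one widely used workaround, the 4D-fluents ontology design pattern, with plain Python tuples standing in for RDF triples. All IRIs are hypothetical examples in the spirit of a demo football ontology; this is not the Fonte tooling itself.

```python
# 4D-fluents sketch: a time-indexed relation "playsFor during Season2002"
# cannot be one binary OWL property, so the pattern reifies time slices
# of both participants and attaches the fluent property to the slices.
# All IRIs below are invented examples.

triples = [
    ("ex:Ronaldo@2002", "rdf:type", "ex:TimeSlice"),
    ("ex:Ronaldo@2002", "ex:isTimeSliceOf", "ex:Ronaldo"),
    ("ex:Ronaldo@2002", "ex:hasInterval", "ex:Season2002"),
    ("ex:Sporting@2002", "rdf:type", "ex:TimeSlice"),
    ("ex:Sporting@2002", "ex:isTimeSliceOf", "ex:Sporting"),
    ("ex:Sporting@2002", "ex:hasInterval", "ex:Season2002"),
    # The fluent property links time slices, not the entities themselves.
    ("ex:Ronaldo@2002", "ex:playsFor", "ex:Sporting@2002"),
]

def holds_at(triples, subj, pred, interval):
    """Check whether a fluent predicate holds for subj during interval
    (simplified: only the subject's time slice is inspected)."""
    slices = {s for (s, p, o) in triples
              if p == "ex:isTimeSliceOf" and o == subj
              and (s, "ex:hasInterval", interval) in triples}
    return any(s in slices and p == pred for (s, p, o) in triples)

print(holds_at(triples, "ex:Ronaldo", "ex:playsFor", "ex:Season2002"))  # True
```

In real OWL the same pattern would be expressed with classes such as `TimeSlice` and properties such as `isTimeSliceOf`, which is exactly the kind of design pattern the work describes applying.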


Power transformers are machines of great importance in Electrical Power Systems (SEE), since they make it possible to interconnect the grid's different voltage levels and to transmit electrical energy in Alternating Current (AC). These machines are generally large and constructively complex. They are characterized by very long service lives (twenty to thirty years) and high prices, which leads to very demanding reliability requirements, since keeping many spare units in the power system is not viable. With the aim of maximizing the service life and reliability of power transformers, preventive maintenance concepts are increasingly being applied to this type of machine. However, managing their service life is extremely complex, since these machines have several crucial components susceptible to failure, almost all of which are located inside a tank. It is therefore not possible to obtain a real-time picture of their condition without taking the transformer out of service, which entails high costs. For this reason, a technique was developed that provides an indication of the transformer's condition, in real time and without removing it from service, by taking samples of the insulating oil and subjecting them to physico-chemical analysis and Dissolved Gas Analysis (DGA). Insulating-oil analyses have acquired great importance in fault diagnosis and in assessing the condition of this equipment, and normative rules have been developed for interpreting the oil parameters.
Drawing on the knowledge of how to interpret the physico-chemical and DGA tests on the oil, it is possible to develop tools capable of optimizing those interpretations and of applying that knowledge to predict their evolution, as well as the emergence of possible transformer faults, in order to optimize maintenance processes. In this field, Artificial Neural Networks (ANNs) play a fundamental role.
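To make the DGA interpretation step concrete, here is a rough sketch of how gas-ratio rules can be turned into code before feeding such features to a neural network; the ratios echo the spirit of standard ratio methods (e.g. IEC 60599), but the thresholds below are simplified placeholders, not normative values.

```python
# Illustrative DGA rule sketch. The ratio style follows standard methods,
# but these thresholds are invented placeholders, not IEC 60599 values.

def dga_hint(h2, ch4, c2h2, c2h4, c2h6):
    """Very rough fault hint from dissolved gas concentrations (ppm)."""
    if h2 == 0 or c2h4 == 0 or c2h6 == 0:
        return "insufficient data"
    if c2h2 / c2h4 > 1.0:                      # placeholder threshold
        return "possible arcing"
    if ch4 / h2 > 1.0 and c2h4 / c2h6 > 1.0:   # placeholder thresholds
        return "possible thermal fault"
    return "no clear fault pattern"

print(dga_hint(h2=50, ch4=120, c2h2=1, c2h4=60, c2h6=40))
# -> possible thermal fault
```

An ANN-based tool, as described above, would learn such mappings from labelled oil-analysis histories rather than from hand-set thresholds.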


Aseptic meningitis after the Measles-Mumps-Rubella (MMR) vaccine is a well-recognized complication, and different incidences have been observed in several studies. We retrospectively analyzed forty cases of aseptic meningitis admitted to our Service during a large public immunization campaign (1998) in Curitiba, Southern Brazil (590,609 people). The vaccine used contained the Leningrad-3-Zagreb mumps strain, the Edmonston-Zagreb measles strain, and the RA 27/3 rubella strain. In the whole county, a total of 87 cases were reported, an incidence of 1.7 cases per 10,000 doses given. The mean age was 23.7 ± 12.8 years. The female:male ratio was 1.35:1. Severe headache with meningismus (92.5%), fever (87.5%), and nausea/vomiting (82.5%) were the most common clinical findings. Three cases (7.5%) developed mild mumps. All patients underwent a cerebrospinal fluid (CSF) tap, with the following findings: mononuclear pleocytosis from 100 to 500 cells/mm³ in 17 cases (42.5%; 257.5 ± 260.6 cells/mm³); increased protein in 28 cases (67.5%; 92.1 ± 76.9 mg/dL); glucose was normal in all cases (56.8 ± 11.2 mg/dL) except in 4 (10%), which presented less than 44 mg/dL. All serological tests (latex for bacterial meningitis, Cryptococcus, cysticercosis, VDRL) and bacteriological cultures were negative. Virus identification was also negative in 8 samples. None of the patients had neurological deficits or related symptoms one year after onset. We believe the benefit of vaccination clearly outweighs the incidence of benign vaccine-associated meningitis.


Dissertation presented to the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa in fulfilment of the requirements for the degree of Master in Welding Engineering


Dissertation submitted in fulfilment of the requirements for the degree of Master in Molecular Genetics and Biomedicine


A 66-year-old female with Streptococcus viridans aortic and tricuspid infective endocarditis developed, during the course of antibiotic therapy, rupture of a right coronary sinus of Valsalva aneurysm into the right ventricle. Urgent cardiac surgery was performed, with implantation of a mechanical aortic prosthesis and a right coronary sinus plasty. Six months later a huge aortic pseudoaneurysm was diagnosed and she underwent a second, uneventful surgery. The significant features are reviewed, with a discussion of diagnosis and therapy.


Primary cutaneous follicle center lymphoma (PCFCL) is characterized by a proliferation of follicle center cells in the skin. A definitive diagnosis is frequently delayed because of difficulties in interpreting the histopathologic findings. It has an excellent prognosis, with a 5-year survival over 95%, and its risk of transformation has not been established. We describe the case of a man with a gastric diffuse large B-cell lymphoma (DLBCL) referred to our clinic because of nodules on the back that had gradually developed over a period of 10 years. A biopsy performed 3 years before had been interpreted as reactive follicular hyperplasia. A new skin biopsy revealed a diffuse large B-cell lymphoma. Immunoglobulin heavy chain gene rearrangements from the initial skin biopsy (PCFCL) and the gastric DLBCL biopsy were studied by polymerase chain reaction, and an identical clonal rearrangement was detected, highly suggestive of lymphoma transformation.


Assessment of surgical performance is a must for every surgical practice nowadays and can be done using scientific methods imported mostly from the quality-control tools long in use in industry. Surgical performance comprises several dimensions, including clinical activity (with mortality and morbidity as end points), academic activities, research and, increasingly, efficiency. Stable long-term results (efficacy), reduced error (safety) and met patient expectations (patient satisfaction) are among the other performance components. This paper focuses on precise definitions of mortality and morbidity related to surgical activities and on the tools to evaluate patient complexity and assess preoperative risk. Some graphic representations are suggested to compare the performance profiles of surgeons and to define individual performance profiles. Strong emphasis is put on preoperative risk assessment and its crucial role in interpreting divergent surgical results. Where risk assessment is not possible or unavailable, observed/expected (O/E) ratios for a given endpoint, be it mortality, length of stay or morbidity, must be established and routinely used to benchmark results and to identify performance outliers. Morbidity is pointed out as a most valuable performance indicator in surgery because it is sensitive and encompasses efficiency, safety and quality at large.
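The O/E ratio mentioned above reduces to simple arithmetic: observed events divided by the events a risk model predicted. A minimal sketch follows, with invented patient risk scores as placeholders.

```python
# Minimal sketch of the observed/expected (O/E) ratio idea.
# The risk scores and counts below are invented placeholders.

def oe_ratio(observed_events, expected_events):
    """O/E ratio for a given endpoint (mortality, morbidity, length of
    stay). A ratio well above 1 flags a potential performance outlier."""
    if expected_events <= 0:
        raise ValueError("expected events must be positive")
    return observed_events / expected_events

# Expected deaths come from summing each patient's preoperative risk score.
patient_risks = [0.02, 0.10, 0.05, 0.30, 0.03]   # hypothetical predicted risks
expected = sum(patient_risks)                     # 0.5 expected deaths
observed = 1                                      # one death actually occurred
print(round(oe_ratio(observed, expected), 2))     # 2.0
```

This is why the paper stresses preoperative risk assessment: without the denominator from a risk model, raw mortality or morbidity counts cannot be compared across case mixes.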