924 results for "Calculation tool in reliability"
Resumo:
Linkage disequilibrium (LD) mapping is commonly used as a fine-mapping tool in human genome mapping and has been used with some success for initial disease gene isolation in certain isolated, inbred human populations. An understanding of the population history of domestic dog breeds suggests that LD mapping could be routinely utilized in this species for initial genome-wide scans. Such an approach offers significant advantages over traditional linkage analysis. Here, we demonstrate, using canine copper toxicosis in the Bedlington terrier as the model, that LD mapping could reasonably be expected to be a useful strategy in low-resolution, genome-wide scans in pure-bred dogs. Significant LD was demonstrated over distances up to 33.3 cM. It is very unlikely, for a number of reasons discussed, that this result could be extrapolated to the rest of the genome. It is, however, consistent with the expectation given the population structure of canine breeds and, in this breed at least, with the hypothesis that it may be possible to utilize LD in a genome-wide scan. In this study, LD mapping confirmed the location of the gene for copper toxicosis in the Bedlington terrier (CT-BT) and was able to do so in a population that was refractory to traditional linkage analysis.
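The LD statistics behind such a scan can be illustrated with a short sketch. The haplotype and allele frequencies below are hypothetical values chosen for illustration, not figures from the study:

```python
# Pairwise linkage disequilibrium statistics (D, D', r^2) for two
# biallelic loci, given allele frequencies p_A, p_B and the observed
# haplotype frequency p_AB. Frequencies here are hypothetical.

def ld_stats(p_a, p_b, p_ab):
    """Return D, Lewontin's D' and r^2 for alleles A and B."""
    d = p_ab - p_a * p_b                      # raw disequilibrium coefficient
    if d >= 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    d_prime = d / d_max if d_max else 0.0     # normalized to its maximum
    r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d, d_prime, r2

d, d_prime, r2 = ld_stats(0.6, 0.5, 0.4)
print(round(d, 3), round(d_prime, 3), round(r2, 3))   # 0.1 0.5 0.167
```

A genome-wide LD scan would compute such statistics for each marker against the disease locus and look for markers where D' or r^2 remains high over long distances.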
Resumo:
Two methods were compared for determining the concentration of penetrative biomass during growth of Rhizopus oligosporus on an artificial solid substrate consisting of an inert gel and starch as the sole source of carbon and energy. The first method was based on the use of a hand microtome to make sections of approximately 0.2- to 0.4-mm thickness parallel to the substrate surface and the determination of the glucosamine content in each slice. Use of glucosamine measurements to estimate biomass concentrations was shown to be problematic due to the large variations in glucosamine content with mycelial age. The second was a novel method based on the use of confocal scanning laser microscopy to estimate the fractional volume occupied by the biomass. Although it is not simple to translate fractional volumes into dry weights of hyphae due to the lack of experimentally determined conversion factors, measurement of the fractional volumes is in itself useful for characterizing fungal penetration into the substrate. Growth of penetrative biomass in the artificial model substrate showed two forms of growth: an indistinct mass in the region close to the substrate surface and a few hyphae penetrating perpendicular to the surface in regions further away from it. The biomass profiles against depth obtained from the confocal microscopy showed two linear regions on log-linear plots, which are possibly related to different oxygen availability at different depths within the substrate. Confocal microscopy has the potential to be a powerful tool in the investigation of fungal growth mechanisms in solid-state fermentation. (C) 2003 Wiley Periodicals, Inc.
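The fractional-volume measurement reduces, per depth slice, to the ratio of voxels classified as biomass to total voxels. A minimal sketch, assuming a thresholded (binary) confocal stack; the toy data below are synthetic:

```python
# Fractional volume occupied by biomass at each depth, from a binary 3D
# stack (1 = voxel classified as hyphae). In practice the stack would come
# from thresholded confocal scanning laser microscopy images; this toy
# stack is synthetic.

def fractional_volume_profile(stack):
    """For each depth slice (first axis), return occupied/total voxels."""
    profile = []
    for z_slice in stack:
        occupied = sum(sum(row) for row in z_slice)
        total = len(z_slice) * len(z_slice[0])
        profile.append(occupied / total)
    return profile

# Toy 3-slice stack, 2x2 voxels per slice: fractions 1.0, 0.5, 0.25.
stack = [
    [[1, 1], [1, 1]],
    [[1, 0], [0, 1]],
    [[1, 0], [0, 0]],
]
print(fractional_volume_profile(stack))  # [1.0, 0.5, 0.25]
```

Plotting such a profile on log-linear axes against depth is what reveals the two linear regions described in the abstract.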
Resumo:
One of the current frontiers in the clinical management of Pectus Excavatum (PE) patients is the prediction of the surgical outcome prior to the intervention. This can be done through computerized simulation of the Nuss procedure, which requires an anatomically correct representation of the costal cartilage. To this end, we take advantage of the tubular structure of the costal cartilage to detect it through multi-scale vesselness filtering. This information is then used in an interactive 2D initialization procedure, which uses anatomical maximum intensity projections of 3D vesselness feature images to efficiently initialize the 3D segmentation process. We identify the cartilage tissue centerlines in these projected 2D images using a livewire approach. We finally refine the 3D cartilage surface through region-based sparse field level-sets. We have tested the proposed algorithm on 6 non-contrast CT datasets from PE patients. Good segmentation performance was found against reference manual contouring, with an average Dice coefficient of 0.75±0.04 and an average mean surface distance of 1.69±0.30 mm. The proposed method requires roughly 1 minute for the interactive initialization step, which can positively contribute to an extended use of this tool in clinical practice, since current manual delineation of the costal cartilage can take up to an hour.
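The Dice coefficient used in the evaluation is simple to compute. A minimal sketch, representing each binary mask as a set of voxel coordinates (the masks below are illustrative, not data from the study):

```python
# Dice coefficient between two binary segmentations, the overlap metric
# used to compare automatic and manual contours. Masks are represented as
# sets of voxel coordinates; the example masks are illustrative only.

def dice(mask_a, mask_b):
    """Dice = 2|A intersect B| / (|A| + |B|); 1.0 means perfect overlap."""
    if not mask_a and not mask_b:
        return 1.0
    return 2 * len(mask_a & mask_b) / (len(mask_a) + len(mask_b))

auto = {(0, 0), (0, 1), (1, 0), (1, 1)}        # hypothetical automatic mask
manual = {(0, 1), (1, 0), (1, 1), (2, 1)}      # hypothetical manual mask
print(dice(auto, manual))  # 0.75
```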
Resumo:
In an increasingly complex society, regulatory policies emerge as an important tool in public management. Nevertheless, regulation per se is no longer enough, and the agenda for regulatory reform is growing. In this context, Brazil has implemented Regulatory Impact Analysis (RIA) in its regulatory agencies. Brazilian specificities therefore have to be considered, and in this regard a systematic approach provides a significant contribution. This article aims to raise some critical questions that policy-makers should ask themselves before embarking on the implementation of a RIA system in the Brazilian context. From a long-term perspective, the implementation of RIA must be seen as part of a permanent change in the administrative culture, understanding that RIA should be used as a further resource in the decision-making process rather than as a final solution.
Resumo:
As teachers, we are challenged every day to solve pedagogical problems, and we have to fight for our students' attention in a media-rich world. I will talk about how we use ICT in Initial Teacher Training and give you some insight into what we are doing. The most important benefit of using ICT in education is that it makes us reflect on our practice. There is no doubt that our classrooms need to be updated, but we need to be critical about every piece of hardware, software or service that we bring into them. It is not only because our budgets are short, but also because e-learning is primarily about learning, not technology. Therefore, we need to have the knowledge and skills required to act in different situations, and choose the best tool for the job. Not all subjects are suitable for e-learning, nor do all students have the skills to organize their own study time. Also, not all teachers want to spend time programming or learning about instructional design and metadata. The promised land of easy-to-use authoring tools (e.g. eXe and Reload) that would lead all teachers to become authors of Learning Objects and share these LOs in repositories failed, as HyperCard, Toolbook and others did before. We need to know a little bit about many different technologies so we can mobilize this knowledge when a situation requires it: integrate e-learning technologies in the classroom. Not a flipped classroom, just simple tools: lecture capture, mobile phones and smartphones, pocket-size camcorders, VoIP, VLEs, live video broadcast, screen sharing, and free services for collaborative work and for saving, sharing and syncing your files. Do not feel pressed to use everything, every time. Just because we have a whiteboard does not mean we have to make it the centre of the classroom. Start from where you are, with your preferred subject and the tools you master. Then go slowly and try some new tool in a non-formal situation and with just one or two students.
And you don't need to be alone: subscribe to a mailing list and share your thoughts with other teachers in a dedicated forum, even better if both are part of a community of practice, and share resources. We did that for music teachers and it was a success, reaching 1,000 members in two years. Just do it.
Resumo:
The rapid growth in genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging. Small animal imaging is applied frequently to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows each subject to serve as its own control, reducing inter-animal variability. This makes it possible to perform longitudinal studies on the same animal and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution and image quantification issues could all clearly benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, and the detection of these quanta and particles in different materials, the Monte Carlo method is an important simulation tool in both nuclear medicine research and clinical practice. In order to optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development. The evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation since they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have become an important and indispensable complement for a wide range of problems that cannot be addressed by experimental or analytical approaches.
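The core Monte Carlo idea can be shown with a minimal sketch: sample each photon's free path through an attenuating slab and compare the transmitted fraction with the analytic Beer-Lambert value. The attenuation coefficient and slab thickness below are hypothetical, and this of course ignores the scatter, detector and geometry modelling of a full PET simulator:

```python
import math
import random

# Minimal Monte Carlo sketch of photon transport: photons traverse a
# homogeneous slab, with free path lengths sampled from the exponential
# distribution set by the linear attenuation coefficient mu. The fraction
# that crosses without interacting should approach exp(-mu * d).

def transmitted_fraction(mu, thickness, n_photons, seed=42):
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        # Inverse-transform sampling; 1 - random() lies in (0, 1].
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:      # photon crosses the slab uninteracted
            transmitted += 1
    return transmitted / n_photons

mu, d = 0.096, 10.0               # hypothetical values (per cm, cm)
estimate = transmitted_fraction(mu, d, 100_000)
analytic = math.exp(-mu * d)      # Beer-Lambert prediction
print(f"MC estimate {estimate:.3f} vs analytic {analytic:.3f}")
```

Full PET simulators apply the same sampling principle to every random process in the imaging chain, which is why they are so well suited to generating ground-truth data.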
Resumo:
Energy resource scheduling becomes increasingly important as the use of distributed resources is intensified and massive use of gridable vehicles is envisaged. The present paper proposes a methodology for day-ahead energy resource scheduling for smart grids, considering the intensive use of distributed generation and of gridable vehicles, usually referred to as Vehicle-to-Grid (V2G). This method considers that the energy resources are managed by a Virtual Power Player (VPP), which establishes contracts with V2G owners. It takes into account these contracts, the users' requirements submitted to the VPP, and several discharge price steps. A full AC power flow calculation included in the model allows network constraints to be taken into account. The influence of the following day's requirements on the day-ahead optimal solution is discussed and considered in the proposed model. A case study with a 33-bus distribution network and V2G is used to illustrate the good performance of the proposed method.
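The role of the discharge price steps can be pictured with a toy merit-order sketch: to cover a given energy need, the VPP draws first on the cheapest contracted steps. This is only an illustration of price-step selection; the paper's model is a full optimization with AC power flow, which this sketch does not attempt to reproduce, and all offers below are hypothetical:

```python
# Toy merit-order dispatch of V2G discharge with stepped prices. Network
# constraints and the AC power flow of the actual model are ignored here;
# the offers are hypothetical contracted price steps.

def dispatch(steps, demand_kwh):
    """steps: list of (price_eur_per_kwh, capacity_kwh) V2G offers."""
    plan, remaining = [], demand_kwh
    for price, capacity in sorted(steps):          # cheapest steps first
        take = min(capacity, remaining)
        if take > 0:
            plan.append((price, take))
            remaining -= take
    cost = sum(price * kwh for price, kwh in plan)
    return plan, cost, remaining

# Hypothetical discharge price steps from three contracted vehicles.
offers = [(0.12, 5.0), (0.08, 3.0), (0.10, 4.0)]
plan, cost, unmet = dispatch(offers, 10.0)
print(plan)            # [(0.08, 3.0), (0.1, 4.0), (0.12, 3.0)]
print(round(cost, 2))  # 1.0
```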
Resumo:
Master's in Electrical and Computer Engineering
Resumo:
Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto to obtain the degree of Master in Auditing, under the supervision of Doutora Alcina Augusta de Sena Portugal Dias
Resumo:
Introduction: Myocardial Perfusion Imaging (MPI) is a very important tool in the assessment of Coronary Artery Disease (CAD) patients, and worldwide data demonstrate its increasingly wide use and clinical acceptance. Nevertheless, it is a complex process and quite vulnerable to a number of possible artefacts, some of which seriously affect the overall quality and clinical utility of the obtained data. One of the most inconvenient, yet relatively frequent (20% of cases), artefacts is related to patient motion during image acquisition. Mostly, in those situations, the specific data are evaluated and a decision is made between (A) accepting the results as they are, considering that the "noise" so introduced does not affect the final clinical information too seriously, or (B) repeating the acquisition process. Another possibility is to use the motion correction software provided within the software package included in any current gamma camera. The aim of this study is to compare the quality of the final images obtained after application of motion correction software and after repetition of the image acquisition. Material and Methods: Thirty cases of MPI affected by motion artefacts and subsequently repeated were used. A group of three independent expert Nuclear Medicine clinicians (blinded to the differences of origin) was invited to evaluate the 30 sets of three images, one set for each patient: (A) the original image, motion uncorrected; (B) the original image, motion corrected; and (C) the second acquisition image, without motion. The results so obtained were statistically analysed.
Results and Conclusion: The results demonstrate that the use of the motion correction software is useful essentially when the amplitude of movement is not too large (a precise quantification of this threshold proved hard to define, due to discrepancies between clinicians and other factors, namely differences between one brand and another); when that is not the case and the amplitude of movement is too large, the percentage of agreement between clinicians is much higher and repetition of the examination is unanimously considered indispensable.
Resumo:
The main objective of this dissertation is to estimate the carbon emissions resulting from the activities of Monteiro, Ribas - Embalagens Flexíveis, S.A. Carrying out the greenhouse gas inventory allows Monteiro, Ribas - Embalagens Flexíveis, S.A. to identify its emission sources and quantify its greenhouse gas emissions, making it possible to devise strategies to reduce them. The inventory was based on the Greenhouse Gas Protocol guidelines, following the principles of relevance, completeness, consistency, transparency and accuracy. The adopted methodology uses documented emission factors to calculate greenhouse gas (GHG) emissions. These factors are ratios relating GHG emissions to activity data specific to each emission source. As direct emissions (scope 1), the emissions from the use of natural gas in the boilers, from steam and hot water consumption, and from the company's commercial vehicle were quantified. Scope 2 indirect emissions include those resulting from the electricity consumed. The estimated scope 3 indirect emissions refer, in this case study, to waste transport, employee commuting and business travel. Given the types of emissions identified, a calculation tool was created containing all the emission factor values that can be applied according to the specific characteristics of the activity data for the company's various emission sources. This tool will make it possible, in the future, to refine the emission calculations through better systematization of the available information. This work also identified the need to collect and organize some information complementary to that already available. The base year considered was 2011.
The results obtained show that, in that year, the activities of Monteiro, Ribas - Embalagens Flexíveis, S.A were responsible for the emission of 2968.6 tonnes of CO2e (carbon dioxide equivalent). According to Commission Decision 2007/589/EC of 18 July 2007, Monteiro, Ribas Embalagens e Flexíveis S.A. falls into the category of installations with low emission levels, since its average annual emissions are below 25,000 tonnes of CO2e. The largest share of the estimated emissions (50.7%) comes from electricity consumption (indirect emissions, scope 2), followed by natural gas consumption (direct emissions), which accounts for 39.4% of the emissions. Relating the results obtained to the total production of Monteiro, Ribas - Embalagens Flexíveis, S.A in 2011 yields a value of 0.65 kg of CO2e per kilogram of final product. Some of the identified emission sources were not incorporated into the company's inventory, namely the transport of raw materials and products, because it was not possible to compile the necessary information in time. Although these are scope 3 indirect emissions, considered optional, it is recommended that they be quantified in future work of this kind. The main uncertainties associated with the emission estimates concern the activity data, since this was the first time the company carried out a greenhouse gas inventory. The company holds more specific information on the activity data which it may, in the future, systematize in a way better suited to this use.
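The emission-factor methodology described above (emissions = activity data x emission factor, aggregated per scope) can be sketched as follows. All source names, amounts and factors below are illustrative placeholders, not the Company's documented values:

```python
# Sketch of the GHG Protocol emission-factor calculation: each source's
# emissions are its activity data multiplied by an emission factor, then
# aggregated per scope. Every value below is an illustrative placeholder,
# NOT a documented factor or activity figure from the dissertation.

SOURCES = [
    # (scope, source, activity amount, unit, factor in kg CO2e per unit)
    (1, "natural gas boilers", 120_000, "kWh", 0.20),
    (1, "commercial vehicle", 15_000, "km", 0.17),
    (2, "purchased electricity", 300_000, "kWh", 0.35),
    (3, "business travel", 40_000, "km", 0.15),
]

def inventory(sources):
    """Return total kg CO2e per scope."""
    totals = {}
    for scope, _name, activity, _unit, factor in sources:
        totals[scope] = totals.get(scope, 0.0) + activity * factor
    return totals

totals = inventory(SOURCES)
print({scope: round(kg) for scope, kg in totals.items()})
# {1: 26550, 2: 105000, 3: 6000}
```

A per-product intensity like the 0.65 kg CO2e/kg reported above is then just the scope totals summed and divided by annual production.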
Resumo:
The work in this dissertation focused on the application of Lean methodologies to maintenance in a metalworking company producing moulds, Simoldes Aços. In the current environment, with national and international markets under fierce competition, companies are forced to study methods and techniques that allow them to eliminate waste and reduce costs and production times, while higher quality levels are demanded of the manufactured products with a view to increasing competitiveness. Since Maintenance is a functional area with a high impact on production performance, it is understood that its performance directly influences the behaviour of the production flow and its levels of effectiveness and efficiency. In the course of this master's dissertation, a comprehensive analysis of the current state of the maintenance activity at SIMOLDES SA was carried out, which made it possible to identify the areas and points requiring intervention and to design improvement solutions for the maintenance activity. In the concluding phase of the work, some of these improvement proposals were implemented, while others were scheduled for future implementation. The work was based on the Lean methodology, which plays a relevant role in implementing an integrated approach to the maintenance function in support of production objectives. This project based its implementation strategy on the application of the 5S tool in parallel with TPM (Total Productive Maintenance). Both tools aim to reduce waste and increase process reliability by increasing equipment availability, improving process performance and fully integrating all employees into the manufacturing process.
With the implementation of the proposed improvements, significant gains were observed in the flow of maintenance activities, as well as greater visibility of these activities throughout the production process.
Resumo:
Final Master's project carried out at the Laboratório Nacional de Engenharia Civil (LNEC) to obtain the degree of Master in Civil Engineering from the Instituto Superior de Engenharia de Lisboa, under the cooperation protocol between ISEL and LNEC
Resumo:
Fractional Calculus (FC) goes back to the beginning of the theory of differential calculus. Nevertheless, the application of FC emerged only in the last two decades. The advantages of this mathematical tool in the modelling and control of many dynamical systems have been recognized. With these ideas in mind, this paper discusses a FC perspective in the study of the dynamics and control of several systems. The paper investigates the use of FC in the fields of controller tuning, legged robots, electrical systems and digital circuit synthesis.
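A common route from FC theory to controller tuning and digital circuit synthesis is the Grünwald-Letnikov discretization of the fractional derivative. The sketch below is an illustrative implementation of that standard approximation, not a discretization taken from the paper:

```python
import math

# Gruenwald-Letnikov approximation of the fractional derivative of order
# alpha (lower terminal 0, step h): D^alpha f(t) ~ h^-alpha * sum over k
# of (-1)^k * C(alpha, k) * f(t - k h). Illustrative sketch only.

def gl_derivative(f, t, alpha, h=1e-3):
    """Evaluate D^alpha f at t using the Gruenwald-Letnikov sum."""
    n = int(t / h)
    total, coeff = 0.0, 1.0            # coeff_k = (-1)^k * C(alpha, k)
    for k in range(n + 1):
        total += coeff * f(t - k * h)
        coeff *= (k - alpha) / (k + 1)  # recurrence for the next coefficient
    return total / h ** alpha

# For alpha = 1 this collapses to the backward difference: D^1 t = 1.
print(round(gl_derivative(lambda x: x, 1.0, 1.0), 6))   # 1.0

# Half-derivative of f(t) = t at t = 1: exact value is 1 / Gamma(1.5).
exact = 1 / math.gamma(1.5)
approx = gl_derivative(lambda x: x, 1.0, 0.5)
print(abs(approx - exact) < 0.01)
```

For integer alpha the coefficients terminate and the classical finite difference is recovered, which is why the same recurrence serves both integer and fractional orders in digital implementations.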
Resumo:
Work presented within the scope of the Master's in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering