969 results for Technology software



Abstract:

Neonatal diarrhoea is one of the most significant health problems in the first weeks of a pig's life, causing substantial economic losses through morbidity and mortality. Primary enterocyte culture is a valuable tool for studying pathologies caused by infectious agents that compromise the integrity of the intestinal epithelium. The production of antibodies extracted from the egg yolk of immunised hens (IgY) is an innovative technology that has proven protective against diarrhoea caused by viral and bacterial agents. Nanotechnology makes it possible to improve the efficiency of drug delivery, and carbon nanotubes have gained enormous popularity because of their unique properties and applications; research on the toxicological aspects of these nanoparticles, however, is scarce. Once inside the cell, nanoparticles can induce intracellular oxidative stress by disturbing the oxidative balance. The working hypothesis is that the administration of anti-Escherichia coli IgY via nanotubes will protect enterocytes in vitro and in vivo against E. coli infection, preventing porcine neonatal diarrhoea. The objectives of the work are: to evaluate the protection conferred by an avian anti-E. coli IgY antibody delivered via carbon nanotubes to primary porcine enterocyte cultures subjected to a subsequent E. coli infection; to analyse the side effects of the nanotubes loaded with anti-E. coli IgY on the cytotoxicity, oxidative balance and apoptosis of porcine enterocytes cultured in vitro; and to evaluate the therapeutic action of anti-E. coli IgY administered to pigs and the side effects of nanotube delivery. An in vitro experimental design will be implemented with different culture groups: with nanotubes, with anti-E. coli and non-specific IgY, and with exposure to E. coli. Primary porcine enterocytes will be cultured using an enzymatic dissociation technique with collagenase following the protocol of Bader et al. (2000). Viability will be assessed by the trypan blue test. To obtain the avian anti-E. coli antibody, a total of 3 doses of E. coli (10^9 CFU/ml of adjuvant) will be administered to Leghorn hens under physiological conditions, and eggs will be collected daily. IgY will be purified according to the method of Polson et al. (1985) using PEG 6000, and IgY concentration will be measured by high-sensitivity ELISA. The IgY will be incorporated into nanotubes following the protocol of Acevedo et al. (2006). To analyse the possible side effects of the nanotubes, the following will be evaluated: 1. cytotoxicity by the MTT assay; 2. oxidative stress by the TBARS assay; and 3. apoptosis by the TUNEL assay. In addition, an in vivo experimental design will be implemented to test the therapeutic action of this nutraceutical in weaned piglets and the side effects of nanotube administration: enterocytes will be cultured from piglets previously treated with anti-E. coli IgY delivered via nanotubes, and the techniques described above will be applied. The expected results are: production of an avian anti-E. coli IgY antibody to prevent enterocyte infection; deeper knowledge of the cytotoxic effects of multi-walled carbon nanotubes; and an alternative treatment for porcine enteric diseases.


Abstract:

Identification and characterisation of the problem. One of the most important problems associated with building software is its correctness. In the search for guarantees of correct software behaviour, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. By their nature, formal methods require considerable experience and knowledge, particularly of mathematics and logic, which makes their application costly in practice. As a consequence, their main application has been limited to critical systems, that is, systems whose malfunction can cause severe damage, even though the benefits these techniques provide are relevant to every kind of software. Bringing the benefits of formal methods to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is of great importance. Examples of this are several powerful formal-methods-based analysis tools that are applied directly to source code. In the large majority of these tools, the gap between the notions developers are used to and those required to apply these formal analysis tools is still too wide. Many tools use assertion languages that lie outside the usual knowledge and habits of developers, and in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artefacts to be analysed grows (scalability). This limitation is widely known and is considered critical for the applicability of formal analysis methods in practice. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, it seeks to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools that can be used by developers who are familiar with the application context but not necessarily knowledgeable about the underlying methods or techniques. Materials and methods. The materials to be used will be literature relevant to the area and computing equipment; the methods will be those of discrete mathematics, logic and software engineering. Expected results. One expected result of the project is the identification of specific domains of application for formal analysis methods. It is also expected that the project will produce analysis tools whose usability is adequate for developers without specific training in the formal methods employed. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the goal of increasing software quality and reliability.
A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness, and tackle this problem basically by being based on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex, and require familiarity and experience with the manipulation of mathematical definitions. So, their acceptance by software engineers is rather restricted, and formal methods applications have been confined to critical systems. Nevertheless, it is obvious that the advantages that formal methods provide apply to any kind of software system. It is accepted that appropriate software tool support for formal analysis is essential if one seeks to provide support for software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated in widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important problem for the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that might be exploited to increase the scalability of the corresponding analyses, compared to the general case.
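As an illustration of the kind of SMT-based automated analysis this abstract refers to, the following is a minimal sketch using the Z3 solver's Python bindings (the z3-solver package). The tiny clamp routine and the property checked are invented for the example; they are not part of the project described above.

```python
# Minimal sketch of SMT-based checking with Z3 (pip install z3-solver).
# We encode a tiny clamp() routine symbolically and ask the solver whether
# the postcondition can ever be violated; the example is purely illustrative.
from z3 import Int, If, Implies, And, Not, Solver, sat

x, lo, hi = Int("x"), Int("lo"), Int("hi")

# Symbolic encoding of: result = min(max(x, lo), hi)
clamped = If(If(x < lo, lo, x) > hi, hi, If(x < lo, lo, x))

# Postcondition to verify: lo <= result <= hi whenever lo <= hi.
postcondition = Implies(lo <= hi, And(lo <= clamped, clamped <= hi))

# Search for a counterexample, i.e. an assignment that violates the property.
solver = Solver()
solver.add(Not(postcondition))

if solver.check() == sat:
    print("Counterexample found:", solver.model())
else:
    print("Property holds for all integer inputs.")
```

Running the sketch reports that the property holds; tightening the postcondition to strict inequalities instead produces a concrete counterexample, which is the style of feedback such tools can give developers who are not experts in the underlying logic.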


Abstract:

The increase in computing power nowadays comes from the parallelisation of processing, given the characteristics of the new hardware architectures. Using this hardware appropriately accelerates the execution of algorithms (programs). However, properly converting an algorithm into its parallel form is complex, and that form is, in turn, specific to each type of parallel hardware. The most common general-purpose processors today are multicore parallel processors, also referred to as Symmetric Multi-Processors (SMP). It is now hard to find a desktop processor that does not offer some form of SMP-style parallelism, and the development trend is towards processors with an ever greater number of cores. Video processing devices (Graphics Processor Units, GPU), meanwhile, have increased their computing power by incorporating multiple processing units into their electronics, to the point that it is now common to find GPU boards capable of running 200 to 400 parallel processing threads. These processors are very fast and specialised for the task for which they were developed, mainly video processing. However, because this kind of processing has much in common with scientific computing, these devices have been reoriented under the name General Processing Graphics Processor Unit (GPGPU). Unlike the SMP processors mentioned above, GPGPUs are not general-purpose and are harder to use for general tasks, because of the limited memory available on each board and the kind of parallel processing required for their use to be productive. Programmable logic devices (FPGAs) can perform large numbers of operations in parallel, so they can be used to implement specific algorithms that exploit the parallelism they offer; their drawback is the complexity of programming and testing the algorithm instantiated on the device. Given this diversity of parallel processors, the objective of our work is to analyse the specific characteristics of each of them and their impact on the structure of algorithms, so that their use achieves processing performance commensurate with the resources employed, and to combine them so that they complement each other beneficially. Specifically, starting from the characteristics of the hardware, we aim to determine the properties a parallel algorithm must have in order to be accelerated. The characteristics of the parallel algorithms will in turn determine which of these new types of hardware is best suited for their instantiation. In particular, the level of data dependency, the need for synchronisation during parallel processing, the size of the data to be processed and the complexity of parallel programming on each type of hardware will be taken into account.
Today's advances in high-performance computing are driven by the parallel processing capabilities of available hardware architectures. These architectures enable the acceleration of algorithms when the algorithms are properly parallelised and exploit the specific processing power of the underlying architecture. Most current processors are targeted at general purposes and integrate several processor cores on a single chip, resulting in what is known as a Symmetric Multiprocessing (SMP) unit. Nowadays even desktop computers make use of multicore processors, and the industry trend is to increase the number of integrated processor cores as technology matures. On the other hand, Graphics Processor Units (GPU), originally designed to handle only video processing, have emerged as interesting alternatives for algorithm acceleration. Currently available GPUs are able to run from 200 to 400 threads for parallel processing. Scientific computing can be implemented on this hardware thanks to the programmability of the new GPUs, which have been denoted General Processing Graphics Processor Units (GPGPU). However, GPGPUs offer little memory with respect to that available to general-purpose processors; thus, the implementation of algorithms needs to be addressed carefully. Finally, Field Programmable Gate Arrays (FPGA) are programmable devices which can implement hardware logic with low latency, high parallelism and deep pipelines. These devices can be used to implement specific algorithms that need to run at very high speeds. However, programming them is harder than software approaches and debugging is typically time-consuming. In this context, where several alternatives for speeding up algorithms are available, our work aims at determining the main features of these architectures and developing the know-how required to accelerate algorithm execution on them. We look to identify those algorithms that may fit better on a given architecture, as well as complementing the architectures so that their combined use is beneficial.
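To make the hardware/algorithm-matching argument concrete, here is a small sketch, not taken from the project, of the SMP case: an embarrassingly parallel workload (no data dependencies, no synchronisation during the work) distributed over the available cores with Python's standard multiprocessing module. The kernel function is invented for illustration.

```python
# SMP sketch: a data-parallel workload with no inter-task dependencies, which is
# the kind of algorithm that maps well onto multicore processors. Workloads with
# heavy data dependencies or frequent synchronisation would not speed up this way
# and might fit a different architecture (GPGPU, FPGA) better.
import time
from multiprocessing import Pool, cpu_count

def heavy_kernel(n: int) -> float:
    """Illustrative CPU-bound task: a naive partial sum."""
    return sum(i * i for i in range(n))

def run(tasks, workers):
    start = time.perf_counter()
    with Pool(processes=workers) as pool:
        results = pool.map(heavy_kernel, tasks)   # independent tasks, no locks needed
    return results, time.perf_counter() - start

if __name__ == "__main__":
    tasks = [2_000_000] * 16
    _, t_serial = run(tasks, workers=1)
    _, t_parallel = run(tasks, workers=cpu_count())
    print(f"1 worker: {t_serial:.2f}s, {cpu_count()} workers: {t_parallel:.2f}s")
```

The speed-up observed depends on the number of cores and on how independent the tasks really are, which is precisely the data-dependency and synchronisation analysis the project proposes to carry out for each class of hardware.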


Abstract:

All organisations make some contribution to the degradation of the environment through their use of resources and production of waste. Environmental management systems (EMS) standards can provide a tool for companies to systematically reduce their environmental impacts. ISO 14001 was published in 1996. This fitted in with plans of the case study company to take proactive action in this area, even though there was no legislative requirement for them to do so. As EMS implementation was a new area at the time, appropriate methodologies were developed to address different aspects of the implementation, and ISO 14001 was successfully implemented in the company. The results of the primary research included:
♦ Drawing up a methodology for identifying and interpreting the environmental legislation that may have an impact on the organisation and compiling a register of such regulations.
♦ Developing a robust methodology for assessing significant environmental aspects and impacts and applying this to the software company.
♦ Establishing objectives and targets for those aspects identified as significant and implementing environmental management programmes to meet these.
♦ Developing an internal environmental audit procedure based on auditing against the significant aspects.
♦ Integrating areas of the EMS with the existing quality management system in order to avoid duplication of effort.
♦ Undergoing an external assessment process in order to achieve certification of the system.
The thesis concludes that the systematic approach defined in ISO 14001 provided a mechanism that the organisation was able to adopt to bring about improvement in its environmental performance. The system was based on a thorough evaluation of the organisation's significant environmental aspects in order to bring about a reduction in its negative impacts. The ISO 14001 requirement for continual improvement is the key driver of the system, and this is what differentiates it from ISO 9000.


Abstract:

The sustained economic growth that has been experienced in the Irish economy in recent years has relied, to a large extent, on the contribution and performance of those industry sectors that possess the ability to provide high-value-added products and services to domestic and international markets. One such contributor has been the Technology sector. However, the performance of this sector relies upon the availability of the necessary capabilities and competencies for Technology companies to remain competitive. The Expert Group on Future Skills Needs have forecasted future skills shortages in this sector. The purpose of this research has been to examine the extent to which Irish Technology companies are taking measures to meet changing skills requirements, through training and development interventions. Survey research methods (in the form of a mail questionnaire, supported by a Web-based questionnaire) have been used to collect information on the expenditure on, and approach to, training and development in these companies, in addition to the methods, techniques and tools/aids that are used to support the delivery of these activities. The contribution of Government intervention has also been examined. The conclusions have been varied. When the activities of the responding companies are considered in isolation, the picture to emerge is primarily positive. Although the expenditure on training and development is slightly lower than that indicated in previous studies, the results vary by company size. Technical employees are clearly the key focus of training provision, while Senior Managers and Directors, Clerical and Administrative staff and Manual workers are a great deal more neglected in training provision. Expenditure on, and use of, computer-based training methods is high, as is the use of most of the specified techniques for facilitating learning. However, when one considers the extent to which external support (in the form of Government interventions and cooperation with other companies and with education and training providers) is integrated into the overall training practices of these companies, significant gaps in practice are identified. The thesis concludes by providing a framework to guide future training and development practices in the Technology sector.


Abstract:

This thesis is a continuation of the Enterprise-Ireland Research Innovation Fund (RIF) project entitled "Design and Manufacturing of Customised Maxillo-Facial Prostheses". The primary objective of this Internal Research Development Program (IRDP) project was to investigate two fundamental design changes:
1. To incorporate the over-denture abutments directly into the implant.
2. To remove the restraining wings by the addition of screws, which affix the implant to the dense material of the jawbone.
The prosthetic was redesigned using the ANSYS finite element analysis software and analysed to:
• Reduce the internal von Mises stress distribution. The new prosthetic had a 63.63% lower von Mises stress distribution when compared with the original prosthetic.
• Examine the screw preload effects. A maximum relative displacement of 22.6 × 10^ mm between the bone and screw was determined, which is well below the critical threshold of micromotion that prevents osseointegration.
• Investigate the prosthetic-bone contact interface. Three models of the screw, prosthesis and bone were studied (axisymmetric, quarter volume and full volume); a recommended preload torque of 0.32 Nm was applied to the prosthetic and a maximum von Mises stress of 1.988 MPa was predicted.
• Study the overdenture removal forces. This analysis could not be completed because the correct plastic multilinear properties of the denture material could not be established.
The redesigned prosthetic was successfully manufactured on a 3-axis milling machine with an indexing system. The prosthetic was examined for dimensional quality and strength. The research established the feasibility of the new design and the associated manufacturing method.


Abstract:

This is a study of a state-of-the-art implementation of a new computer integrated testing (CIT) facility within a company that designs and manufactures transport refrigeration systems. The aim was to use state-of-the-art hardware, software and planning procedures in the design and implementation of three CIT systems. Typical CIT system components include data acquisition (DAQ) equipment, application and analysis software, communication devices, computer-based instrumentation and computer technology. It is shown that the introduction of computer technology into the area of testing can have a major effect on such issues as efficiency, flexibility, data accuracy, test quality and data integrity. Findings reaffirm how the overall area of computer integration continues to benefit any organisation, and with more recent advances in computer technology, communication methods and software capabilities, less expensive, more sophisticated test solutions are now possible. This allows more organisations to benefit from the many advantages associated with CIT. Examples of computer integration test set-ups and the benefits associated with computer integration are discussed.
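As a hedged illustration of the kind of computer-integrated test loop described above (not the company's actual system), the sketch below polls an instrument over a serial link with the pyserial package and logs time-stamped readings to a CSV file for later analysis; the port name, query command and reply format are assumptions.

```python
# Illustrative computer-integrated test loop: acquire readings from an
# instrument over a serial link and log them for later analysis.
# Port, query command and reply format are hypothetical placeholders.
import csv
import time
import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"        # assumed port name
QUERY = b"READ?\n"           # assumed instrument query command

def acquire(samples: int, interval_s: float, out_path: str) -> None:
    with serial.Serial(PORT, baudrate=9600, timeout=1) as link, \
         open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["timestamp", "reading"])
        for _ in range(samples):
            link.write(QUERY)                      # request one measurement
            reply = link.readline().decode().strip()
            writer.writerow([time.time(), reply])  # time-stamped raw reading
            time.sleep(interval_s)

if __name__ == "__main__":
    acquire(samples=100, interval_s=0.5, out_path="test_run.csv")
```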


Abstract:

The research described in this thesis was developed as part of the Information Management for Green Design (IMAGREE) Project. The IMAGREE Project was funded by Enterprise Ireland under the Strategic Research Grant Scheme as a partnership project between Galway-Mayo Institute of Technology and CIMRU, University of Galway. The project aimed to develop a CAD-integrated software tool to support environmental information management for design.


Abstract:

The research described in this thesis has been developed as part of the Reliability and Field Data Management for Multi-component Products (REFIDAM) Project. This project was funded under the Applied Research Grants Scheme administered by Enterprise Ireland. The project was a partnership between Galway-Mayo Institute of Technology and Thermo King Europe. The project aimed to develop a system to manage the information required for reliability assessment and improvement of multi-component products, by establishing information flows within the company and information exchange with fleet users.


Abstract:

This project was funded under the Applied Research Grants Scheme administered by Enterprise Ireland. The project was a partnership between Galway-Mayo Institute of Technology and an industrial company, Tyco/Mallinckrodt Galway. The project aimed to develop a semi-automatic, self-learning pattern recognition system capable of detecting defects on printed circuit boards, such as component vacancy, component misalignment, component orientation, component error and component weld. The research was conducted in three directions: image acquisition, image filtering/recognition and software development. Image acquisition studied the process of forming and digitizing images and some fundamental aspects of human visual perception; the importance of choosing the right camera and illumination system for a given type of problem was highlighted. Probably the most important step towards image recognition is image filtering. Filters are used to correct and enhance images in order to prepare them for recognition. Convolution, histogram equalisation, filters based on Boolean mathematics, noise reduction, edge detection, geometrical filters, cross-correlation filters and image compression are some examples of the filters that were studied and successfully implemented in the software application. The software application developed during the research is customized to meet the requirements of the industrial partner. The application is able to analyze pictures, perform the filtering, build libraries, process images and generate log files. It incorporates most of the filters studied and, together with the illumination system and the camera, provides a fully integrated framework able to analyze defects on printed circuit boards.
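The thesis software is custom-built, but the same class of filtering pipeline can be sketched with OpenCV. The sketch below, with hypothetical file names, chains histogram equalisation, noise reduction, edge detection and normalised cross-correlation template matching of the kind listed above.

```python
# Illustrative PCB-inspection filtering pipeline using OpenCV
# (pip install opencv-python). File names are placeholders; the thesis
# used its own customised implementation of these filters.
import cv2

board = cv2.imread("board.png", cv2.IMREAD_GRAYSCALE)          # digitised board image
template = cv2.imread("component.png", cv2.IMREAD_GRAYSCALE)   # reference component patch

# 1. Histogram equalisation to compensate for uneven illumination.
equalised = cv2.equalizeHist(board)

# 2. Noise reduction with a small Gaussian convolution kernel.
smoothed = cv2.GaussianBlur(equalised, (5, 5), 0)

# 3. Edge detection, useful for checking component alignment and orientation.
edges = cv2.Canny(smoothed, 50, 150)

# 4. Normalised cross-correlation against a reference patch: a low peak score
#    suggests a missing or wrong component at the expected location.
scores = cv2.matchTemplate(smoothed, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_location = cv2.minMaxLoc(scores)
print(f"best match {best_score:.2f} at {best_location}")
if best_score < 0.7:   # illustrative threshold
    print("possible component vacancy or component error")
```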


Abstract:

The impending introduction of lead-free solder in the manufacture of electrical and electronic products has presented the electronics industry with many challenges. European manufacturers must transfer from a tin-lead process to a lead-free process by July 2006 as a result of the publication of two directives from the European Parliament. Tin-lead solders have been used for mechanical and electrical connections on printed circuit boards for over fifty years, and considerable process knowledge has been accumulated. Extensive literature reviews were conducted on the topic, and as a result it was found that there are many implications to be considered with the introduction of lead-free solder. One particular question that requires answering is: can lead-free solder be used in existing manufacturing processes? The purpose of this research is to conduct a comparative study of a tin-lead solder and a lead-free solder in two key surface mount technology (SMT) processes: the stencil printing process and the reflow soldering process. Unreplicated fractional factorial experimental designs were used to carry out the studies. The quality of paste deposition, in terms of deposit height and volume, was the characteristic of interest in the stencil printing process. The quality of solder joints produced in the reflow soldering experiment was assessed using X-ray and cross-sectional analysis. This provided qualitative data that was then uniquely scored and weighted using a method developed during the research. Nested experimental design techniques were then used to analyse the resulting quantitative data. Predictive models were developed that allowed for the optimisation of both processes. Results from both experiments show that solder joints of comparable quality to those produced using tin-lead solder can be produced using lead-free solder in current SMT processes.
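As a hedged illustration of how an unreplicated two-level factorial experiment of this kind is analysed (the factors, levels and response values below are invented, not the thesis data), the sketch estimates the main effect of each printing factor on deposit volume directly from the coded design matrix.

```python
# Illustrative analysis of an unreplicated 2^3 two-level factorial experiment:
# estimate the main effect of each factor on paste deposit volume.
# Factor names and response values are invented for the example.
import pandas as pd

# Coded design matrix (-1 = low level, +1 = high level) and a fictitious response.
runs = pd.DataFrame({
    "squeegee_pressure": [-1,  1, -1,  1, -1,  1, -1,  1],
    "print_speed":       [-1, -1,  1,  1, -1, -1,  1,  1],
    "separation_speed":  [-1, -1, -1, -1,  1,  1,  1,  1],
    "deposit_volume":    [82, 88, 79, 86, 84, 91, 80, 87],  # e.g. % of aperture volume
})

# Main effect = mean response at the high level minus mean response at the low level.
for factor in ["squeegee_pressure", "print_speed", "separation_speed"]:
    high = runs.loc[runs[factor] == 1, "deposit_volume"].mean()
    low = runs.loc[runs[factor] == -1, "deposit_volume"].mean()
    print(f"{factor:20s} main effect: {high - low:+.2f}")
```

In an unreplicated design of this kind, higher-order interactions are typically pooled to estimate error, or the effects are screened on a normal probability plot, before building predictive models like those mentioned above.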


Abstract:

BACKGROUND: The standardisation of the image format used in medicine was accomplished in 1993 through the DICOM (Digital Imaging and Communications in Medicine) standard. Many types of examination use this standard, and software capable of handling this kind of image is increasingly required; however, such software is usually not free and open source, which makes it difficult to adapt it to different needs. OBJECTIVE: To develop and validate free, open-source software capable of handling DICOM images from coronary CT angiography examinations. METHODS: We developed and tested software named ImageLab in the evaluation of 100 examinations selected at random from a database. A total of 600 analyses were performed by two observers using ImageLab and another software package distributed with Philips Brilliance computed tomography scanners, assessing the presence of coronary lesions and plaques in the left main coronary artery (LMCA) and left anterior descending artery (LAD) territories. To evaluate intra-observer, inter-observer and inter-software agreement, simple agreement and the Kappa statistic were used. RESULTS: The agreement observed between the software packages was generally classified as substantial or almost perfect in most comparisons. CONCLUSION: ImageLab agreed with the Philips software in the evaluation of coronary CT angiography examinations, especially in patients without lesions, with lesions of less than 50% in the LMCA and less than 70% in the LAD. Agreement for lesions >70% in the LAD was lower, but this is also observed when the anatomical reference standard is used.
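For readers unfamiliar with the two building blocks mentioned above, here is a minimal sketch (not ImageLab's code) that reads a DICOM file with the pydicom library and quantifies inter-observer agreement with Cohen's Kappa via scikit-learn; the file name and the observer ratings are invented for illustration.

```python
# Minimal sketch of the two ingredients discussed above: reading a DICOM image
# and quantifying inter-observer agreement with the Kappa statistic.
# The file name and the observer ratings are invented for illustration.
import pydicom                                  # pip install pydicom
from sklearn.metrics import cohen_kappa_score   # pip install scikit-learn

# Read one slice of a coronary CT angiography study (hypothetical file).
ds = pydicom.dcmread("cta_slice.dcm")
pixels = ds.pixel_array                         # raw image matrix for further processing
print(ds.Modality, pixels.shape)

# Lesion grades assigned by two observers to the same 10 segments
# (0 = no lesion, 1 = <50%, 2 = 50-70%, 3 = >70% stenosis).
observer_1 = [0, 0, 1, 2, 0, 3, 1, 0, 2, 0]
observer_2 = [0, 0, 1, 2, 1, 3, 1, 0, 2, 0]

kappa = cohen_kappa_score(observer_1, observer_2)
print(f"Cohen's Kappa: {kappa:.2f}")            # 0.81-1.00 is 'almost perfect' on the Landis-Koch scale
```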