965 results for "Task technology fit"
Abstract:
Personalized tissue engineering and regenerative medicine (TERM) therapies propose patient-oriented, effective solutions that consider individual needs. Cell-based therapies, for example, may benefit from cell sources that enable easier autologous set-ups, or from recent developments in iPS cell technologies, towards effective personalized therapeutics. Furthermore, the customization of scaffold materials to fit a patient's tissue defect perfectly through rapid prototyping technologies, also known as 3D printing, is now a reality. Nevertheless, the time needed to expand cells or to obtain functional in vitro tissue substitutes prior to implantation prevents advancement towards routine use on demand. Thus, personalized therapies also anticipate the importance of creating off-the-shelf solutions to enable immediately available tissue-engineered products. This paper reviews the main recent developments and future challenges in enabling personalized TERM approaches and bringing these technologies closer to clinical application.
Abstract:
Master's thesis. Integrated Cycle of Studies Leading to the Degree of Master in Architecture. Specialization area: Construction and Technology.
Abstract:
Aromatic amines are widely used industrial chemicals; their major sources in the environment include several chemical industry sectors such as oil refining, synthetic polymers, dyes, adhesives, rubber, perfumes, pharmaceuticals, pesticides and explosives. They also result from diesel exhaust, the combustion of wood chips and rubber, and tobacco smoke. Some aromatic amines are generated during cooking, especially of grilled meat and fish. The intensive use and production of these compounds explains their occurrence in the environment, in air, water and soil, thereby creating a potential for human exposure. Since aromatic amines are potentially carcinogenic and toxic agents, they constitute an important class of environmental pollutants of enormous concern, whose efficient removal is a crucial task for researchers, and several methods have been investigated and applied. In this chapter the types and general properties of aromatic amine compounds are reviewed. As aromatic amines continuously enter the environment from various sources and have been designated high-priority pollutants, their presence in the environment must be monitored at concentration levels below 30 mg L-1, compatible with the limits allowed by regulations. Consequently, the most relevant analytical methods for detecting aromatic amines in environmental matrices, and for monitoring their degradation, are essential and will be presented. These include spectroscopy, namely UV/visible and Fourier transform infrared spectroscopy (FTIR); chromatography, in particular thin-layer (TLC), high-performance liquid (HPLC) and gas chromatography (GC); capillary electrophoresis (CE); mass spectrometry (MS); and combinations of different methods, including GC-MS, HPLC-MS and CE-MS. Choosing the best method depends on availability, cost, detection limit and sample concentration; samples sometimes need to be concentrated or pretreated.
However, combined methods may give more complete results based on complementary information. The environmental impact, toxicity and carcinogenicity of many aromatic amines have been reported and are also emphasized in this chapter. Finally, conventional aromatic amine degradation processes and alternative biodegradation processes are highlighted. Parameters affecting biodegradation, the role of different electron acceptors in aerobic and anaerobic biodegradation, and kinetics are discussed. Conventional processes, including extraction, adsorption onto activated carbon, chemical oxidation, advanced oxidation, electrochemical techniques and irradiation, suffer from drawbacks such as high costs, formation of hazardous by-products and low efficiency. Biological processes, which take advantage of processes occurring naturally in the environment, have been developed and tested, and have proved to be an economical, energy-efficient and environmentally feasible alternative. Aerobic biodegradation is one of the most promising techniques for aromatic amine remediation, but it has the drawback that aromatic amines may autoxidize, rather than degrade, once exposed to oxygen. Higher costs, especially due to the power consumed for aeration, can also limit its application. Anaerobic degradation is a novel route for treating a wide variety of aromatic amines, including those in industrial wastewater, and will be discussed. However, some aromatic amines are difficult to degrade under anaerobic conditions, and thus other electron acceptors, such as nitrate, iron, sulphate, manganese and carbonate, have been tested as alternatives.
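Monitoring at the regulatory concentration levels mentioned above requires knowing a method's detection limit. One common estimate from a linear calibration curve is LOD = 3.3·s/slope, where s is the residual standard error of the fit; the sketch below uses hypothetical HPLC calibration data, not measurements from this chapter:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def lod(xs, ys):
    """Detection limit 3.3*s/slope, with s the residual standard error."""
    slope, intercept = linear_fit(xs, ys)
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    s = (sum(r * r for r in resid) / (len(xs) - 2)) ** 0.5
    return 3.3 * s / slope

# Hypothetical calibration: analyte concentration (mg/L) vs. peak area.
conc = [0.5, 1.0, 2.0, 5.0, 10.0]
area = [12.0, 24.5, 49.0, 124.0, 248.0]
print(round(lod(conc, area), 3))   # detection limit in mg/L
```

The same residual-based estimate applies to any of the instrumental methods listed above, provided the calibration response is linear over the working range.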
Abstract:
The Paternal Adjustment and Paternal Attitudes Questionnaire (PAPA) was designed to assess paternal adjustment and paternal attitudes during the transition to parenthood. This study aimed to examine the psychometric characteristics of the Portuguese versions of the PAPA-Antenatal (PAPA-AN) and PAPA-Postnatal (PAPA-PN). A nonclinical sample of 128 fathers was recruited in an obstetrics outpatient unit; they completed both versions of the PAPA and self-report measures of depressive and anxiety symptoms during pregnancy and the postpartum period, respectively. Good internal consistency was found for both the PAPA-AN and the PAPA-PN, and for their subscales. A three-factor model was found for both versions of the instrument, and longitudinal confirmatory factor analysis revealed a good model fit. Significant associations were found between the PAPA (PAPA-AN and PAPA-PN) and depressive and anxiety symptoms, suggesting good criterion validity. Both versions also showed good clinical validity, with optimal cutoffs identified. The present study suggests that the Portuguese versions of the PAPA are reliable multidimensional self-report measures of paternal adjustment and paternal attitudes that could be used to identify fathers with adjustment problems and negative attitudes during the transition to parenthood.
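The internal-consistency results reported for questionnaires like the PAPA are typically computed as Cronbach's alpha. A minimal sketch with made-up item responses (hypothetical data, not the PAPA's):

```python
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    def var(xs):                      # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    totals = [sum(col) for col in zip(*items)]   # total score per respondent
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Hypothetical responses: 3 items answered by 5 fathers on a 1-5 scale.
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
print(round(cronbach_alpha(items), 2))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the kind of threshold a validation study such as this one reports per subscale.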
Abstract:
Master's dissertation in Portuguese as a Non-Native Language (PLNM): Portuguese as a Foreign Language (PLE) and Portuguese as a Second Language (PL2).
Abstract:
Neonatal diarrhoea is one of the most significant health problems in the first weeks of a piglet's life, causing substantial economic losses through morbidity and mortality. Primary enterocyte culture is a valuable tool for studying pathologies caused by infectious agents that compromise the integrity of the intestinal epithelium. The production of antibodies extracted from the egg yolk of immunized hens (IgY) is an innovative technology that has proven protective against diarrhoea caused by viral and bacterial agents. Nanotechnology improves the efficiency of drug delivery; carbon nanotubes in particular have gained enormous popularity for their unique properties and applications, although research on the toxicological aspects of these nanoparticles is scarce. Once inside the cell, nanoparticles can induce intracellular oxidative stress by disturbing the oxidative balance. The working hypothesis is that administering anti-Escherichia coli IgY via nanotubes will protect enterocytes, in vitro and in vivo, against E. coli infection, thereby preventing porcine neonatal diarrhoea. The objectives are: to evaluate the protection conferred by an avian anti-E. coli IgY antibody delivered by carbon nanotubes to primary porcine enterocyte cultures subsequently infected with E. coli; to analyse the side effects of the IgY-loaded nanotubes on cytotoxicity, oxidative balance and apoptosis of porcine enterocytes cultured in vitro; and to evaluate the therapeutic action of anti-E. coli IgY administered to pigs, together with the side effects of nanotube delivery. An in vitro experimental design will be implemented with different culture groups receiving nanotubes, specific or non-specific IgY, and exposure to E. coli.
Primary porcine enterocytes will be cultured by enzymatic disaggregation with collagenase following the protocol of Bader et al. (2000), with viability assessed by trypan blue exclusion. To obtain the avian anti-E. coli antibody, three doses of E. coli (10^9 CFU/ml of adjuvant) will be administered to Leghorn hens under physiological conditions, and eggs will be collected daily. IgY will be purified by the method of Polson et al. (1985) using PEG 6000, and its concentration measured by high-sensitivity ELISA. The IgY will be incorporated into nanotubes following the protocol of Acevedo et al. (2006). Possible side effects of the nanotubes will be evaluated by: 1. cytotoxicity (MTT assay); 2. oxidative stress (TBARS assay); and 3. apoptosis (TUNEL assay). In addition, an in vivo experimental design will test the therapeutic action of this nutraceutical in weaned piglets and the side effects of nanotube delivery: enterocytes from piglets previously treated with anti-E. coli IgY administered via nanotubes will be cultured and assessed with the techniques described above. Expected outcomes are: production of an avian anti-E. coli IgY antibody to prevent enterocyte infection; deeper knowledge of the cytotoxic effects of multi-walled carbon nanotubes; and an alternative treatment for porcine enteric diseases.
Abstract:
Advances in computing power today come from parallel processing, given the characteristics of new hardware architectures. Using this hardware appropriately accelerates the execution of algorithms (programs). However, properly converting an algorithm into its parallel form is complex, and that parallel form is, in turn, specific to each type of parallel hardware. The most common general-purpose processors today are multicore parallel processors, also called Symmetric Multi-Processors (SMP). It is now hard to find a desktop processor without some form of SMP parallelism, and the development trend is towards processors with ever larger core counts. Graphics Processing Units (GPUs), in turn, have grown their computing power by integrating multiple processing units into their electronics, to the point that boards supporting 200 to 400 parallel processing threads are now common. These processors are very fast and specialized for the task they were designed for, mainly video processing. However, because this kind of processing has much in common with scientific computing, these devices have been reoriented under the name General-Purpose Graphics Processing Unit (GPGPU). Unlike the SMP processors noted above, GPGPUs are not general purpose: their use is constrained by the limited memory available on each board and by the type of parallel processing required to use them productively.
Field Programmable Gate Arrays (FPGAs) are programmable-logic devices capable of performing large numbers of operations in parallel, so they can be used to implement specific algorithms while exploiting the parallelism they offer. Their drawback derives from the complexity of programming and testing the algorithm instantiated on the device. Given this diversity of parallel processors, our work focuses on analysing the specific characteristics of each of them, and their impact on the structure of algorithms, so that their use yields processing performance commensurate with the resources employed, and on combining them so that they complement one another beneficially. Specifically, starting from the hardware characteristics, we determine the properties a parallel algorithm must have in order to be accelerated; the characteristics of the parallel algorithms will in turn determine which of these new hardware types is best suited for their implementation. In particular, we consider the degree of data dependency, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
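The performance-versus-resources question raised here is often first approximated with Amdahl's law, which bounds the speedup obtainable when only part of a program parallelizes. A minimal sketch with illustrative numbers, not figures from the thesis:

```python
def amdahl_speedup(parallel_fraction, n_workers):
    """Upper bound on speedup when only parallel_fraction of the
    runtime can be spread across n_workers processing units."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# A program that is 90% parallelizable, run on a hypothetical
# 400-thread GPU-like device: the residual serial 10% caps the gain.
print(round(amdahl_speedup(0.90, 400), 1))
```

The bound explains why data dependencies and synchronization, which grow the serial fraction, matter more than raw thread count when matching an algorithm to SMP, GPGPU or FPGA hardware.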
Abstract:
The purpose of this research is to examine the main economic, legislative, and socio-cultural factors that are currently influencing the pub trade in Ireland and their specific impact on a sample of publicans in both Galway city and county. In approaching this task the author engaged in a comprehensive literature review on the origin, history and evolution of the Irish pub; examined the socio-cultural and economic role of the public house in Ireland and developed a profile of the Irish pub by undertaking a number of semi-structured interviews with pub owners from the area. In doing so, the author obtained the views and opinions of the publicans on the current state of their businesses, the extent to which patterns of trade have changed over recent years, the challenges and factors currently influencing their trade, the actions they believed to be necessary to promote the trade and address perceived difficulties and how they viewed the future of the pub business within the framework of the current regulatory regime. In light of this research, the author identified a number of key findings and put forward a series of recommendations designed to promote the future success and development of the pub trade in Ireland. The research established that public houses are currently operating under a very unfavourable regulatory framework that has resulted in the serious decline of the trade over the last decade. This decline appears to have coincided initially with the introduction of the ban on smoking in the workplace and was exacerbated further by the advent of more severe drink-driving laws, especially mandatory breath testing. Other unfavourable conditions include the high levels of excise duty, value added tax and local authority commercial rates.
In addition to these regulatory factors, the research established that a major impediment to the pub trade is unfair competition from supermarkets and other off-licence retail outlets, especially the phenomenon of below-cost selling of alcohol. The recession has also been a major contributory factor to the decline in the trade, as has the trend towards lifestyle changes and home drinking, mirroring the practice in some continental European countries.
Abstract:
The sustained economic growth that has been experienced in the Irish economy in recent years has relied, to a large extent, on the contribution and performance of those industry sectors that possess the ability to provide high-value-added products and services to domestic and international markets. One such contributor has been the Technology sector. However, the performance of this sector relies upon the availability of the necessary capabilities and competencies for Technology companies to remain competitive. The Expert Group on Future Skills Needs has forecast future skills shortages in this sector. The purpose of this research has been to examine the extent to which Irish Technology companies are taking measures to meet changing skills requirements, through training and development interventions. Survey research methods (in the form of a mail questionnaire, supported by a Web-based questionnaire) have been used to collect information on the expenditure on, and approach to, training and development in these companies, in addition to the methods, techniques and tools/aids that are used to support the delivery of these activities. The contribution of Government intervention has also been examined. The conclusions have been varied. When the activities of the responding companies are considered in isolation, the picture to emerge is primarily positive. Although the expenditure on training and development is slightly lower than that indicated in previous studies, the results vary by company size. Technical employees are clearly the key focus of training provision, while Senior Managers and Directors, Clerical and Administrative staff and Manual workers are a great deal more neglected. Expenditure on, and use of, computer-based training methods is high, as is the use of most of the specified techniques for facilitating learning.
However, when one considers the extent to which external support (in the form of Government interventions and cooperation with other companies and with education and training providers) is integrated into the overall training practices of these companies, significant gaps in practice are identified. The thesis concludes by providing a framework to guide future training and development practices in the Technology sector.
Abstract:
The purpose of this study was to evaluate the determinism of the AS-Interface network and the three main families of control systems that may use it, namely PLC, PC and RTOS. During the course of this study the PROFIBUS and Ethernet field-level networks were also considered, in order to ensure that they would not introduce unacceptable latencies into the overall control system. This research demonstrated that an incorrectly configured Ethernet network introduces unacceptable latencies of variable duration into the control system; thus care must be exercised if the determinism of a control system is not to be compromised. This study introduces a new concept of using statistics and process capability metrics, in the form of Cpk values, to specify how suitable a control system is for a given control task. The PLC systems that were tested demonstrated extremely deterministic responses, but when a large number of iterations was introduced in the user program, the mean control-system latency was much too great for an AS-I network. Thus the PLC was found to be unsuitable for an AS-I network if a large, complex user program is required. The PC systems that were tested were non-deterministic and had latencies of variable duration. These latencies became extremely exaggerated when a graphing ActiveX control was included in the control application. These PC systems also exhibited a non-normal frequency distribution of control-system latencies, and as such are unsuitable for implementation with an AS-I network. The RTOS system that was tested overcame the problems identified with the PLC systems and produced an extremely deterministic response, even when a large number of iterations was introduced in the user program, and it is capable of providing a suitably deterministic control-system response even when an extremely large, complex user program is required.
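The Cpk-based suitability metric described above can be illustrated with a short sketch. The latency samples and specification limits below are hypothetical stand-ins, not the thesis's measurements:

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: distance from the sample mean to the
    nearest specification limit, in units of three standard deviations."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)          # sample standard deviation
    return min(usl - mean, mean - lsl) / (3 * sd)

# Hypothetical control-loop latencies in milliseconds.
latencies = [4.9, 5.1, 5.0, 5.2, 4.8, 5.0, 5.1, 4.9]

# Hypothetical spec: the control task tolerates latencies of 0-10 ms.
print(round(cpk(latencies, lsl=0.0, usl=10.0), 2))
```

A tight, deterministic latency distribution (small sd) yields a high Cpk, which is how an RTOS-style response scores well on this metric; note that the index assumes roughly normal data, which is why the non-normal PC latency distributions reported above disqualify it.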
Abstract:
The impending introduction of lead-free solder in the manufacture of electrical and electronic products has presented the electronics industry with many challenges. European manufacturers must transfer from a tin-lead process to a lead-free process by July 2006 as a result of the publication of two directives from the European Parliament. Tin-lead solders have been used for mechanical and electrical connections on printed circuit boards for over fifty years, and considerable process knowledge has been accumulated. Extensive literature reviews were conducted on the topic, revealing many implications to be considered with the introduction of lead-free solder. One particular question that requires answering is: can lead-free solder be used in existing manufacturing processes? The purpose of this research is to conduct a comparative study of a tin-lead solder and a lead-free solder in two key surface mount technology (SMT) processes: the stencil printing process and the reflow soldering process. Unreplicated fractional factorial experimental designs were used to carry out the studies. The quality of paste deposition, in terms of height and volume, was the characteristic of interest in the stencil printing process. The quality of solder joints produced in the reflow soldering experiment was assessed using X-ray and cross-sectional analysis. This provided qualitative data that was then uniquely scored and weighted using a method developed during the research. Nested experimental design techniques were then used to analyse the resulting quantitative data. Predictive models were developed that allowed for the optimisation of both processes. Results from both experiments show that solder joints of comparable quality to those produced using tin-lead solder can be produced using lead-free solder in current SMT processes.
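An unreplicated two-level fractional factorial design of the kind used in these experiments can be generated programmatically. A minimal sketch of a generic 2^(4-1) half-fraction, where the fourth factor is aliased with the three-way interaction of the others (the factors here are placeholders, not the thesis's actual printing or reflow parameters):

```python
from itertools import product

def half_fraction(n_factors):
    """2^(n-1) fractional factorial design: enumerate the full factorial
    in the first n-1 factors, then set the last factor to their product
    (defining relation: last factor = product of all the others)."""
    runs = []
    for levels in product((-1, 1), repeat=n_factors - 1):
        last = 1
        for level in levels:
            last *= level
        runs.append(levels + (last,))
    return runs

design = half_fraction(4)      # 8 runs instead of the full 16
for run in design:
    print(run)                 # e.g. (-1, -1, -1, -1) = all factors low
```

Halving the run count like this is what makes an unreplicated screening study affordable, at the cost of confounding each main effect with a higher-order interaction.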
Abstract:
In Ireland, although flatfish form a valuable fishery, little is known about the smallest species, the dab Limanda limanda. In this study, a variety of parameters of reproductive development, including ovarian phase description, gonadosomatic index (GSI), hepatosomatic index (HSI), relative condition (Kn) and oocyte size, were analysed to provide information on the dab's reproductive cycle and spawning periods. Samples were collected monthly over an 18-month period using bottom trawls off the Irish coastline. A six-phase macroscopic guide was developed for both sexes of dab and verified using histology. In comparisons of macroscopic and microscopic phases, there was high agreement for the proposed female guide (86%), with agreement for males markedly lower (62%). No significant bias was observed between the two reproductive methods. When the male macroscopic guide was examined, misclassification was high in phase 5 (41%), with 96% of misclassifications occurring in adjacent phases. The sampled population was primarily composed of females, with a female-to-male ratio of 1:0.6, although the predominance of females was less noticeable during the reproductive season. Oocyte growth in dab follows asynchronous development, and spawning occurs over a protracted period, indicating a batch-spawning strategy. Spawning occurred mainly in early spring, with total regeneration of gonads by May. The length at which 50% of the population was reproductively mature was 14 cm and 17 cm for male and female dab, respectively. Precision and bias in age determinations using whole otoliths were investigated using six age readers from various institutions. Low levels of precision were obtained (CV: 10-23%), implying the need for an alternative methodology. Precision and bias were influenced by the level of experience of the reader, with ageing error attributed to interpretative differences and difficulty in edge determination.
Sectioned otolith age determinations were subsequently compared to whole otolith age determinations using two age readers experienced in dab ageing. Although precision for whole otoliths improved on previous estimates (CV = 0%, APE = 0%), sectioned otoliths were used for the growth models, based on multinomial logistic regression on age-length keys developed using both ageing methods. Biological data (length and age) for both sexes were fitted to four growth models, with the Akaike information criterion and multi-model inference indicating that the logistic model best fitted the collected data. In general, female dab attained a greater length than males, with growth rates significantly different between the two sexes. Length-weight relationships between the two sexes were also significantly different.
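The model-selection step described above, ranking candidate growth curves by the Akaike information criterion, can be sketched as follows. The length-at-age data and the fitted parameter values here are illustrative assumptions, not the study's measurements:

```python
import math

def aic(rss_value, n, k):
    """Least-squares form of AIC: n*ln(RSS/n) + 2k, k fitted parameters."""
    return n * math.log(rss_value / n) + 2 * k

def rss(model, params, ages, lengths):
    """Residual sum of squares of a growth model against the data."""
    return sum((l - model(a, *params)) ** 2 for a, l in zip(ages, lengths))

# Two candidate growth curves (parameters assumed fitted elsewhere).
def von_bertalanffy(t, linf, k, t0):
    return linf * (1 - math.exp(-k * (t - t0)))

def logistic(t, linf, k, t0):
    return linf / (1 + math.exp(-k * (t - t0)))

ages = [1, 2, 3, 4, 5, 6]                       # years (illustrative)
lengths = [3.0, 6.5, 11.5, 15.0, 16.8, 17.5]    # cm (illustrative)

models = {
    "von Bertalanffy": (von_bertalanffy, (19.0, 0.45, 0.0)),
    "logistic": (logistic, (18.0, 1.0, 2.5)),
}
for name, (model, params) in models.items():
    print(name, round(aic(rss(model, params, ages, lengths),
                          len(ages), len(params)), 2))
```

The model with the lower AIC is preferred; with this sigmoid-shaped toy data the logistic curve wins, mirroring the study's conclusion, and AIC differences across all candidates give the multi-model-inference weights.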
Abstract:
Thalamus, thalamocortical relay neurons, TASK channels, two-pore K+ channels, HCN channels, halothane, muscarine, bupivacaine, spermine, computer modelling
Abstract:
Category, frequency contour, monkey, auditory cortex, neuron, spike