937 results for Best Possible Medication History (BPMH)
Abstract:
This paper describes our participation in the PAN 2014 author profiling task. Our aim was to define, develop and evaluate a simple machine learning classifier able to guess the gender and the age of a given user based on his/her texts, which could become part of the company's solution portfolio. We were interested not in finding the best possible classifier, the one that achieves the highest accuracy, but in finding the optimum balance between accuracy and throughput, using the simplest strategy with the fewest dependencies on external systems. Results show that our software, which uses multinomial Naive Bayes over a term vector model representation of the text, ranks quite well among the other participants in terms of accuracy.
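As a rough illustration of the classifier family named above, here is a minimal sketch assuming scikit-learn; the toy texts, labels and parameters are invented placeholders, not the authors' actual PAN 2014 system.

    # Minimal sketch of a multinomial Naive Bayes classifier over a term-vector
    # (bag-of-words) representation, in the spirit of the approach described above.
    # The texts, labels and parameters are invented placeholders.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    texts = ["love shopping with my friends", "the match last night was great",
             "my grandchildren visited me today", "studying for my exams all week"]
    genders = ["female", "male", "female", "male"]  # toy labels, not real data

    X_train, X_test, y_train, y_test = train_test_split(
        texts, genders, test_size=0.5, random_state=0)

    # Term-vector model (token counts) feeding a multinomial Naive Bayes classifier.
    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))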
Abstract:
This Bachelor's Final Project addresses a sustainable mobility problem in the municipality of Madrid. Using the geospatial analysis tools of Geographic Information Systems (GIS), it seeks solutions for extending the network of supply stations for alternative fuels such as Liquefied Petroleum Gas (LPG), Compressed Natural Gas (CNG) and electricity. The results determine possible locations for the new stations according to specific criteria that depend on the type of fuel. These solutions aim to meet the measures imposed by the European directives on Smart Cities. In addition, the project demonstrates the management capabilities of GIS in urban areas and their possible applications.
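As an illustration of the kind of GIS screening step such a study might involve, here is a minimal sketch assuming geopandas; the layer names, the 500 m radius and the coordinate system choice are assumptions for illustration, not the project's actual criteria.

    # Sketch of a simple GIS screening step: keep candidate sites that lie farther
    # than a minimum distance from the existing supply stations. The layer names,
    # the 500 m radius and the CRS choice are illustrative assumptions only.
    import geopandas as gpd

    candidates = gpd.read_file("candidate_sites.gpkg")   # hypothetical layer
    stations = gpd.read_file("existing_stations.gpkg")   # hypothetical layer

    # Work in a metric CRS so distances are in metres (EPSG:25830 covers Madrid).
    candidates = candidates.to_crs(epsg=25830)
    stations = stations.to_crs(epsg=25830)

    # Union of 500 m buffers around the existing stations.
    covered = stations.buffer(500).unary_union

    # Candidates outside the covered area are potential new locations.
    new_locations = candidates[~candidates.intersects(covered)]
    print(len(new_locations), "candidate locations remain after screening")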
Abstract:
Formula Racing Team Manager (FRTM) is a single-player game for Android in which the player's main objective is to rise from the initial fifth division to the first and, once there, win the team championship against 19 teams managed by the system. Along the way the player has to manage a wide range of tasks, from running the team itself to the strategic management of the races. Sound economic management is essential to reach that objective: the main source of income is the sponsors, of which up to four can be held simultaneously. The money earned is spent on improving the team (staff, car and drivers) as much as possible in order to achieve better race results. A large number of circuits is available, all of them real, combining historical circuits from the Formula 1 calendar with current ones and with circuits that are popular in other categories (notably the inclusion of endurance races such as the Indianapolis 500 and the 24 Hours of Le Mans). Understanding the parameters of each circuit is essential to achieving a good result at all of them. The season is divided into 20 Grands Prix, each consisting of three sessions (free practice, qualifying and race). In free practice the player may, for two hours, run as many laps as desired in order to find the best possible car setup and to gather the fuel-consumption and wear data needed for the race. The qualifying session (split into three rounds) decides the starting grid for the race. Before the race the player must choose a strategy, selecting the car setup, the tyre compounds and the fuel loads for each pit stop. During the race the player may also change certain parameters if the race situation does not match expectations, retaining full control of events on track, as a sporting director of a real Formula 1 team would. Laps are simulated as the player requests them through an on-screen playback control. After the race, the economic management of the team takes over again: the player must monitor the wear of each of the ten different car parts to avoid failures, and may once more train drivers and staff. The game is available in both Spanish and English.
Abstract:
Being able to classify accurately the application that generates each of the traffic flows on a network gives companies and organisations a useful tool for managing network resources, as well as the possibility of blocking or prioritising specific traffic, seeking an optimal use of the available bandwidth. The proliferation of new applications and new techniques has made it harder to rely on the well-known values, such as the TCP and UDP port numbers assigned by the IANA (Internet Assigned Numbers Authority), to detect those applications. P2P (peer-to-peer) networks, the use of unregistered or random ports, and the masquerading of many applications' traffic as HTTP and HTTPS in order to traverse firewalls and NATs (Network Address Translation), among other practices, create the need for new traffic-detection methods. The aim of this study is to develop a set of practices that perform this task with techniques that go beyond the observation of ports and other well-known values. Several methodologies exist: Deep Packet Inspection (DPI), which searches for signatures, i.e. patterns built from the content of the packets, including the payload, that characterise each application; machine learning on flow parameters, which uses statistical analysis to determine the application to which a flow is likely to belong; and, finally, more heuristic techniques based on intuition or the researcher's own knowledge of network traffic. Specifically, the study proposes combining some of these techniques with data-mining methods, namely Principal Component Analysis (PCA) and clustering of statistics extracted from the flows contained in network traffic capture files. This involves configuring several parameters through an iterative trial-and-error process until a reliable traffic classification is reached. The ideal result would be one in which each application present in the traffic is identified in a separate cluster, or in clusters that group applications of a similar nature. To this end, traffic captures are created in a controlled environment, with each capture labelled with its corresponding application, and the flows are then extracted from those captures. Selected parameters of the packets belonging to each flow, such as the arrival timestamp or the length in octets of the IP packet, are obtained and loaded into a MySQL database, where each flow is associated with its application and used to compute the statistics that describe it as well as possible for the subsequent data-mining step. PCA and clustering are then applied to these statistics using the RapidMiner software. Finally, the results are compared with the real classification of the flows stored in the database by means of a confusion matrix, which allows the accuracy of the developed classification process to be assessed.
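As a rough sketch of the PCA-plus-clustering evaluation described above, the following example uses scikit-learn in place of RapidMiner; the per-flow statistics and application labels are synthetic placeholders, not data from this study.

    # Sketch of the PCA + clustering + confusion-matrix evaluation described above,
    # using scikit-learn in place of RapidMiner; the per-flow statistics and the
    # application labels are synthetic placeholders.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(0)
    # Synthetic statistics per flow (e.g. mean packet length, mean inter-arrival
    # time), drawn around a different centre for each "application".
    flows = np.vstack([rng.normal(loc, 1.0, size=(50, 4)) for loc in (0.0, 3.0, 6.0)])
    true_app = np.repeat([0, 1, 2], 50)  # known application label of each flow

    # Standardise, project onto the first two principal components, then cluster.
    reduced = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(flows))
    clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(reduced)

    # Cluster numbering is arbitrary; the confusion matrix shows how the clusters
    # map onto the known applications, which is how the classification is assessed.
    print(confusion_matrix(true_app, clusters))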
Abstract:
In structural analysis and design, engineering problems frequently arise that require the dynamic analysis of large finite element models reaching millions of degrees of freedom and handling very large data files. The complexity and size of the analyses grow sharply when parametric analyses are required. This problem has traditionally been approached in several ways: first, by increasing the computing power and memory of the systems used for the analyses; second, by simplifying the parametric analyses, reducing their number or their level of detail; and finally, by resorting to methods complementary to finite elements that reduce the number of variables and simplify the execution while keeping the results close to the actual behaviour of the structure. The method proposed here falls within the third option: a reduction method consisting of a simplified analysis that provides a solution for the dynamic response of a structure in the complex modal subspace using a very small volume of data. Parametric analyses can thus be performed varying multiple parameters, obtaining a solution very close to the desired objective. We propose not only variations of local mass, stiffness and damping properties but also the addition of degrees of freedom to the original structure, for the calculation of both the steady-state and the transient response. In addition, the simple implementation of the procedure allows exhaustive control over the problem variables and the incorporation of improvements such as different ways of obtaining the eigenvalues or the removal of damping limitations in the original structure. The purpose of the method can be considered similar to that of the Guyan method or other model-order reduction techniques used in structural dynamics. However, although the method can be combined with such techniques to obtain the advantages of both, the present procedure does not condense the system of equations; it uses the information of the complete system of equations and processes only the response at the appropriate variables of the points of interest to the analyst. That interest may come from the need to obtain the response of large structures at specific points, or from the need to modify the structure in specific areas in order to change its behaviour (acceleration, velocity or displacement response) under dynamic loads. The procedure is therefore particularly suitable for selecting the optimum value of several parameters in large structures (of the order of hundreds of thousands of modes), such as the location of added elements, stiffnesses, masses or viscous damping values, in preliminary studies in which several solutions are proposed and optimised and which, in the case of large structures, may involve an extremely high number of simulations before the optimum solution is reached. After presenting the necessary tools and developing the procedure, a case study is proposed: the finite element model of the MILANO UAV developed by the Instituto Nacional de Técnica Aeroespacial. Owing to the addition of a piece of equipment, requirements are imposed on the accelerations at the left wing tip and the displacements at the right wing tip, in the presence of the lift produced by a continuous sinusoidal wind gust. The proposed modification consists of adding the equipment at the left wing tip, either through a rigid attachment or through a dynamic-response reduction system with variable mass, stiffness and damping properties. The analysis of the results makes it possible to optimise the parameters of the attenuation system by means of multiple dynamic analyses, so that the requirements imposed with the modification are met as closely as possible. The results are compared with those obtained with a commercial finite element analysis program, showing very close agreement between the full and the reduced models. The influence of several factors, such as the modal damping of the original structure, the number of modes retained in the truncation or the precision provided by the frequency sweep, is analysed in detail. Finally, the efficiency of the proposed method in terms of computation time and data volume is assessed in comparison with other approaches. It can therefore be concluded that the proposed method is a useful and efficient option for the parametric analysis of local modifications in large structures.
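For orientation, the block below states the standard truncated modal-superposition form of the steady-state response to a harmonic load, under the simplifying assumption of proportional damping and mass-normalised modes; it is only the generic relation behind methods of this kind, not the thesis's own complex-mode formulation.

    % Truncated modal superposition for the steady-state response to a harmonic
    % load (proportional damping, mass-normalised modes); generic notation shown
    % for illustration only, not the formulation developed in the thesis.
    M\ddot{x} + C\dot{x} + Kx = F e^{i\omega t},
    \qquad x(t) \approx \sum_{j=1}^{m} \phi_j \, q_j(\omega)\, e^{i\omega t},
    \qquad q_j(\omega) = \frac{\phi_j^{T} F}{\omega_j^{2} - \omega^{2} + 2\, i\, \zeta_j \omega_j \omega}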
Abstract:
In data assimilation, one prepares gridded data as the best possible estimate of the true initial state of the system under consideration by merging various measurements, irregularly distributed in space and time, with prior knowledge of the state given by a numerical model. Because it may improve forecasting or modeling and increase physical understanding of the systems considered, data assimilation now plays a very important role in studies of atmospheric and oceanic problems. Here, three examples are presented to illustrate the use of new types of observations and the ability to improve forecasting or modeling.
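As background, the analysis step of many assimilation schemes can be written as the standard best linear unbiased estimate below; the notation is generic textbook usage, not the specific method of this paper.

    % Generic linear analysis (BLUE / Kalman-type) update used by many data
    % assimilation schemes: x_b is the model background, y the observations,
    % H the observation operator, B and R the background and observation error
    % covariances. Shown for orientation only.
    x_a = x_b + K\,(y - H x_b),
    \qquad K = B H^{T}\,(H B H^{T} + R)^{-1}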
Abstract:
This work focuses on the study and analysis of the strategic alliances formed between companies in the real estate sector between 2006 and 2010, with emphasis on the alliances between companies operating predominantly in the cities of São Paulo and Rio de Janeiro and companies operating in the North and Northeast regions of Brazil, whose objective, on the part of the São Paulo companies, was geographic expansion. Considering the significant volume of strategic alliances observed in the sector during that period, and the fact that the developments arising from such partnerships delivered results below the established expectations, the objective of this work is to present a set of guidelines that may contribute to the planning, formation and conduct of future partnerships between companies in the sector, so as to mitigate difficulties and to exploit, in the best possible way, the benefits that strategic alliances can provide. To this end, a multiple case study was carried out covering publicly traded companies operating predominantly in the capitals of the Rio-São Paulo axis, companies operating at a regional level in the Brazilian North and Northeast that formed partnerships with the Southeastern companies, and consulting firms that were involved in the partnerships through the provision of services. This research made it possible to identify the main difficulties, advantages and disadvantages arising from the partnerships in question; the data were analysed and discussed in the light of the literature review, thereby supporting the proposed set of guidelines. The guidelines presented aim to contribute to the entire process involved in forming a partnership, covering aspects from planning and management to operational issues, and are complemented by recommendations which, together with the guidelines, can increase the probability of success of the partnerships.
Abstract:
Since the 1980s several authors have presented correlations between static load tests and dynamic load tests on piles. For a good correlation it is essential that the tests be well executed and that they reach failure according to some criterion, such as Davisson's, and that the time interval between the static load test and the dynamic test be taken into account, given the "set-up" effect. After the dynamic test, a CAPWAP analysis is carried out, which allows the determination of the distribution of shaft friction with depth, the toe resistance and other soil parameters such as quakes and damping. The CAPWAP analysis is performed by trial, through a "signal matching" procedure, i.e. seeking the best fit between the force signal measured by the sensors and the calculated one. It is relatively easy to show that the same solution can be obtained from different input data. This means that, although the mobilised loads are similar, the shape of the simulated static load-test curve obtained from CAPWAP, as well as the distribution of shaft friction, may differ, even when the analyses present satisfactory "match quality" (MQWU). One way to correct the shape of the simulated CAPWAP curve, and the distribution of shaft friction, is by comparison with static load tests (PCE). Superimposing the two curves, the simulated and the "real" one, allows the shaft quake to be determined from the initial stretch of the load-settlement curve of the static load test, which in turn allows a better definition of the shaft friction distribution and the toe reaction. In this context the concept of "settlement match quality" (MQR) arises. When a static load test is not available, it is proposed to carry out a static loading using the self-weight of the pile-driving hammer (CEPM). Through two case histories in which both dynamic load tests and static load tests were available, it is shown that this procedure yields a better solution from a physical point of view, i.e. one consistent with the subsoil characteristics and with the load-settlement curve of the static test, and not merely a mathematical one based on the assessment of the "match quality" (MQWU).
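For reference, Davisson's failure criterion mentioned above is commonly stated as the offset below; the notation is generic and the 3.8 mm constant (about 0.15 in) is the usual textbook value, quoted here only for orientation, not taken from this thesis.

    % Davisson offset criterion: failure is taken at the load whose measured pile
    % head settlement s_f reaches the elastic shortening of the pile plus an offset.
    % P = applied load, L = pile length, A = cross-sectional area, E = modulus of
    % the pile material, D = pile diameter. Generic statement, for reference only.
    s_{f} = \frac{P L}{A E} + 3.8\ \text{mm} + \frac{D}{120}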
Abstract:
The first part of the paper reviews some of the transformations that archaeology has undergone in recent decades: the extension of its chronological limits up to contemporary times, the collapse of the traditional approach that conceived archaeology as a discipline essentially concerned with working below ground level, and the widening of the scale of its object of study from portable material culture and the site to the territory. The second part attempts to redefine the profile of archaeology today through a reflection that is fundamentally ontological and axiological, without, however, renouncing some epistemological considerations. Finally, the "value chain" is presented as a methodological tool that serves better than any other for addressing heritage from an integral and interdisciplinary perspective.
Abstract:
The functions of the financial system of a developed economy are often badly understood. This can largely be attributed to free-market ideology, which has spread the belief that leaving finance to its own devices would provide the best possible mechanism for allocating savings. The latest financial crisis has sparked the beginnings of a new awareness on this point, but it is far from having led to an improved understanding of the role of the financial institutions. For many people, finance remains more an enemy to be resisted than an instrument to be intelligently exploited. Its institutions, which issue and circulate money, play an important role in the working of the real economy that it would be imprudent to neglect. The allocation of savings, but also the level of activity and the growth rate depend on it. In this book, the authors carefully analyse the close links between money, finance and the real economy. In the process, they show why today the existence of a substantial potential of saving, instead of being an opportunity for the world economy, could threaten it with ‘secular stagnation’.
Abstract:
Many different methods of reporting animal diets have been used in ecological research. These vary greatly in accuracy and precision, and therefore complicate attempts to measure and compare diets, and the quantities of nutrients in those diets, across a wide range of taxa. For most birds, the carotenoid content of the diet has not been directly measured. Here, therefore, I use an avian example to show how different methods of measuring the quantities of various foods in the diet affect the relative rankings of higher taxa (families, subfamilies, and tribes), and of species within these taxa, with regard to the carotenoid contents of their diets. This is a timely example, as much recent avian literature has focused on the way dietary carotenoids may be traded off among aspects of survival, fitness and signalling. I assessed the mean dietary carotenoid contents of representatives of thirty higher taxa of birds using four carotenoid intake indices of varying precision: trophic levels, a coarse-scale and a fine-scale categorical index, and quantitative estimates of dietary carotenoids. This last method was used as the benchmark. For comparisons among taxa, all but the trophic level index were significantly correlated with each other. However, for comparisons of species within taxa, the fine-scale index outperformed the coarse-scale index, which in turn outperformed the trophic level index. In addition, each method has advantages and disadvantages, as well as underlying assumptions that must be considered. Examining and comparing several possible methods of diet assessment helps to highlight these issues, so that the best possible index is used given the available data, and it is recommended that such a step be taken before estimated nutrient intake is included in any statistical analysis. Although applied to avian carotenoids here, this approach could readily be applied to other taxa and types of nutrients.
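As a minimal illustration of comparing a coarse diet index against quantitative estimates by rank correlation, the following sketch assumes SciPy; the values are invented and the choice of Spearman's rho is an assumption for illustration, not necessarily the statistic used in the paper.

    # Minimal illustration of comparing a coarse categorical diet index with a
    # quantitative carotenoid estimate by rank correlation (Spearman's rho).
    # The values are invented; the choice of statistic is an assumption here.
    from scipy.stats import spearmanr

    coarse_index = [1, 2, 2, 3, 1, 3, 2, 1]                   # category per taxon
    quantitative = [0.4, 1.1, 0.9, 2.3, 0.2, 1.9, 1.3, 0.5]   # estimated carotenoids

    rho, p_value = spearmanr(coarse_index, quantitative)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")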
Abstract:
This is a study of the communication relations between the multinational BASF and the community surrounding its plant in the city of Guaratinguetá, in the interior of the state of São Paulo. The general objective is to analyse under what conditions a multinational company and the surrounding community can put into practice a partnership in pursuit of more socially responsible management, and to verify the community's power to influence the company's communication processes. The methodology was a case study based on document analysis, participant observation at a meeting of the Community Council, a discussion group and semi-structured interviews, in addition to bibliographic research. We sought to understand the perceptions of Council members, city residents, the local media and the multinational's communication professionals. The main conclusion is that what could constitute a successful partnership for both sides ends up restricted to a mere social relationship, in which both sides seem to feel comfortable and are convinced that they are doing their work in the best possible way.
Abstract:
This thesis describes the design and synthesis of a variety of functionalised phosphine oxides and sulfides, based on the structure of trioctylphosphine oxide and synthesised for the purpose of surface modification of quantum dots. The ability of the ligands to modify the surface chemistry by displacing the original hexadecylamine capping layer of the quantum dots was evaluated, and the surface-modified quantum dots were then investigated for enhancement of their inherent properties and for improved compatibility with the various applications for which they were initially designed. Upon the commencement of research involving quantum dots it became apparent that more information on their behaviour and their interaction with the environment was required. The limits of the inherent stability of hexadecylamine-capped quantum dots were investigated by exposure to a number of different environments, and the effect on their stability was monitored through changes in the photoluminescence of their cores. Subtle differences between batches of quantum dots were observed, and the necessity of accounting for these in future applications was noted. The displacement of the original hexadecylamine coating with the "designer" functionalised ligands was then evaluated in order to establish the conditions giving the best possible surface modification. A general procedure was elucidated; however, it was found that each displacement still required slight adjustment to account for other factors, such as differences in ligand structure and the individuality of the various batches of quantum dots. This thesis also describes a procedure for adding a protective layer to the surface of quantum dots by cross-linking the functionalised ligands bound to the surface via an acyclic diene metathesis polymerisation. A detailed description of the problems encountered in the analysis of these materials is provided, together with the use of novel techniques, such as diffusion-ordered spectroscopy, as a means of overcoming those limitations. A demonstration of the superior stability of these protected materials, compared with the materials before cross-linking, upon exposure to a range of aggressive environments provided physical proof of the cross-linking process and of the advantages of this modification. Finally, this thesis presents initial work on the production of luminescent nanocrystal-encoded resin beads for use in solid-phase combinatorial chemistry. The successful covalent incorporation of quantum dots into the polymeric matrices of non-functionalised and functionalised resin beads is demonstrated, and preliminary work to address and overcome the possible limitations that may be encountered in the production and general use of these materials in combinatorial techniques is presented.
Abstract:
A firm favourite with students and lecturers alike, Intellectual Property can be trusted to equip you with the best possible basis for study of this dynamic subject. Providing an unrivalled account of the law in this area, the book also examines the ethical and policy influences that have shaped its development, giving you a solid foundation for further exploration of the subject.
Abstract:
Background - Delivery of high-quality, evidence-based health care to deprived sectors of the community is a major goal for society. We investigated the effectiveness of a culturally sensitive, enhanced care package in UK general practices for improvement of cardiovascular risk factors in patients of south Asian origin with type 2 diabetes. Methods - In this cluster randomised controlled trial, 21 inner-city practices in the UK were assigned by simple randomisation to intervention (enhanced care including additional time with practice nurse and support from a link worker and diabetes-specialist nurse [nine practices; n=868]) or control (standard care [12 practices; n=618]) groups. All adult patients of south Asian origin with type 2 diabetes were eligible. Prescribing algorithms with clearly defined targets were provided for all practices. Primary outcomes were changes in blood pressure, total cholesterol, and glycaemic control (haemoglobin A1c) after 2 years. Analysis was by intention to treat. This trial is registered, number ISRCTN 38297969. Findings - We recorded significant differences between treatment groups in diastolic blood pressure (-1·91 [95% CI -2·88 to -0·94] mm Hg, p=0·0001) and mean arterial pressure (-1·36 [-2·49 to -0·23] mm Hg, p=0·0180), after adjustment for confounders and clustering. We noted no significant differences between groups for total cholesterol (0·03 [-0·04 to 0·11] mmol/L), systolic blood pressure (-0·33 [-2·41 to 1·75] mm Hg), or HbA1c (-0·15% [-0·33 to 0·03]). Economic analysis suggests that the nurse-led intervention was not cost effective (incremental cost-effectiveness ratio £28 933 per QALY gained). Across the whole study population over the 2 years of the trial, systolic blood pressure, diastolic blood pressure, and cholesterol decreased significantly by 4·9 (95% CI 4·0–5·9) mm Hg, 3·8 (3·2–4·4) mm Hg, and 0·45 (0·40–0·51) mmol/L, respectively, and we recorded a small and non-significant increase in haemoglobin A1c (0·04% [-0·04 to 0·13], p=0·290). Interpretation - We recorded additional, although small, benefits from our culturally tailored care package that were greater than the secular changes achieved in the UK in recent years. Stricter targets in general practice and further measures to motivate patients are needed to achieve the best possible health-care outcomes in south Asian patients with diabetes. Funding - Pfizer, Sanofi-Aventis, Servier Laboratories UK, Merck Sharp & Dohme/Schering-Plough, Takeda UK, Roche, Merck Pharma, Daiichi-Sankyo UK, Boehringer Ingelheim, Eli Lilly, Novo Nordisk, Bristol-Myers Squibb, Solvay Health Care, and Assurance Medical Society UK.