937 results for Photography in traffic engineering.
Abstract:
Background: Air pollution in São Paulo is continuously measured by the São Paulo State Environmental Agency; however, there is no information on the variation between places with different traffic densities. This study was intended to identify a gradient of exposure to traffic-related air pollution across different areas of São Paulo, to provide information for future epidemiological studies. Methods: We measured NO2 using Palmes diffusion tubes at 36 sites on streets chosen to be representative of different road types and traffic densities in São Paulo during two one-week periods (July and August 2000). In each study period, two tubes were installed at each site, and two additional tubes were installed at 10 control sites. Results: Average NO2 concentrations were related to traffic density observed on the spot, to the number of vehicles counted, and to the traffic density strata defined by the city Traffic Engineering Company (CET). Average NO2 concentrations were 63 μg/m3 and 49 μg/m3 in the first and second periods, respectively. Dividing the sites by observed traffic density, we found: heavy traffic (n = 17): 64 μg/m3 (95% CI: 59–68 μg/m3); local traffic (n = 16): 48 μg/m3 (95% CI: 44–52 μg/m3) (p < 0.001). Conclusion: The differences in NO2 levels between heavy- and local-traffic sites are large enough to suggest the use of a more refined classification of exposure in epidemiological studies in the city. The number of vehicles counted, the traffic density observed on the spot, and the traffic density strata defined by the CET might be used as proxies for traffic exposure in São Paulo when more accurate measurements are not available.
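As an aside, per-stratum summaries of the kind reported above (group means with 95% confidence intervals) can be computed from raw tube measurements with a few lines of code. The sketch below is illustrative only: the site readings are made-up placeholders, not the study's data, and a normal-approximation interval is assumed.

import math

def mean_ci(values, z=1.96):
    """Return (mean, lower, upper) using a normal-approximation 95% CI."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    se = math.sqrt(var / n)
    return mean, mean - z * se, mean + z * se

# Hypothetical NO2 readings (ug/m3) grouped by observed traffic density.
sites = {
    "heavy traffic": [61, 66, 70, 58, 63, 67],
    "local traffic": [45, 50, 52, 47, 49, 44],
}

for stratum, readings in sites.items():
    m, lo, hi = mean_ci(readings)
    print(f"{stratum}: mean {m:.0f} ug/m3 (95% CI: {lo:.0f}-{hi:.0f})")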
Abstract:
Patients suffering from osteosarcoma are currently treated with intravenously administered chemotherapeutic agents after tumor resection, which is often accompanied by severe side effects and a delayed bone-healing process. In addition, recurrences frequently arise from neoplastic cells remaining at the tumor resection site. Successful bone regeneration and the control of cancer cells remaining in the tissue pose a challenge for tissue engineering after bone loss caused by tumor removal. In this respect, the use of hydroxyapatite as a bone substitute material in combination with cyclodextrin as a drug carrier appears promising. Chemotherapeutic agents can be bound to the biomaterial and released directly at the tumor bed over an extended period in order to eliminate remaining neoplastic cells. Locally applied chemotherapy has several advantages, including a direct cytotoxic effect on local cells and the reduction of severe side effects. This study was conducted to evaluate the functionality of such a drug delivery system and to develop tissue-engineering strategies that promote the bone-healing process and, in particular, vascularization. The results show that cancer cells are not the only cells affected by chemotherapeutic treatment. Primary endothelial cells such as HUVEC showed a high sensitivity to cisplatin and doxorubicin. Both drugs triggered a tumor-suppressing signal in HUVEC through the upregulation of p53 and p21. Moreover, hypoxia appears to have a therapeutically relevant influence, since exposing sensitive HUVEC to hypoxia protected the cells from cytotoxicity. This chemo-protective effect appeared to be considerably weaker in cancer cell lines. These results could point to a chemotherapeutic strategy for improving the effect of targeted drug application on cancer cells while sparing healthy cells. The successful integration of a drug delivery system combined with a biomaterial for stabilization and regeneration could allow healthy endothelial cells to proliferate and form blood vessels while remaining cancer cells are eliminated. Since bone tissue remodeling is accompanied by a severe impairment of the patient's quality of life, accelerating the postoperative healing process is one of the goals of tissue engineering. The formation of blood vessels is indispensable for the successful integration of a bone graft into the tissue; therefore, an extensively developed vascular network is desirable for an improved healing process in clinical application. Previous experiments have shown that co-cultures of human primary osteoblasts (pOB) and human outgrowth endothelial cells (OEC) are successful with regard to the formation of stable vessel-like structures in vitro, which could also be efficiently integrated into the microvascular system in vivo. This approach could be used to produce pre-vascularized constructs that promote the bone-healing process after implantation.
In addition, the co-culture system represents an excellent in vitro model for identifying and analyzing factors that are strongly involved in bone healing and angiogenesis. Macrophages are known to play a decisive role in inflammation-induced angiogenesis. In this context, this study highlights the positive influence of THP-1-derived macrophages in co-culture with pOB and OEC. The results showed that using macrophages as an inflammatory stimulus in the already established co-culture system led to a pro-angiogenic activation of the OEC, resulting in a significantly increased formation of blood-vessel-like structures in vitro. Furthermore, the analysis of factors that play an important role in inflammation-induced angiogenesis revealed a marked upregulation of VEGF, inflammatory cytokines, and adhesion molecules, which ultimately contribute to enhanced vascularization. These results are attributed to the influence of macrophages and could be exploited in tissue engineering in the future to accelerate the healing process and thereby improve the clinical situation of patients. Moreover, combining co-culture-based approaches for bone tissue engineering with a biomaterial-based drug delivery system could come into clinical use, linking the elimination of remaining cancer cells with the promotion of bone regeneration.
Abstract:
In traffic accidents involving pedestrians, cyclists, or motorcyclists, patterned impact injuries as well as marks on clothing can be matched to the injury-causing vehicle structure in order to reconstruct the accident and identify the vehicle that hit the person. The differentiation of the primary impact injuries from other injuries is therefore of great importance. Impact injuries can be identified from the external injuries of the skin, the injured subcutaneous and fat tissue, and fractured bones. Another sign of impact is a bone bruise. A bone bruise, or occult bone lesion, is a bleeding in the subcortical bone marrow, presumed to result from micro-fractures of the medullary trabeculae. The aim of this study was to show that bleeding in the subcortical bone marrow of the deceased can be detected using postmortem noninvasive magnetic resonance imaging. This is demonstrated in five accident cases, four involving pedestrians and one a cyclist, in which bone bruises were detected in different bones as a sign of impact occurring in the same location as the external and soft-tissue impact injuries.
Abstract:
We describe herein some immunological properties of human fetal bone cells recently tested for bone tissue-engineering applications. Adult mesenchymal stem cells (MSCs) and osteoblasts were included in the study for comparison. Surface markers involved in bone metabolism and immune recognition were analyzed by flow cytometry before and after differentiation or treatment with cytokines. Immunomodulatory properties were studied on activated peripheral blood mononuclear cells (PBMCs). The immunoprofile of fetal bone cells was further investigated at the gene expression level. Fetal bone cells and adult MSCs were positive for Stro-1, alkaline phosphatase, CD10, CD44, CD54, and beta2-microglobulin, but human leukocyte antigen (HLA)-I and CD80 were less present than on adult osteoblasts. All cells were negative for HLA-II. Treatment with recombinant human interferon gamma increased the presence of HLA-I much more in adult cells than in fetal cells. In the presence of activated PBMCs, fetal cells had antiproliferative effects, although with patterns not always comparable to those of adult MSCs and osteoblasts. Because of this immunological profile, and because their phenotype is more differentiated than that of stem cells, fetal bone cells present interesting potential as an allogeneic cell source for tissue-engineering applications.
Abstract:
This book is the latest in a series growing out of the International Joint Conferences on Computer, Information and Systems Sciences and Engineering. It includes chapters in the most advanced areas of Computing, Informatics, Systems Sciences, and Engineering, and it is accessible to a wide readership, including professors, researchers, practitioners, and students. The book comprises a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Informatics, Systems Sciences, and Engineering. It includes selected papers from the proceedings of the Ninth International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE 2013). Coverage includes topics in: Industrial Electronics, Technology & Automation; Telecommunications and Networking; Systems, Computing Sciences and Software Engineering; Engineering Education, Instructional Technology, Assessment, and E-learning.
Abstract:
Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse. Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support the integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a proof-of-concept test bed. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-based (KB) tool was developed and implemented in FiPER. This tool reasons about the modeling knowledge to intelligently switch between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge-sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, both methods enable recipient-specific, fine-grained, controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. Together these methods play an efficient role in reducing the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
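For readers unfamiliar with this kind of model switching, the sketch below illustrates the general idea of choosing between a cheap beam-element model and an expensive shell-element model during optimization, based on rules about when the cheaper idealization is trustworthy. It is a hypothetical illustration, not the KB tool described in the thesis; the thresholds, function names, and the slenderness heuristic are all assumptions.

# Illustrative only: rule-based selection between two analysis models of an
# I-beam, in the spirit of switching models during an optimization run.

from dataclasses import dataclass

@dataclass
class BeamDesign:
    length: float            # beam length (m)
    section_depth: float     # overall section depth (m)
    flange_thickness: float  # flange thickness (m)

def select_analysis_model(design: BeamDesign, near_optimum: bool) -> str:
    """Pick the cheaper beam-element model when its idealization is likely
    adequate; fall back to the high-fidelity shell-element model otherwise."""
    slenderness = design.length / design.section_depth  # assumed heuristic
    if near_optimum:
        # Design states close to the optimum are verified with the accurate model.
        return "shell-element model"
    if slenderness >= 10 and design.flange_thickness > 0.005:
        # Slender beam with no thin-walled local effects expected: cheap model suffices.
        return "beam-element model"
    return "shell-element model"

print(select_analysis_model(BeamDesign(3.0, 0.2, 0.012), near_optimum=False))
print(select_analysis_model(BeamDesign(3.0, 0.2, 0.012), near_optimum=True))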
Abstract:
In this work, a comparison is made between the competence codes in the CDIO curriculum, those defined in the Tuning Project, and those of the International Project Management Association (IPMA). The goal is to define the most appropriate competence codes for engineering education in Latin America. The CDIO code is obtained from engineering practice and responds to the accreditation standards of the Accreditation Board for Engineering and Technology (ABET). The Tuning competences are those defined for Latin America, and the IPMA's are international competences for project management. It is the first time that the competences defined in ABET accreditation standards for the engineering field have been compared with the international competences of the IPMA model. The results give evidence, first, that there is a need to apply holistic models in the definition of an engineering curriculum and, second, of the pertinence of these models in the definition of engineering programs in Latin America.
Abstract:
The paper describes experiments in the automated acquisition of knowledge for traffic problem detection. Preliminary results show that inductive logic programming (ILP) can be used to successfully learn to detect traffic problems.
Abstract:
While ontology engineering is rapidly entering the mainstream, expert ontology engineers are a scarce resource. Hence, there is a need for practical methodologies and technologies, which can assist a variety of user types with ontology development tasks. To address this need, this book presents a scenario-based methodology, the NeOn Methodology, which provides guidance for all main activities in ontology engineering. The context in which we consider these activities is that of a networked world, where reuse of existing resources is commonplace, ontologies are developed collaboratively, and managing relationships between ontologies becomes an essential aspect of the ontological engineering process. The description of both the methodology and the ontology engineering activities is grounded in a comprehensive software environment, the NeOn Toolkit and its plugins, which provides integrated support for all the activities described in the book. Here we provide an introduction for the whole book, while the rest of the content is organized into 4 parts: (1) the NeOn Methodology Framework, (2) the set of ontology engineering activities, (3) the NeOn Toolkit and plugins, and (4) three use cases. Primary goals of this book are (a) to disseminate the results from the NeOn project in a structured and comprehensive form, (b) to make it easier for students and practitioners to adopt ontology engineering methods and tools, and (c) to provide a textbook for undergraduate and postgraduate courses on ontology engineering.
Abstract:
Telecommunications networks have always been expanding, and thanks to this, new services have appeared. The old mechanisms for carrying packets have become obsolete due to the new service requirements, which now operate in real time. Real-time traffic requires strict service guarantees: when this traffic is sent through the network, enough resources must be provided in order to avoid delays and information losses. When browsing the Internet and requesting web pages, data must be sent from a server to the user; if a packet is dropped during the transmission, it is sent again. For the end user, it does not matter if the web page loads one or two seconds later. But if the user is holding a conversation with a VoIP program such as Skype, one or two seconds of delay in the conversation may be catastrophic, and neither party can understand the other. In order to support these new services, networks have to evolve, and for this purpose MPLS and QoS were developed. MPLS is a packet-carrying mechanism used in high-performance telecommunication networks which directs and carries data along pre-established paths. Packets are forwarded on the basis of labels, making this process faster than routing packets by their IP addresses. MPLS also supports Traffic Engineering (TE), which refers to the process of selecting the best paths for data traffic in order to balance the traffic load between the different links. In a network with multiple paths, routing algorithms calculate the shortest one, and most of the time all traffic is directed through it, causing overload and packet drops, while the other paths the network offers carry no traffic. But this is not enough to give real-time traffic the guarantees it needs: these mechanisms improve the network, but they do not change how the traffic is treated. That is why Quality of Service (QoS) was developed. Quality of Service is the ability to give different priority to different applications, users, or data flows, or to guarantee a certain level of performance to a data flow. Traffic is divided into different classes and each of them is treated differently, according to its Service Level Agreement (SLA). Traffic with the highest priority has preference over lower classes, but this does not mean it monopolizes all the resources. To achieve this goal, a set of policies is defined to control and alter how the traffic flows. The possibilities are endless and depend on how the network must be structured. By using these mechanisms it is possible to provide the necessary guarantees to real-time traffic, distributing it between categories inside the network and offering the best service for both real-time and non-real-time data.
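To make the priority-queueing idea above concrete, the sketch below shows one simple way a node could serve traffic classes, giving real-time packets preference without letting them monopolize the link. It is an illustrative toy (weighted serving of two queues), not a description of any specific MPLS or QoS implementation; the class names and weights are assumptions.

from collections import deque

# Two traffic classes: real-time (e.g., VoIP) and best-effort (e.g., web).
queues = {"realtime": deque(), "best_effort": deque()}

# Weighted service: out of every 5 transmission slots, up to 4 go to the
# real-time class, so it is preferred but cannot monopolize the link.
weights = {"realtime": 4, "best_effort": 1}

def enqueue(cls, packet):
    queues[cls].append(packet)

def serve_round():
    """Serve one scheduling round and return the packets sent, in order."""
    sent = []
    for cls, quota in weights.items():
        for _ in range(quota):
            if queues[cls]:
                sent.append(queues[cls].popleft())
    return sent

for i in range(6):
    enqueue("realtime", f"voip-{i}")
    enqueue("best_effort", f"web-{i}")

print(serve_round())  # mostly voip packets, but web traffic still makes progress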
Abstract:
A major challenge in the engineering of complex and critical systems is the management of change, both in the system and in its operational environment. Due to the growing complexity of systems, new approaches to autonomy must be able to detect critical changes and prevent their progression towards undesirable states. We are searching for methods to build systems that can tune their adaptability protocols: new mechanisms that use system-wellness requirements to reduce the influence of the outer domain and transfer the control of uncertainty to the inner one. From the perspective of cognitive systems, biological emotion suggests a strategy for configuring value-based systems to use semantic self-representations of their state. We propose a method inspired by emotion theories to causally connect the inner domain of the system to its wellness objectives, focusing on dynamically adapting the system to prevent the progression of critical states. This method shall endow the system with a transversal mechanism to monitor its inner processes, detecting critical states and managing its adaptivity in order to maintain the wellness goals. The paper describes the current vision produced by this work in progress.
Abstract:
The main purpose of this work is to describe the case of an online Java programming course for engineering students, designed to teach computer programming while also exercising other non-technical abilities: online training, self-assessment, teamwork, and the use of foreign languages. It is important that students develop confidence and competence in these skills, which will be required later in their professional tasks and/or in other engineering courses (life-long learning). Furthermore, this paper presents the pedagogical methodology, the results drawn from this experience, and an objective performance comparison with a conventional (face-to-face) Java course.
Abstract:
This document contains a detailed description of the design and implementation of a multi-agent application that controls traffic lights in a city, together with a system for simulating traffic and testing. The goal of this thesis is to design and build a simplified, intelligent, and distributed solution to the traffic problem of big cities, following good practices so as to allow future refinement of the model of the real world. The traffic problem in big cities remains unsolved: not only is the increasing number of cars a cause of traffic jams, but also the way the traffic is organized. Usually, intersections with traffic lights are replaced by roundabouts or interchanges to increase the number of cars that can cross the intersection in a given time. But there are still places where the infrastructure cannot be changed and traffic-light signals are the only way to control the car flows. In real life, traffic lights either follow a predefined switching plan or receive information from a centralized system about when and how they have to change. But what if the traffic lights could cooperate and decide on their own when and how to change? Using this problem, the purpose of the thesis is to explore different agent-based software engineering approaches to design and build a non-conventional distributed system. From the software engineering point of view, the goal of the thesis is to apply the knowledge and skills acquired during the various courses of the master's program in Software Engineering while solving a practical and complex problem such as urban traffic.
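As a rough illustration of the decentralized idea sketched in this abstract, the toy agent below extends its green phase when its own queue is long and its neighbours report short queues, instead of following a fixed plan. This is not the thesis's actual design; the class, message format, and thresholds are invented for illustration.

# Toy decentralized traffic-light agent: decides its own green time from the
# queue it observes locally and the queue lengths reported by neighbours.

class TrafficLightAgent:
    def __init__(self, name, base_green=20, max_green=60):
        self.name = name
        self.base_green = base_green  # seconds of green in the default plan
        self.max_green = max_green

    def decide_green_time(self, local_queue, neighbour_queues):
        """Return the green duration (seconds) for the next cycle."""
        avg_neighbour = sum(neighbour_queues) / len(neighbour_queues) if neighbour_queues else 0
        if local_queue > 2 * avg_neighbour:
            # Much more demand here than around us: extend green, capped at max_green.
            return min(self.max_green, self.base_green + 2 * (local_queue - avg_neighbour))
        if local_queue < avg_neighbour / 2:
            # Neighbours are congested and we are not: hand time back sooner.
            return max(5, self.base_green // 2)
        return self.base_green

agent = TrafficLightAgent("intersection-17")
print(agent.decide_green_time(local_queue=18, neighbour_queues=[4, 6, 5]))  # extended green
print(agent.decide_green_time(local_queue=2, neighbour_queues=[10, 12]))    # shortened green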