Abstract:
Thermal processing affects the quality and nutritional properties of food. In the home, monitoring the temperature inside the food is very difficult, and knowledge of the optimal temperature and time parameters for different dishes is often insufficient. Optimal control of thermal preparation depends largely on the type of food and on the external and internal temperature conditions during cooking. The aim of this work was to develop an automatic oven capable of recognising the type of food and of computing the temperature inside it during baking. The data required for the temperature calculation were acquired with several sensors: an infrared thermometer, an infrared distance sensor, a camera, a temperature sensor and a lambda probe inside the oven, complemented by a load cell, current and voltage sensors and a temperature sensor outside the oven. The data sets recorded during the heating phase were used to train several artificial neural networks (ANNs) that assigned the various foods to the appropriate categories so that the optimal baking programme could be selected. Several further ANNs were trained to estimate the thermal diffusivity of the food, which depends on its composition (carbohydrates, fat, protein, water). With the exception of the fat content, all components could be estimated sufficiently accurately by ANNs with at most 8 hidden neurons to serve as the basis for computing the temperature inside the food.
This work shows that, using a variety of sensors for direct and indirect measurement of the external properties of foods, together with ANNs for categorisation and for estimating food composition, the automatic recognition of a wide range of foods and the calculation of their internal temperature are possible.
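The pipeline described above, composition estimate → thermal diffusivity → internal temperature, can be sketched numerically. The mixing rule, the component diffusivities and all geometry values below are illustrative assumptions, not the thesis' model:

```python
import numpy as np

# Illustrative sketch: estimate thermal diffusivity from food composition
# (mass fractions) and simulate the centre temperature of a slab in an oven.
# The component diffusivities are rough literature-style values (m^2/s);
# they are assumptions for illustration only.
ALPHA = {"water": 1.46e-7, "protein": 0.87e-7, "fat": 0.70e-7, "carb": 0.91e-7}

def thermal_diffusivity(comp):
    """Mass-fraction weighted diffusivity (simple mixing rule)."""
    return sum(ALPHA[k] * comp[k] for k in comp)

def centre_temperature(alpha, thickness, t_oven, t0, t_end, nx=21):
    """Explicit 1-D finite-difference conduction in a slab, faces at oven temp."""
    dx = thickness / (nx - 1)
    dt = 0.4 * dx**2 / alpha          # stable time step (< 0.5 dx^2/alpha)
    T = np.full(nx, t0, dtype=float)
    for _ in range(int(t_end / dt)):
        T[0] = T[-1] = t_oven         # surfaces held at oven temperature
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T[nx // 2]

comp = {"water": 0.6, "protein": 0.2, "fat": 0.1, "carb": 0.1}
alpha = thermal_diffusivity(comp)
tc = centre_temperature(alpha, 0.04, t_oven=180.0, t0=20.0, t_end=1800.0)
```

In the thesis the composition itself is what the ANNs estimate from sensor data; here it is simply given.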
Abstract:
Since no physical system can ever be completely isolated from its environment, the study of open quantum systems is pivotal for reliably and accurately controlling complex quantum systems. In practice, the reliability of the control field needs to be confirmed by certifying the target evolution, while accuracy requires the derivation of high-fidelity control schemes in the presence of decoherence. In the first part of this thesis, an algebraic framework is presented that determines the minimal requirements for uniquely characterising arbitrary unitary gates in open quantum systems, independent of the particular physical implementation of the quantum device employed. To this end, a set of theorems is devised that can be used to assess whether a given set of input states to a quantum channel is sufficient to judge whether a desired unitary gate is realised. This makes it possible to determine the minimal input for such a task, which proves, quite remarkably, to be independent of system size. These results elucidate the fundamental limits of certification and tomography of open quantum systems. Combining these insights with state-of-the-art Monte Carlo process certification techniques permits a significant improvement in scaling when certifying arbitrary unitary gates. This improvement is not restricted to quantum information devices whose basic information carrier is the qubit; it extends to systems whose fundamental informational entities can be of arbitrary dimensionality, the so-called qudits. The second part of this thesis concerns the impact of these findings from the point of view of Optimal Control Theory (OCT). OCT for quantum systems uses concepts from engineering, such as feedback and optimisation, to engineer constructive and destructive interference in order to steer a physical process in a desired direction.
The aforementioned mathematical findings allow novel optimisation functionals to be deduced that significantly reduce both the memory required by numerical control algorithms and the total CPU time required to reach a given fidelity for the optimised process. The thesis concludes by discussing two problems of fundamental interest in quantum information processing from the point of view of optimal control: the preparation of pure states and the implementation of unitary gates in open quantum systems. For both cases, specific physical examples are considered: for the former, the vibrational cooling of molecules via optical pumping; for the latter, a superconducting phase qudit implementation. In particular, it is illustrated how features of the environment can be exploited to reach the desired targets.
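As a toy illustration of Monte Carlo certification of a unitary gate, the sketch below estimates the average state fidelity of a noisy single-qubit channel against a target unitary by sampling random pure inputs. The depolarizing noise model and the Hadamard target are assumptions for illustration, not the thesis' construction:

```python
import numpy as np

rng = np.random.default_rng(0)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # target unitary (Hadamard)
p = 0.1                                         # assumed depolarizing strength

def channel(rho):
    """Noisy implementation: apply H, then depolarize with probability p."""
    out = H @ rho @ H.conj().T
    return (1 - p) * out + p * np.eye(2) / 2

def haar_state():
    """Random pure qubit state (Haar-distributed via Gaussian sampling)."""
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    return v / np.linalg.norm(v)

def mc_fidelity(n_samples=200):
    """Average fidelity between the channel output and the ideal output."""
    total = 0.0
    for _ in range(n_samples):
        psi = haar_state()
        target = H @ psi                            # ideal output state
        rho_out = channel(np.outer(psi, psi.conj()))
        total += np.real(target.conj() @ rho_out @ target)
    return total / n_samples

est = mc_fidelity()   # analytically (1 - p) + p/2 = 0.95 for this channel
```

The thesis' contribution concerns how few such inputs are provably needed; this sketch only shows the sampling estimator itself.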
Abstract:
One of the most prominent industrial applications of heat transfer science and engineering has been electronics thermal control. Driven by the relentless increase in spatial density of microelectronic devices, integrated circuit chip powers have risen by a factor of 100 over the past twenty years, with a somewhat smaller increase in heat flux. The traditional approaches using natural convection and forced-air cooling are becoming less viable as power levels increase. This paper provides a high-level overview of the thermal management problem from the perspective of a practitioner, as well as speculation on the prospects for electronics thermal engineering in years to come.
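The scale of the problem can be illustrated with the standard one-dimensional thermal-resistance model; the resistance and power values below are illustrative assumptions:

```python
# Back-of-the-envelope sketch of the chip-cooling problem the paper surveys:
# junction temperature from a thermal-resistance chain.
def junction_temp(power_w, t_ambient_c, r_jc, r_ca):
    """T_j = T_a + P * (R_jc + R_ca), the standard 1-D resistance model.
    r_jc: junction-to-case resistance (K/W), r_ca: case-to-ambient (K/W)."""
    return t_ambient_c + power_w * (r_jc + r_ca)

# A hypothetical 100 W chip with 0.2 K/W junction-to-case and
# 0.3 K/W case-to-ambient resistance in a 35 degC ambient:
tj = junction_temp(100.0, 35.0, 0.2, 0.3)   # 35 + 100*0.5 = 85.0 degC
```

The hundredfold power increase the paper cites means that, at fixed resistance, junction temperature rise scales the same way, which is why forced-air cooling runs out of headroom.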
Abstract:
TCP flows from applications such as the web or ftp are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose mechanisms for interconnection with neighbouring domains. The scheme uses a small set of packet classes in a core-stateless network, where each class is assigned a different discarding priority in queues. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority) that is sent continuously from ingress to egress along a path. The available throughput on the path is obtained at the egress using measurements of flow aggregates and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission controlled. If it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priorities; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
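The ingress-side admission decision described above can be sketched as follows; the `IngressAC` interface and all figures are assumptions for illustration:

```python
# Minimal sketch of the admission-control decision: the egress measures the
# available throughput on the path from the probing flow and reports it back;
# the ingress admits a new flow only if its guaranteed minimum still fits.
class IngressAC:
    def __init__(self):
        self.available_bps = 0.0      # latest egress feedback
        self.admitted = {}            # flow_id -> guaranteed minimum (bit/s)

    def update_feedback(self, available_bps):
        """Store the available throughput reported back by the egress."""
        self.available_bps = available_bps

    def request(self, flow_id, min_throughput_bps):
        """Admit the flow to the GMTS if its minimum fits, else best-effort."""
        if min_throughput_bps <= self.available_bps:
            self.admitted[flow_id] = min_throughput_bps
            self.available_bps -= min_throughput_bps
            return "guaranteed"       # packets marked lowest discarding priority
        return "best-effort"          # not admitted to the GMTS

ac = IngressAC()
ac.update_feedback(10e6)             # egress reports 10 Mb/s available
r1 = ac.request("f1", 6e6)           # fits -> guaranteed
r2 = ac.request("f2", 6e6)           # only 4 Mb/s left -> best-effort
```

The real scheme detects flows implicitly from traffic rather than via explicit requests; the explicit `request` call is a simplification.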
Abstract:
Recent developments in optical communications have allowed simpler optical devices to improve network resource utilization. We therefore propose adding a lambda-monitoring device to a wavelength-routing switch (WRS), allowing better performance when traffic is routed and groomed. This device may allow a WRS to aggregate traffic over optical routes without incurring optical-electrical-optical (OEO) conversion for the existing traffic. In other words, optical routes can be partially reused to route demands, creating a sort of "lighttour". In this paper, we compare the number of OEO conversions needed to route a complete given traffic matrix using either lighttours or lightpaths.
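The comparison can be sketched by counting the boundaries between optical segments along a demand's route; the topology, the counting rule and the `oeo_conversions` helper are simplifying assumptions, not the paper's exact model:

```python
# Illustrative sketch: a demand routed over a sequence of optical segments
# undergoes an OEO conversion at every junction between consecutive segments.
# The "lighttour" idea is to ride part of an existing optical route, reducing
# the number of segments and hence the number of conversions.
def oeo_conversions(segments):
    """One OEO conversion per boundary between consecutive optical segments
    (none at the source or destination)."""
    return max(len(segments) - 1, 0)

# Hypothetical demand A->D. With lightpaths terminating at each grooming node:
lightpath_route = [("A", "B"), ("B", "C"), ("C", "D")]
# With a lighttour riding one existing optical route A->C, then a hop C->D:
lighttour_route = [("A", "C"), ("C", "D")]

n_lp = oeo_conversions(lightpath_route)   # 2 conversions
n_lt = oeo_conversions(lighttour_route)   # 1 conversion
```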
Abstract:
The purpose of this study is to determine the relationship between occupational exposure and hearing levels in open-air urban workers (urban cleaning in general). A cross-sectional study was carried out with 491 people, men and women, whose work environment is the open space of the city. Data were obtained during the periodic medical examinations performed in 2014 on the employees of a company whose economic activity is urban cleaning, including waste collection, care of trees and public lawns, and kerbside cleaning. Descriptive statistics were computed for demographic characteristics, and odds ratios (OR) were used to relate personal history and habits to the risk of developing hearing loss. Of the 491 people exposed to high levels of occupational noise, 62% presented hearing loss; most of them work as brush cutters and lawn mowers and have been with the company for 1-5 years. A statistically significant association was found between low schooling and the risk of hearing loss (p=0.0001), along with a protective effect of motorcycle and headphone use. Peripheral vascular disease, playing tejo and diabetes showed a strong tendency to increase the risk. The hearing loss found in this group cannot be directly attributed to occupational noise exposure, even though the work takes place in the urban space. However, low schooling favours hearing damage, which may be accelerated by highly prevalent diseases such as diabetes and by local recreational practices.
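The association measure used in the study follows the standard 2x2-table odds-ratio computation, sketched below with invented counts (not the study's data):

```python
# Odds ratio from a standard 2x2 table:
#                 cases   controls
#   exposed         a        b
#   unexposed       c        d
def odds_ratio(a, b, c, d):
    """OR = (a/b) / (c/d) = a*d / (b*c)."""
    return (a * d) / (b * c)

# e.g. low schooling (exposure) vs hearing loss, hypothetical counts:
or_value = odds_ratio(120, 80, 60, 100)   # (120*100)/(80*60) = 2.5
```

An OR above 1, as in this hypothetical example, indicates that the exposure is associated with increased odds of the outcome.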
Abstract:
Systems such as buildings and vehicles are subject to vibrations that can cause malfunction, discomfort or collapse. To mitigate these vibrations, dampers are usually installed. These structures become adaptronic systems when the dampers are controllable. This thesis focuses on solving the vibration problem in buildings and vehicles using magnetorheological (MR) dampers. These are controllable dampers characterised by highly nonlinear dynamics. Moreover, the systems in which they are installed exhibit parametric uncertainty, limited measurements and unknown disturbances, which calls for complex control techniques. In this thesis, Backstepping, QFT and mixed H2/H∞ control are used to solve the problem. The control laws are verified through simulation and experiments.
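The thesis applies Backstepping, QFT and mixed H2/H∞ designs; as a far simpler illustration of semi-active control with a controllable damper, the sketch below implements the classic clipped sky-hook law, which is not one of the thesis' controllers. All values are illustrative:

```python
# Clipped sky-hook law for a semi-active (e.g. MR) damper: command high
# damping only when the damper force would oppose the absolute velocity of
# the sprung mass; otherwise command the low (residual) damping.
def skyhook_command(v_mass, v_rel, c_on=2000.0, c_off=300.0):
    """Return the damping coefficient (N*s/m) to command.
    v_mass: absolute velocity of the sprung mass (m/s)
    v_rel:  relative velocity across the damper (m/s)"""
    return c_on if v_mass * v_rel > 0 else c_off

c1 = skyhook_command(0.3, 0.2)    # damper force opposes mass motion -> high
c2 = skyhook_command(0.3, -0.2)   # force would push the mass -> low
```

The "clipping" reflects the physical constraint the thesis deals with: a semi-active damper can only dissipate energy, never inject it.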
Abstract:
New network technologies allow ever larger volumes of information and network traffic with different priority levels to be transported. In this scenario, where a better quality of service is offered, the consequences of a link or node failure become more important. Multiprotocol Label Switching (MPLS), together with its extension to Generalized MPLS (GMPLS), provides fast failure-recovery mechanisms by establishing redundant Label Switched Paths (LSPs) to be used as alternative paths. In case of failure, traffic can be redirected over these paths. The main objective of this thesis has been to improve some of the current MPLS/GMPLS failure-recovery mechanisms in order to support the protection requirements of the services provided by the new Internet. This evaluation takes into account quality-of-protection parameters such as failure recovery time, packet loss and resource consumption. In this thesis we present a complete review and comparison of the main MPLS-based failure-recovery methods. The analysis covers path protection methods (global backups, reverse backups and 1+1 protection), local protection methods and segment protection methods. The extension of these mechanisms to optical networks through the control plane provided by GMPLS has also been considered. In a first phase of this work, each failure-recovery method is analysed without considering resource or topology constraints. This analysis yields a first ranking of the best protection mechanisms in terms of packet loss and recovery time, but it is not applicable to real networks.
To take this new scenario into account, in a second phase we analyse routing algorithms that do consider these network limitations and constraints. Some of the main QoS routing algorithms and several of the main routing proposals for MPLS networks are presented. Most current routing algorithms do not consider the establishment of alternative routes, or they use the same objectives to select working and protection paths. To improve the protection level, we introduce and formalise two new concepts: the network failure probability and the failure impact. A physical-level analysis of the network provides a first element for evaluating the protection level in terms of network reliability and availability. We formalise the impact of a failure as the degradation of quality of service (in terms of delay and packet loss), and we explain our proposal for reducing the failure probability and the failure impact. Finally, we give a new definition and classification of network services according to the required values of failure probability and impact. One of the highlights of the results of this thesis is that global path protection mechanisms maximise network reliability, whereas local and segment protection techniques minimise the failure impact. Minimum impact and maximum reliability could therefore be achieved by applying local protection to the whole network, but this approach does not scale in terms of resource consumption. We propose an intermediate mechanism: segment protection combined with our failure-probability evaluation model. In summary, this thesis presents several mechanisms for analysing the protection level of a network. The results of the proposed models and mechanisms improve reliability and minimise the impact of a failure in the network.
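The failure-probability concept introduced above can be sketched under the simplifying assumption of independent link failures; the probabilities below are illustrative:

```python
# Failure probability of an unprotected path vs a globally protected one,
# assuming statistically independent link failures.
def path_failure_probability(link_failure_probs):
    """A path fails if any one of its links fails."""
    p_ok = 1.0
    for p in link_failure_probs:
        p_ok *= (1.0 - p)
    return 1.0 - p_ok

def protected_path_failure(working, backup):
    """With global path protection over a disjoint backup, traffic is lost
    only if both the working and the backup path fail."""
    return path_failure_probability(working) * path_failure_probability(backup)

p_unprotected = path_failure_probability([0.01, 0.02, 0.01])
p_protected = protected_path_failure([0.01, 0.02, 0.01], [0.02, 0.02])
```

This captures the reliability side of the trade-off; the failure *impact* (delay and packet loss during recovery) is what local and segment protection reduce, at the cost of more backup resources.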
Abstract:
This paper looks at how implant technology can be used either to extend the range of human abilities or to diminish the effects of a neural illness, such as Parkinson's Disease. The key element is the need for a clear interface linking the human brain directly with a computer. The area of interest here is the use of implant technology, particularly where a connection is made between technology and the human brain and/or nervous system. Pilot tests and experimentation are invariably carried out a priori to investigate the eventual possibilities before human subjects are themselves involved, and some of the more pertinent animal studies are discussed here. The paper goes on to describe human experimentation, in particular that carried out by the author himself, which led to his receiving a neural implant that linked his nervous system bi-directionally with the internet. With this in place, neural signals were transmitted to various technological devices to control them directly. In particular, feedback to the brain was obtained from the fingertips of a robot hand and from ultrasonic (extra) sensory input. A view is taken of the prospects for the future, both in the near term as a therapeutic device and in the long term as a form of enhancement.
Abstract:
The deployment of Quality of Service (QoS) techniques involves careful analysis of areas including business requirements, corporate strategy and the technical implementation process, which can lead to conflict or contradiction between the goals of the various user groups involved in policy definition. In addition, long-term change management presents a challenge, as these implementations typically require a high skill set and experience level, exposing organisations to effects such as “hyperthymestria” [1] and “The Seven Sins of Memory”, defined by Schacter and discussed further within this paper. It is proposed that, given the information embedded within the packets of IP traffic, an opportunity exists to augment traffic management with a machine-learning, agent-based mechanism. This paper describes the process by which current policies are defined and the research required to support the development of an application that enables adaptive, intelligent Quality of Service controls to augment or replace the policy-based mechanisms currently in use.
Abstract:
The Boltzmann equation with boundary and initial conditions, which describes the general case of carrier transport in microelectronic devices, is analysed in terms of Monte Carlo theory. The classical Ensemble Monte Carlo algorithm, previously devised from merely phenomenological considerations of the initial and boundary carrier contributions, is now derived in a formal way. The approach suggests a set of event-biasing algorithms for statistical enhancement as an alternative to the population control technique, which is virtually the only algorithm currently used in particle simulators. The scheme for self-consistent coupling of the Boltzmann and Poisson equations is considered for the case of weighted particles. It is shown that particles survive the successive iteration steps.
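The event-biasing idea, sampling events from a biased distribution while carrying a statistical weight instead of splitting or killing particles, can be sketched on a toy estimate of a rare free-flight time; the rates and threshold below are illustrative, not device physics:

```python
import math
import random

random.seed(1)

# Event-biasing sketch: draw free-flight times from a biased exponential
# (lower rate -> longer flights sampled more often) and carry the likelihood
# ratio as a weight so the tail-probability estimator stays unbiased.
def biased_tail_estimate(rate=1.0, biased_rate=0.25, threshold=3.0, n=20000):
    total = 0.0
    for _ in range(n):
        t = random.expovariate(biased_rate)           # biased sampling density
        # weight = true density / biased density at the sampled t
        w = (rate / biased_rate) * math.exp(-(rate - biased_rate) * t)
        if t > threshold:
            total += w                                # weighted score
    return total / n

est = biased_tail_estimate()    # true value is exp(-3) ~ 0.0498
```

Population control would instead duplicate particles entering the rare region and discard others; the weighted estimator reaches the same expectation without changing the particle count.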
Abstract:
One of the major aims of BCI research is to achieve faster and more efficient control of external devices. The identification of individual tap events in a motor imagery BCI is therefore a desirable goal. EEG is recorded from subjects performing and imagining finger taps with their left and right hands. A feature selection wrapper based on Differential Evolution is used to identify optimal features in the spatial and frequency domains for tap identification. Channel-frequency band combinations are found that allow differentiation of tap vs. no-tap control conditions for executed and imagined taps. Left- vs. right-hand taps may also be differentiated with features found in this manner. A sliding time window is then used to accurately identify individual taps in the executed-tap and imagined-tap conditions. Highly statistically significant classification accuracies are achieved with time windows of 0.5 s and longer, allowing taps to be identified on a single-trial basis.
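A minimal sketch of a Differential Evolution feature-selection wrapper is shown below. The toy fitness function stands in for the wrapped classifier's accuracy (it simply rewards selecting two "informative" feature indices), and all parameters are illustrative assumptions:

```python
import random

random.seed(0)

# Each individual is a real vector; components > 0.5 select the corresponding
# channel-frequency feature. In the real wrapper, fitness would be the
# cross-validated accuracy of the wrapped classifier on the selected features;
# here a toy stand-in rewards the two genuinely informative indices (1 and 4)
# and penalises extras.
INFORMATIVE = {1, 4}

def fitness(vec):
    selected = {i for i, v in enumerate(vec) if v > 0.5}
    return len(selected & INFORMATIVE) - 0.1 * len(selected - INFORMATIVE)

def differential_evolution(dim=8, pop_size=20, f=0.7, cr=0.9, gens=60):
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # classic DE/rand/1 mutation and binomial crossover
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = [a[d] + f * (b[d] - c[d]) if random.random() < cr
                     else pop[i][d] for d in range(dim)]
            if fitness(trial) >= fitness(pop[i]):    # greedy selection
                pop[i] = trial
    return max(pop, key=fitness)

best = differential_evolution()
selected = sorted(i for i, v in enumerate(best) if v > 0.5)
```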
Abstract:
Here we present an economical and versatile platform for developing motor control and sensory feedback of a prosthetic hand via in vitro mammalian peripheral nerve activity. In this study, closed-loop control of the grasp function of the prosthetic hand was achieved by stimulation of a peripheral nerve preparation in response to slip sensor data from a robotic hand, forming a rudimentary reflex action. The single degree of freedom grasp was triggered by single unit activity from motor and sensory fibers as a result of stimulation. The work presented here provides a novel, reproducible, economic, and robust platform for experimenting with neural control of prosthetic devices before attempting in vivo implementation.
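The rudimentary reflex action described above can be sketched as a closed loop in which a slip reading above threshold triggers a grip-force increase; the thresholds, step size and loop interface are assumptions for illustration:

```python
# One iteration of a slip-triggered reflex loop: a slip signal from the
# robotic hand stands in for the sensory afferent, and the force increase
# stands in for the stimulated motor response.
def reflex_step(slip_reading, grip_force, slip_threshold=0.2,
                force_step=0.1, max_force=1.0):
    """Return the new (normalised) grip force after one loop iteration."""
    if slip_reading > slip_threshold:            # slip detected -> "stimulate"
        return min(grip_force + force_step, max_force)
    return grip_force                            # stable grasp, hold force

force = 0.3
for slip in [0.05, 0.25, 0.3, 0.1]:              # simulated slip-sensor samples
    force = reflex_step(slip, force)
```

In the platform described, the loop is closed through the in vitro nerve preparation rather than through software thresholds; this sketch only shows the control-loop shape.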
Abstract:
Wireless local area networks (WLANs) based on the IEEE 802.11 standard are now widespread. Most are used to provide access for mobile devices to a conventional wired infrastructure, and some are used where wires are not possible, forming an ad hoc network of their own. There are several varieties at the physical or radio layer (802.11, 802.11a, 802.11b, 802.11g), with each featuring different data rates, modulation schemes and transmission frequencies. However, all of them share a common medium access control (MAC) layer. As this is largely based on a contention approach, it does not allow prioritising of traffic or stations, so it cannot easily provide the quality of service (QoS) required by time-sensitive applications, such as voice or video transmission. In order to address this shortfall of the technology, the IEEE set up a task group that is aiming to enhance the MAC layer protocol so that it can provide QoS. The latest draft at the time of writing is Draft 11, dated October 2004. The article describes the yet-to-be-ratified 802.11e standard and is based on that draft.
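The QoS mechanism the draft introduces (EDCA) differentiates traffic through per-access-category contention parameters: smaller contention windows and AIFSN values give statistically earlier channel access. The sketch below encodes the commonly cited default parameters; the mean-backoff helper is a simplification for illustration:

```python
# EDCA default contention parameters per access category (AC), derived from
# aCWmin=15, aCWmax=1023 as commonly cited for 802.11e.
EDCA_PARAMS = {
    # AC:     (AIFSN, CWmin, CWmax)
    "AC_VO": (2, 3, 7),        # voice: highest priority
    "AC_VI": (2, 7, 15),       # video
    "AC_BE": (3, 15, 1023),    # best effort
    "AC_BK": (7, 15, 1023),    # background: lowest priority
}

def mean_backoff_slots(ac):
    """Rough expected first-attempt wait in slots: AIFSN plus the mean of a
    uniform backoff drawn from [0, CWmin] (retries and CWmax ignored)."""
    aifsn, cwmin, _ = EDCA_PARAMS[ac]
    return aifsn + cwmin / 2

voice = mean_backoff_slots("AC_VO")       # 2 + 1.5 = 3.5 slots
background = mean_backoff_slots("AC_BK")  # 7 + 7.5 = 14.5 slots
```

The gap between the two figures is what gives voice traffic its statistical priority over background traffic under contention.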
Abstract:
TGR5 is a G protein-coupled receptor that mediates bile acid (BA) effects on energy balance, inflammation, digestion and sensation. The mechanisms and spatiotemporal control of TGR5 signaling are poorly understood. We investigated TGR5 signaling and trafficking in transfected HEK293 cells and colonocytes (NCM460) that endogenously express TGR5. BAs (deoxycholic acid, DCA, taurolithocholic acid, TLCA) and the selective agonists oleanolic acid (OA) and 3-(2-chlorophenyl)-N-(4-chlorophenyl)-N, 5-dimethylisoxazole-4-carboxamide (CCDC) stimulated cAMP formation but did not induce TGR5 endocytosis or recruitment of β-arrestins, assessed by confocal microscopy. DCA, TLCA and OA did not stimulate TGR5 association with β-arrestin 1/2 or G protein-coupled receptor kinase (GRK) 2/5/6, determined by bioluminescence resonance energy transfer. CCDC stimulated a low level of TGR5 interaction with β-arrestin2 and GRK2. DCA induced cAMP formation at the plasma membrane and cytosol, determined using exchange factor directly regulated by cAMP (Epac2)-based reporters, but cAMP signals did not desensitize. AG1478, an inhibitor of epidermal growth factor receptor (EGFR) tyrosine kinase, the metalloprotease inhibitor batimastat, and methyl-β-cyclodextrin and filipin, which block lipid raft formation, prevented DCA stimulation of extracellular signal regulated kinase (ERK1/2). BRET analysis revealed TGR5 and EGFR interactions that were blocked by disruption of lipid rafts. DCA stimulated TGR5 redistribution to plasma membrane microdomains, localized by immunogold electron microscopy. Thus, TGR5 does not interact with β-arrestins, desensitize or traffic to endosomes. TGR5 signals from plasma membrane rafts that facilitate EGFR interaction and transactivation. An understanding of the spatiotemporal control of TGR5 signaling provides insights into the actions of BAs and therapeutic TGR5 agonists/antagonists.