977 results for Embedded systems, real-time control, Scilab, Linux, development


Relevance:

100.00%

Publisher:

Abstract:

A well-known drawback of IP networks is that they cannot guarantee a specific Quality of Service for the transmitted packets. The following two techniques are considered the most promising for providing Quality of Service: Differentiated Services (DiffServ) and QoS Routing. DiffServ is a fairly new QoS mechanism for the Internet defined by the IETF. It offers scalable service differentiation without per-hop signalling or per-flow state management, and it is a good example of decentralized network design. The goal of this QoS mechanism is to simplify the design of communication systems: a network node can be built from a small set of well-defined building blocks. QoS routing is a routing mechanism in which traffic routes are determined on the basis of the network's available resources. This thesis examines a new QoS routing approach called Simple Multipath Routing. The purpose of the work is to design a QoS controller for DiffServ; the controller proposed here is an attempt to combine DiffServ and QoS routing mechanisms. The experimental part of the work focuses in particular on QoS routing algorithms.
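To make the resource-based route selection concrete, here is a minimal sketch (not code from the thesis) of a classic QoS-routing building block: a Dijkstra-style search for the path with the maximum bottleneck bandwidth. The graph format and the function name are invented for this illustration; multipath schemes generalize this single-path step.

```python
import heapq

def widest_path(graph, src, dst):
    """Return the src->dst path that maximizes the bottleneck bandwidth.
    graph: {node: {neighbor: available_bandwidth}}."""
    best = {src: float("inf")}       # widest bottleneck found so far
    prev = {}
    heap = [(-float("inf"), src)]    # max-heap via negated widths
    while heap:
        neg_w, node = heapq.heappop(heap)
        width = -neg_w
        if node == dst:
            break                    # widest route to dst settled
        for nbr, bw in graph.get(node, {}).items():
            cand = min(width, bw)    # bottleneck of the extended path
            if cand > best.get(nbr, 0):
                best[nbr] = cand
                prev[nbr] = node
                heapq.heappush(heap, (-cand, nbr))
    if dst not in best:
        return None, 0
    path, n = [dst], dst
    while n != src:
        n = prev[n]
        path.append(n)
    return path[::-1], best[dst]
```

Replacing `min`/`max` with additive costs turns this back into shortest-path routing; the maximin form is what makes it resource-aware.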


PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult due to extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how gradual TDM introduction altered empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria. Carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics (109 received carbapenems). Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with daily dose (r=0.7). Similar numbers of inappropriate meropenem trough levels (30.4%) were below and above the upper limit. Real-time TDM introduction increased the empirical dose of imipenem/cilastatin, but not meropenem. CONCLUSIONS: Real-time carbapenem TDM availability significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.


This thesis describes the integration of a real-time simulator environment with a motion platform and a haptic device as part of the Kvalive project. Several programs running on two computers were developed to control the different devices of the environment. User tests were carried out to gather information about the improvements needed to make the simulator more realistic. The research also produced new ideas for improving the simulator and directions for further work.


When machines are modeled in their natural working environment, collisions become a very important feature in terms of simulation accuracy. As simulations expand to include the operating environment, a general collision model able to handle a wide variety of cases has become central to the development of simulation environments. The addition of the operating environment also changes the challenges for the collision modeling method: more simultaneous contacts with more objects occur in more complicated situations, which makes the real-time requirement harder to meet. Common problems in current collision modeling methods include dependency on geometry shape or mesh density, computational cost that grows exponentially with the number of contacts, the lack of a proper friction model, and failures in certain configurations such as closed kinematic loops. All of these problems mean that current modeling methods will fail in certain situations. A method that never fails in any situation is not realistic, but improvements can be made over the current methods.
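The cost pressure from growing contact counts can be seen even in a toy narrow-phase test. The hypothetical sketch below (not from the thesis) checks every sphere pair for overlap; the pairwise loop is O(n²) in the number of bodies, which is exactly why richer operating environments strain the real-time budget.

```python
import itertools
import math

def sphere_contacts(spheres):
    """Naive narrow-phase sphere-sphere contact detection.
    spheres: list of (x, y, z, radius) tuples.
    Returns (i, j, penetration_depth) for every overlapping pair."""
    contacts = []
    for (i, a), (j, b) in itertools.combinations(enumerate(spheres), 2):
        dist = math.dist(a[:3], b[:3])          # center-to-center distance
        penetration = a[3] + b[3] - dist        # > 0 means overlap
        if penetration > 0:
            contacts.append((i, j, penetration))
    return contacts
```

Production engines avoid the quadratic pair loop with a broad phase (spatial hashing, sweep-and-prune) and feed only candidate pairs to tests like this one.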


Myocardial contrast echocardiography has been used for assessing myocardial perfusion. Some concerns regarding its safety still remain, mainly regarding the induction of microvascular alterations. We sought to determine the bioeffects of microbubbles and real-time myocardial contrast echocardiography (RTMCE) in a closed-chest canine model. Eighteen mongrel dogs were randomly assigned to two groups. Nine were submitted to continuous intravenous infusion of perfluorocarbon-exposed sonicated dextrose albumin (PESDA) plus continuous imaging using power pulse inversion RTMCE for 180 min, associated with manually triggered high-mechanical-index impulses. The control group consisted of 3 dogs submitted to continuous imaging using RTMCE without PESDA, 3 dogs that received PESDA alone, and 3 sham-operated dogs. Hemodynamics and cardiac rhythm were monitored continuously. Histological analysis was performed on cardiac and pulmonary tissues. No hemodynamic changes or cardiac arrhythmias were observed in any group. Normal left ventricular ejection fraction and myocardial perfusion were maintained throughout the protocol. The frequency of mild and focal microhemorrhage areas in myocardial and pulmonary tissue was similar in the PESDA plus RTMCE and control groups. The percentages of positive microscopical fields in the myocardium were 0.4 and 0.7% (P = NS) in the PESDA plus RTMCE and control groups, respectively, and in the lungs they were 2.1 and 1.1%, respectively (P = NS). In this canine model, myocardial perfusion imaging obtained with PESDA and RTMCE was safe, with no alteration in cardiac rhythm or left ventricular function. Mild and focal myocardial and pulmonary microhemorrhages were observed in both groups, and may be attributed to surgical tissue manipulation.


The increasing presence of products derived from genetically modified (GM) plants in human and animal diets has led to the development of detection methods that distinguish biotechnology-derived foods from conventional ones. Conventional and real-time PCR have been used, respectively, to detect and quantify GM residues in highly processed foods. DNA extraction is a critical step in the analysis process. Factors such as DNA degradation, matrix effects, and the presence of PCR inhibitors mean that a detection or quantification limit established for a given method is restricted to the matrix used during validation and cannot be extrapolated to any other matrix outside the scope of the method. In Brazil, sausage samples were the main class of processed products in which Roundup Ready® (RR) soybean residues were detected, so validating methodologies for the detection and quantification of those residues is essential. Sausage samples were submitted to two different DNA extraction methods, a modified Wizard method and the CTAB method, and the yield and quality of both were compared. DNA samples were analyzed by conventional and real-time PCR for the detection and quantification of Roundup Ready® soybean. At least 200 ng of total sausage DNA was necessary for reliable quantification; reactions containing less DNA led to large variations in the expected GM percentage. In conventional PCR, the detection limit varied from 1.0 to 500 ng, depending on the GM soybean content of the sample. The precision, performance, and linearity were relatively high, indicating that the method used for analysis was satisfactory.
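As a rough illustration of how real-time PCR turns Ct values into a GM percentage, the sketch below applies a simplified delta-Ct calculation: the ratio of event-specific copies (e.g. the RR construct) to taxon-specific reference copies (e.g. soybean lectin). The function, its parameter names, and the assumption of equal amplification efficiency for both assays are all hypothetical simplifications, not the validated procedure of the study.

```python
def gm_percentage(ct_event, ct_reference, efficiency=2.0):
    """Estimate GM content (%) from real-time PCR Ct values.
    ct_event:     Ct of the event-specific assay (e.g. RR soybean).
    ct_reference: Ct of the taxon-specific assay (e.g. lectin).
    efficiency:   amplification factor per cycle (2.0 = ideal doubling).
    Assumes both assays share the same efficiency -- a simplification."""
    delta_ct = ct_event - ct_reference
    # each extra cycle to threshold means 'efficiency'-fold fewer copies
    return 100.0 * efficiency ** (-delta_ct)
```

In practice, laboratories calibrate against certified reference materials and assay-specific standard curves rather than assuming ideal doubling.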


The increasing interconnection of information and communication systems leads to a further rise in complexity and, with it, a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer adequate protection against intrusion attempts into IT infrastructures. Intrusion Detection Systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behavior and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in processing the enormous volume of network data optimally and in developing an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the large volume of incoming network data, continuously reconstructs network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on the Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a network normal-behavior model (NNB), and an update model. Within OptiFilter, tcpdump and SNMP traps are used to aggregate network packets and host events continuously; these aggregated packets and events are then analyzed further and transformed into connection vectors.
To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. Different approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections; the stability of the growth topology is increased through novel approaches for initializing the weight vectors and through strengthening the winner neurons; and a self-adaptive procedure is introduced to keep the model continuously up to date. In addition, the main task of the NNB model is to examine further the unknown connections detected by the EGHSOM and to verify whether they are in fact normal. However, network traffic changes constantly because of the concept-drift phenomenon, producing non-stationary network data in real time; this phenomenon is handled by the update model. The EGHSOM model detects new anomalies effectively, and the NNB model adapts well to changes in the network data. In the experimental evaluation the framework showed promising results. In the first experiment, the framework was evaluated in offline mode: OptiFilter was assessed with offline, synthetic, and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy. In the second experiment, the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully transformed the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely.
A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all the other approaches. This can be attributed to the following key points: the processing of the collected network data, the achievement of the best performance (e.g. overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
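The anomaly-detection idea behind a self-organizing-map classifier can be sketched in a few lines: train map units on normal connection vectors, then score a new vector by its quantization error (distance to the best-matching unit). The toy 1-D SOM below is an illustration only, far simpler than the EGHSOM developed in the dissertation; all names and data are invented for the example.

```python
import random

def train_som(data, n_units=4, epochs=50, lr=0.5, seed=1):
    """Tiny 1-D self-organizing map over fixed-length vectors.
    data: list of tuples (toy 'connection vectors')."""
    rng = random.Random(seed)
    weights = [list(rng.choice(data)) for _ in range(n_units)]
    for epoch in range(epochs):
        rate = lr * (1.0 - epoch / epochs)        # decaying learning rate
        for x in data:
            # winner neuron = unit closest to the sample
            w = min(weights,
                    key=lambda u: sum((a - b) ** 2 for a, b in zip(u, x)))
            for k in range(len(w)):
                w[k] += rate * (x[k] - w[k])      # pull winner toward x
    return weights

def anomaly_score(weights, x):
    """Quantization error: distance to the best-matching unit.
    Large scores flag vectors unlike anything seen during training."""
    return min(sum((a - b) ** 2 for a, b in zip(u, x))
               for u in weights) ** 0.5
```

A real deployment would grow the map hierarchically, apply a confidence threshold to the score, and retrain continuously to follow concept drift, as the dissertation describes.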


In this report, a face recognition system capable of detecting and recognizing frontal and rotated faces was developed. Two face recognition methods focusing on pose invariance are presented and evaluated: the whole-face approach and the component-based approach. The main challenge of this project is to develop a system that can identify faces under different viewing angles in real time; such a system would enhance the capability and robustness of current face recognition technology. The whole-face approach recognizes faces by classifying a single feature vector consisting of the gray values of the whole face image. The component-based approach first locates and extracts the facial components; these components are normalized and combined into a single feature vector for classification. A Support Vector Machine (SVM) is used as the classifier for both approaches. Extensive tests of robustness against pose changes are performed on a database that includes faces rotated up to about 40 degrees in depth. The component-based approach clearly outperforms the whole-face approach on all tests. Although it is proven to be more reliable, it is still too slow for real-time applications. For that reason, a real-time face recognition system using the whole-face approach is implemented to recognize people in color video sequences.
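The whole-face pipeline boils down to two steps: flatten the gray values of a face image into one feature vector, then train a linear classifier on those vectors. As a dependency-free sketch, a plain perceptron stands in for the SVM below; the 2x2 "images" and labels are invented for illustration.

```python
def train_perceptron(X, y, epochs=100):
    """Train a linear classifier on flattened gray-value vectors.
    X: list of feature vectors; y: labels, +1 or -1.
    A perceptron stands in for the report's SVM to keep this minimal."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, t in zip(X, y):
            if t * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                # misclassified: nudge the hyperplane toward the sample
                w = [wi + t * xi for wi, xi in zip(w, x)]
                b += t
                errors += 1
        if errors == 0:      # separable data: training has converged
            break
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

An SVM would additionally maximize the margin of the separating hyperplane, which is what gives the approach its robustness on borderline poses.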


At the time of a customer order, the e-tailer assigns the order to one or more of its order fulfillment centers, and/or to drop shippers, so as to minimize procurement and transportation costs based on the currently available information. However, this assignment is necessarily myopic, as it cannot account for future events such as subsequent customer orders or inventory replenishments. We examine the potential benefits of periodically re-evaluating these real-time order-assignment decisions. We construct near-optimal heuristics for re-assigning a large set of customer orders with the objective of minimizing the total number of shipments. We investigate how best to implement these heuristics over a rolling horizon, and discuss the effect of demand correlation, customer order size, and the number of customer orders on the nature of the heuristics. Finally, we present potential savings opportunities by testing the heuristics on sets of order data from a major e-tailer.
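The shipment-minimizing objective can be made concrete with a toy greedy heuristic (invented for this sketch, not the paper's near-optimal heuristics): serve each order from a single fulfillment center whenever one can cover it, and only split an order across centers when no single center has full stock.

```python
def assign_orders(orders, inventory):
    """Greedy assignment that keeps the shipment count low.
    orders:    list of {item: qty} dicts, processed in sequence.
    inventory: {center: {item: qty}} -- mutated as stock is consumed.
    Returns (assignments, total_shipments); one shipment per center used."""
    assignments, shipments = [], 0
    for order in orders:
        # prefer any center that can ship the whole order in one box
        whole = [dc for dc, stock in inventory.items()
                 if all(stock.get(it, 0) >= q for it, q in order.items())]
        if whole:
            dc = whole[0]
            for it, q in order.items():
                inventory[dc][it] -= q
            assignments.append({dc: dict(order)})
            shipments += 1
        else:
            # split: draw as much as possible from each center in turn
            split, remaining = {}, dict(order)
            for dc, stock in inventory.items():
                part = {}
                for it in list(remaining):
                    take = min(stock.get(it, 0), remaining[it])
                    if take:
                        stock[it] -= take
                        remaining[it] -= take
                        if remaining[it] == 0:
                            del remaining[it]
                        part[it] = take
                if part:
                    split[dc] = part
                    shipments += 1
                if not remaining:
                    break
            assignments.append(split)
    return assignments, shipments
```

Re-evaluation in the paper's sense would periodically rerun an assignment like this over all open orders, exploiting consolidation opportunities that the myopic first pass missed.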


We develop an extension to the tactical planning model (TPM) for a job shop by the third author. The TPM is a discrete-time model in which all transitions occur at the start of each time period. The time period must be defined appropriately in order for the model to be meaningful. Each period must be short enough so that a job is unlikely to travel through more than one station in one period. At the same time, the time period needs to be long enough to justify the assumptions of continuous workflow and Markovian job movements. We build an extension to the TPM that overcomes this restriction of period sizing by permitting production control over shorter time intervals. We achieve this by deriving a continuous-time linear control rule for a single station. We then determine the first two moments of the production level and queue length for the workstation.
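A continuous-time linear control rule for a single station can be illustrated with a small simulation: production is released at a rate proportional to the queue, P(t) = Q(t)/T, with planned lead time T. The discretization, parameter values, and Bernoulli approximation of arrivals below are illustrative assumptions, not the paper's derivation, and the moments are sample estimates rather than the closed-form ones the paper obtains.

```python
import random

def simulate_station(arrival_rate, lead_time, dt=0.01, horizon=1000.0, seed=7):
    """Simulate one workstation under the linear rule P(t) = Q(t) / T.
    Arrivals are approximated as Bernoulli events per small step dt.
    Returns the sample mean and variance of the production rate."""
    rng = random.Random(seed)
    q = 0.0                                  # work in queue
    prod = []
    for _ in range(int(horizon / dt)):
        if rng.random() < arrival_rate * dt:  # one job arrives this step
            q += 1.0
        p = q / lead_time                     # linear control rule
        q -= p * dt                           # continuous depletion
        prod.append(p)
    m = sum(prod) / len(prod)
    var = sum((p - m) ** 2 for p in prod) / len(prod)
    return m, var
```

In steady state the mean production rate should track the arrival rate, while the variance reflects how the lead-time parameter smooths arrival noise, which is the intuition behind deriving the first two moments analytically.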


Reinforcement learning (RL) is a very suitable technique for robot learning, as it can learn in unknown environments with real-time computation. The main difficulties in adapting classic RL algorithms to robotic systems are the generalization problem and the correct observation of the Markovian state. This paper attempts to solve the generalization problem by proposing the semi-online neural-Q_learning algorithm (SONQL). The algorithm uses the classic Q_learning technique with two modifications. First, a neural network (NN) approximates the Q_function, allowing the use of continuous states and actions. Second, a database of the most representative learning samples accelerates and stabilizes the convergence. The term semi-online refers to the fact that the algorithm uses not only the current but also past learning samples; nevertheless, it is able to learn in real time while the robot is interacting with the environment. The paper shows simulated results with the "mountain-car" benchmark as well as real results with an underwater robot in a target-following behavior.
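The "semi-online" idea, replaying a database of past samples alongside each new experience, can be sketched on a toy problem. Below, Q-learning on a 5-state chain (move left or right, reward at the right end) replays a batch of stored transitions after every real step. A lookup table stands in for SONQL's neural Q-function, and all parameters are invented for the example.

```python
import random

def q_learning_with_replay(episodes=300, alpha=0.2, gamma=0.9,
                           eps=0.2, replay=16, seed=3):
    """Q-learning with an experience database on a 5-state chain.
    Actions: 0 = left, 1 = right; reaching state 4 yields reward 1."""
    rng = random.Random(seed)
    n_states, goal = 5, 4
    q = [[0.0, 0.0] for _ in range(n_states)]
    db = []                                  # database of past samples

    def update(s, a, r, s2):
        target = r + gamma * max(q[s2])
        q[s][a] += alpha * (target - q[s][a])

    for _ in range(episodes):
        s = 0
        while s != goal:
            # epsilon-greedy action; random on unlearned ties
            if rng.random() < eps or q[s][0] == q[s][1]:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            s2 = min(max(s + (1 if a == 1 else -1), 0), n_states - 1)
            r = 1.0 if s2 == goal else 0.0
            db.append((s, a, r, s2))
            update(s, a, r, s2)
            # replay: reuse past samples to speed and stabilize learning
            for sample in rng.sample(db, min(replay, len(db))):
                update(*sample)
            s = s2
    return q
```

Swapping the table for a function approximator trained on the same replayed batches is precisely the step that lets the approach handle continuous underwater-robot states.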


This paper deals with the problem of navigation for an unmanned underwater vehicle (UUV) through image mosaicking. It represents a first step towards a real-time vision-based navigation system for a small-class low-cost UUV. We propose a navigation system composed of: (i) an image mosaicking module which provides velocity estimates; and (ii) an extended Kalman filter based on the hydrodynamic equation of motion, previously identified for this particular UUV. The resulting system is able to estimate the position and velocity of the robot. Moreover, it can deal with the visual occlusions that usually appear when the sea bottom lacks enough visual features to solve the correspondence problem in a certain area of the trajectory.
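The filtering idea, mosaic velocity measurements correcting a dead-reckoned state, and prediction-only coasting through occlusions, can be shown with a 1-D constant-velocity Kalman filter. This is a deliberately simplified linear stand-in for the paper's EKF: the real filter propagates the identified hydrodynamic model, and all noise values below are invented.

```python
def kalman_1d(z_vels, dt=1.0, q=0.01, r=0.1):
    """1-D constant-velocity Kalman filter.
    z_vels: velocity measurements from mosaicking; None = occlusion.
    Returns the final state estimate [position, velocity]."""
    x = [0.0, 0.0]                     # state: [position, velocity]
    P = [[1.0, 0.0], [0.0, 1.0]]       # state covariance
    for z in z_vels:
        # predict with F = [[1, dt], [0, 1]]; add process noise q
        x = [x[0] + dt * x[1], x[1]]
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        P = [[p00, p01], [p10, p11]]
        if z is None:
            continue                   # occlusion: prediction only
        # update with velocity measurement, H = [0, 1]
        s = P[1][1] + r
        k0, k1 = P[0][1] / s, P[1][1] / s
        innov = z - x[1]
        x = [x[0] + k0 * innov, x[1] + k1 * innov]
        P = [[P[0][0] - k0 * P[1][0], P[0][1] - k0 * P[1][1]],
             [(1 - k1) * P[1][0], (1 - k1) * P[1][1]]]
    return x
```

During the `None` gaps the position keeps integrating the last velocity estimate while its uncertainty grows, which is exactly how the proposed system rides out featureless stretches of sea bottom.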


Real-time geoparsing of social media streams (e.g. Twitter, YouTube, Instagram, Flickr, FourSquare) is providing a new 'virtual sensor' capability to end users such as emergency response agencies (e.g. tsunami early warning centres, civil protection authorities) and news agencies (e.g. Deutsche Welle, BBC News). Challenges in this area include scaling up natural language processing (NLP) and information retrieval (IR) approaches to handle real-time traffic volumes, reducing false positives, creating real-time infographic displays useful for effective decision support, and providing support for trust and credibility analysis using geosemantics. In this seminar I will present on-going work by the IT Innovation Centre over the last 4 years (TRIDEC and REVEAL FP7 projects) in building such systems, and highlight our research towards improving the trustworthiness and credibility of crisis map displays and real-time analytics for trending topics and influential social networks during major newsworthy events.
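At its simplest, geoparsing scans a post for known place names against a gazetteer. The toy sketch below (an illustration, not the seminar's systems; the gazetteer entries and coordinates are invented) shows the baseline step, and its weakness: without NLP disambiguation, names like "Paris" in "Paris Hilton" become the false positives the talk describes.

```python
GAZETTEER = {  # toy gazetteer: place name -> (lat, lon), illustrative only
    "london": (51.51, -0.13),
    "tokyo": (35.68, 139.69),
    "lisbon": (38.72, -9.14),
}

def geoparse(text):
    """Minimal gazetteer-based geoparser for a social-media post.
    Returns (name, (lat, lon)) for each recognized place mention."""
    tokens = [t.strip(".,!?#@()").lower() for t in text.split()]
    return [(t, GAZETTEER[t]) for t in tokens if t in GAZETTEER]
```

Production geoparsers layer named-entity recognition, context-based disambiguation, and geosemantic credibility scoring on top of this lookup.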


Aims: All members of the ruminal Butyrivibrio group convert linoleic acid (cis-9,cis-12-18 : 2) via conjugated 18 : 2 metabolites (mainly cis-9,trans-11-18 : 2, conjugated linoleic acid) to vaccenic acid (trans-11-18 : 1), but only members of a small branch of this heterogeneous group, which includes Clostridium proteoclasticum, further reduce vaccenic acid to stearic acid (18 : 0, SA). The aims of this study were to develop a real-time polymerase chain reaction (PCR) assay that would detect and quantify these key SA producers, and to use this method to detect diet-associated changes in their populations in ruminal digesta of lactating cows. Materials and Results: Primers targeting the 16S rRNA gene of Cl. proteoclasticum were not sufficiently specific when only binding dyes were used for detection in real-time PCR, because their sequences were too similar to those of some nonproducing strains. A molecular beacon probe was therefore designed to detect and quantify the 16S rRNA genes of the Cl. proteoclasticum subgroup specifically. The probe was characterized by its melting curve and validated using five SA-producing and ten nonproducing Butyrivibrio-like strains and 13 other common ruminal bacteria. Analysis of ruminal digesta collected from dairy cows fed different proportions of starch and fibre indicated a Cl. proteoclasticum population of 2-9% of the eubacterial community. The influence of diet on the numbers of these bacteria was smaller than the variation between individual cows. Conclusion: A molecular beacon approach in qPCR enables the detection of Cl. proteoclasticum in ruminal digesta; their numbers are highly variable between individual animals. Significance and Impact of the Study: SA producers are fundamental to the flow of polyunsaturated fatty acid and vaccenic acid from the rumen. The method described here enabled preliminary information to be obtained about the size of this population.
Further application of the method to digesta samples from cows fed diets of more variable composition should enable us to understand how to control these bacteria in order to enhance the nutritional characteristics of ruminant-derived foods, including milk and beef.