16 results for streaming SIMD extensions

at Universidade do Minho


Relevance:

20.00%

Publisher:

Abstract:

Oceans have a tremendous importance and impact on our lives, and the need for monitoring and protecting them has grown rapidly in recent years. Oceans also hold economic and industrial potential in areas such as pharmaceuticals, oil, minerals and biodiversity. As this demand increases, high-data-rate, near real-time communication between submerged agents becomes of paramount importance. Among the needs for underwater communications, video streaming (e.g. for inspecting risers or hydrothermal vents) can be seen as the top challenge; once solved, it will make all the other applications possible. Presently, the only reliable approach to underwater video streaming relies on wired connections or tethers (e.g. from ROVs to the surface), which impose severe operational constraints and make acoustic links, together with AUVs and sensor networks, strongly appealing. Using new polymer-based acoustic transducers, which recent work has shown to have much higher bandwidth and power efficiency than the usual ceramics, this article proposes the development of a reprogrammable acoustic modem with video streaming capabilities for underwater communications. The results show a maximum data rate of 1 Mbps with a simple modulation scheme such as OOK, at a distance of 20 m.
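
On-off keying (OOK) simply transmits the carrier for a '1' bit and silence for a '0' bit, which is what makes it attractive for a simple reprogrammable modem. The following is a minimal baseband sketch in Python/NumPy; the carrier frequency, sample rate and bit rate are illustrative values, not the parameters of the modem described above.

    import numpy as np

    def ook_modulate(bits, fc=200_000, fs=1_000_000, bit_rate=100_000):
        """On-off keying: carrier present for '1', silence for '0'.
        fc, fs and bit_rate are illustrative, not the paper's values."""
        samples_per_bit = fs // bit_rate
        t = np.arange(samples_per_bit) / fs
        carrier = np.sin(2 * np.pi * fc * t)
        return np.concatenate([bit * carrier for bit in bits])

    def ook_demodulate(signal, fs=1_000_000, bit_rate=100_000, threshold=0.25):
        """Non-coherent detection: mean energy per bit slot vs. a threshold."""
        samples_per_bit = fs // bit_rate
        n_bits = len(signal) // samples_per_bit
        chunks = signal[:n_bits * samples_per_bit].reshape(n_bits, samples_per_bit)
        energy = np.mean(chunks ** 2, axis=1)
        return (energy > threshold).astype(int)

    bits = np.random.randint(0, 2, 64)
    assert np.array_equal(bits, ook_demodulate(ook_modulate(bits)))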

Relevance:

20.00%

Publisher:

Abstract:

Dataflow programs are widely used. Each program is a directed graph where nodes are computations and edges indicate the flow of data. In prior work, we reverse-engineered legacy dataflow programs by deriving their optimized implementations from a simple specification graph using graph transformations called refinements and optimizations. In MDE terms, our derivations were PIM-to-PSM mappings. In this paper, we show how extensions complement refinements, optimizations, and PIM-to-PSM derivations to make the process of reverse engineering complex legacy dataflow programs tractable. We explain how optional functionality can be encoded in transformations, thereby enabling us to encode product lines of transformations as well as product lines of dataflow programs. We describe the implementation of extensions in the ReFlO tool and present two non-trivial case studies as evidence of our work's generality.
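
To make the idea of a refinement concrete, here is a toy sketch (not the ReFlO implementation; all node names and the rewrite rule are illustrative) of a dataflow graph in which an abstract node is replaced by a behaviour-preserving subgraph, in the spirit of a PIM-to-PSM rewrite.

    # Toy dataflow graph plus one refinement rewrite (illustrative only).
    class Graph:
        def __init__(self):
            self.nodes = {}          # name -> operation label
            self.edges = []          # (producer, consumer) pairs

        def add_node(self, name, op):
            self.nodes[name] = op

        def add_edge(self, src, dst):
            self.edges.append((src, dst))

    def refine_sort(g, node):
        """Refinement: replace an abstract SORT node by a
        split -> local sort -> merge subgraph (a classic dataflow rewrite)."""
        assert g.nodes[node] == "SORT"
        preds = [s for s, d in g.edges if d == node]
        succs = [d for s, d in g.edges if s == node]
        g.edges = [(s, d) for s, d in g.edges if node not in (s, d)]
        del g.nodes[node]
        for name, op in [("split", "SPLIT"), ("sort1", "LOCAL_SORT"),
                         ("sort2", "LOCAL_SORT"), ("merge", "MERGE")]:
            g.add_node(name, op)
        g.add_edge("split", "sort1"); g.add_edge("split", "sort2")
        g.add_edge("sort1", "merge"); g.add_edge("sort2", "merge")
        for p in preds:
            g.add_edge(p, "split")
        for s in succs:
            g.add_edge("merge", s)

    g = Graph()
    g.add_node("in", "SOURCE"); g.add_node("s", "SORT"); g.add_node("out", "SINK")
    g.add_edge("in", "s"); g.add_edge("s", "out")
    refine_sort(g, "s")   # same observable behaviour, more implementation detail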

Relevance:

10.00%

Publisher:

Abstract:

Business Intelligence (BI) can be seen as a method that gathers information and data from information systems to help companies be more accurate in their decision-making process. Traditionally, BI systems have been associated with the use of Data Warehouses (DW), whose prime purpose is to serve as a repository storing all the information required for making correct decisions. The need to integrate streaming data became crucial in order to improve the efficiency and effectiveness of the decision process. In primary and secondary education there is a lack of BI solutions. Given this reality in schools, the main purpose of this study is to provide a Pervasive BI solution able to monitor school and student data anywhere, anytime and in real time, and to disseminate the information through ubiquitous devices. The first task consisted of gathering data on the different choices made by each student, from enrolment in a given school year until its end. A dimensional model was then developed so that a BI platform could be built. This paper presents the dimensional model, a set of pre-defined indicators, the characteristics of Pervasive Business Intelligence and the prototype designed. The main contribution of this study is to offer schools a tool that helps them make accurate decisions in real time. Data dissemination is achieved through a localized application that can be accessed anywhere and anytime.
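
A dimensional model for this domain would typically take the form of a star schema: a fact table of enrolment/choice events surrounded by dimension tables. The sketch below uses hypothetical table and field names chosen for illustration; it is not the schema developed in the study.

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical star schema: one fact table plus its dimensions.
    @dataclass
    class DimStudent:
        student_key: int
        gender: str
        birth_year: int

    @dataclass
    class DimSchool:
        school_key: int
        name: str
        municipality: str

    @dataclass
    class DimDate:
        date_key: int
        calendar_date: date
        school_year: str

    @dataclass
    class FactEnrolment:
        student_key: int          # FK -> DimStudent
        school_key: int           # FK -> DimSchool
        date_key: int             # FK -> DimDate
        course_choice: str        # the choice made by the student
        status: str               # e.g. enrolled, transferred, completed

A pre-defined indicator such as "enrolments per school and school year" then becomes a simple aggregation over the fact table grouped by the relevant dimension keys.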

Relevance:

10.00%

Publisher:

Abstract:

The receiver-operating characteristic (ROC) curve is the most widely used measure for evaluating the performance of a diagnostic biomarker when predicting a binary disease outcome. The ROC curve displays the true positive rate (or sensitivity) and the false positive rate (or 1-specificity) for different cut-off values used to classify an individual as healthy or diseased. In time-to-event studies, however, the disease status (e.g. dead or alive) of an individual is not a fixed characteristic, and it varies over the course of the study. In such cases, when evaluating the performance of the biomarker, several issues should be taken into account: first, the time-dependent nature of the disease status; and second, the presence of incomplete data (e.g. censored data typically present in survival studies). Accordingly, to assess the discrimination power of continuous biomarkers for time-dependent disease outcomes, time-dependent extensions of the true positive rate, false positive rate, and ROC curve have recently been proposed. In this work, we present new nonparametric estimators of the cumulative/dynamic time-dependent ROC curve that account for the possible modifying effect of current or past covariate measures on the discriminatory power of the biomarker. The proposed estimators can accommodate right-censored data, as well as covariate-dependent censoring. The behavior of the proposed estimators is explored through simulations and illustrated using data from a cohort of patients who suffered from acute coronary syndrome.
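
For reference, the cumulative/dynamic definitions underlying these extensions can be written as follows, with biomarker Y, event time T, cut-off c and time horizon t; this is the standard formulation, not the specific estimators proposed in the work above.

    \mathrm{TP}_t(c) = P(Y > c \mid T \le t), \qquad
    \mathrm{FP}_t(c) = P(Y > c \mid T > t),

    \mathrm{ROC}_t^{C/D}(p) = \mathrm{TP}_t\!\left( \mathrm{FP}_t^{-1}(p) \right), \qquad p \in (0,1).

A covariate-specific version conditions both probabilities additionally on the covariate value X = x, which is the quantity the proposed nonparametric estimators target under right censoring.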

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation in Industrial Economics and Business (Economia Industrial e da Empresa)

Relevance:

10.00%

Publisher:

Abstract:

The ATLAS experiment at the LHC has measured the Higgs boson couplings and mass, and searched for invisible Higgs boson decays, using multiple production and decay channels with up to 4.7 fb−1 of pp collision data at √s = 7 TeV and 20.3 fb−1 at √s = 8 TeV. In the current study, the measured production and decay rates of the observed Higgs boson in the γγ, ZZ, WW, Zγ, bb, ττ, and μμ decay channels, along with results from the associated production of a Higgs boson with a top-quark pair, are used to probe the scaling of the couplings with mass. Limits are set on parameters in extensions of the Standard Model including a composite Higgs boson, an additional electroweak singlet, and two-Higgs-doublet models. Together with the measured mass of the scalar Higgs boson in the γγ and ZZ decay modes, a lower limit is set on the pseudoscalar Higgs boson mass of mA > 370 GeV in the "hMSSM" simplified Minimal Supersymmetric Standard Model. Results from direct searches for heavy Higgs bosons are also interpreted in the hMSSM. Direct searches for invisible Higgs boson decays in the vector-boson fusion and associated production of a Higgs boson with W/Z (Z → ℓℓ, W/Z → jj) modes are statistically combined to set an upper limit on the Higgs boson invisible branching ratio of 0.25. The use of the measured visible decay rates in a more general coupling fit improves the upper limit to 0.23, constraining a Higgs portal model of dark matter.
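
For context, fits of the coupling scaling with mass are often expressed with a two-parameter (ε, M) parametrization of the coupling modifiers for fermions and vector bosons; the form below is a commonly used one and is quoted here as background rather than taken from the paper itself, with the Standard Model recovered for ε = 0 and M = v ≈ 246 GeV.

    \kappa_{F,i} = v\,\frac{m_{F,i}^{\,\epsilon}}{M^{1+\epsilon}}, \qquad
    \kappa_{V,j} = v\,\frac{m_{V,j}^{\,2\epsilon}}{M^{1+2\epsilon}}.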

Relevance:

10.00%

Publisher:

Abstract:

Many extensions of the Standard Model predict the existence of charged heavy long-lived particles, such as R-hadrons or charginos. These particles, if produced at the Large Hadron Collider, should be moving non-relativistically and are therefore identifiable through the measurement of an anomalously large specific energy loss in the ATLAS pixel detector. Measuring heavy long-lived particles through their track parameters in the vicinity of the interaction vertex provides sensitivity to metastable particles with lifetimes from 0.6 ns to 30 ns. A search for such particles with the ATLAS detector at the Large Hadron Collider is presented, based on a data sample corresponding to an integrated luminosity of 18.4 fb−1 of pp collisions at √s = 8 TeV. No significant deviation from the Standard Model background expectation is observed, and lifetime-dependent upper limits on R-hadron and chargino production are set. Gluino R-hadrons with 10 ns lifetime and masses up to 1185 GeV are excluded at 95% confidence level, and so are charginos with 15 ns lifetime and masses up to 482 GeV.

Relevance:

10.00%

Publisher:

Abstract:

Many extensions of the Standard Model posit the existence of heavy particles with long lifetimes. This article presents the results of a search for events containing at least one long-lived particle that decays at a significant distance from its production point into two leptons or into five or more charged particles. This analysis uses a data sample of proton-proton collisions at √s = 8 TeV corresponding to an integrated luminosity of 20.3 fb−1 collected in 2012 by the ATLAS detector operating at the Large Hadron Collider. No events are observed in any of the signal regions, and limits are set on model parameters within supersymmetric scenarios involving R-parity violation, split supersymmetry, and gauge mediation. In some of the search channels, the trigger and search strategy are based only on the decay products of individual long-lived particles, irrespective of the rest of the event. In these cases, the provided limits can easily be reinterpreted in different scenarios.

Relevance:

10.00%

Publisher:

Abstract:

PhD thesis (Doctoral Programme in Biomedical Engineering)

Relevance:

10.00%

Publisher:

Abstract:

Integrated master's dissertation in Civil Engineering

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation in Special Education (specialization in Specific Learning Difficulties)

Relevance:

10.00%

Publisher:

Abstract:

Mechanical ventilation is an artificial way of helping a patient to breathe. This procedure is used to support patients with respiratory diseases; however, in many cases it can cause lung damage, acute respiratory diseases or organ failure. With the goal of detecting potential breathing problems early, a set of limit values was defined for some of the variables monitored by the ventilator (Average Ventilation Pressure, Dynamic Compliance, Flow, Peak, Plateau and Support Pressure, Positive End-Expiratory Pressure, Respiratory Rate) in order to create critical events. A critical event occurs when a patient's value stays above or below the normal range for a certain period of time. The limits were defined after a literature review and meetings with physicians specialized in the area. This work uses data streaming and intelligent agents to process the values collected in real time and classify them as critical or not. Real data provided by an Intensive Care Unit were used to design and test the solution. This study made it possible to understand the importance of introducing critical events for mechanically ventilated patients. In some cases a value is considered critical (and can trigger an alarm) even though it is a single, instantaneous event with no clinical significance for the patient. The introduction of critical events, which combine a value range with a pre-defined duration, improves the decision-making process by decreasing the number of false positives and providing a better understanding of the patient's condition.
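
The critical-event rule described above (a value outside its normal range for longer than a pre-defined duration) can be sketched as a small streaming check; the variable names, thresholds and minimum duration below are illustrative, not the clinically validated limits used in the study.

    from datetime import datetime, timedelta

    # Illustrative limits only (not the clinically validated ones).
    NORMAL_RANGE = {"respiratory_rate": (8, 25), "peep": (4, 10)}
    MIN_DURATION = timedelta(minutes=5)

    class CriticalEventDetector:
        """Flags a critical event when a variable stays out of range for at
        least MIN_DURATION; single out-of-range readings are ignored."""
        def __init__(self):
            self.out_of_range_since = {}   # variable -> first out-of-range timestamp

        def observe(self, variable, value, timestamp):
            low, high = NORMAL_RANGE[variable]
            if low <= value <= high:
                self.out_of_range_since.pop(variable, None)
                return False
            start = self.out_of_range_since.setdefault(variable, timestamp)
            return timestamp - start >= MIN_DURATION   # True -> critical event

    detector = CriticalEventDetector()
    t0 = datetime(2015, 1, 1, 12, 0)
    for minute in range(7):
        critical = detector.observe("respiratory_rate", 32, t0 + timedelta(minutes=minute))
    # 'critical' becomes True once the value has been out of range for 5 minutes.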

Relevance:

10.00%

Publisher:

Abstract:

Real-time data acquisition is fundamental to providing appropriate services and improving health professionals' decisions. This paper presents a pervasive, adaptive architecture for acquiring data from medical devices (e.g. vital signs monitors, ventilators and sensors). The architecture was deployed in a real context, an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network. The paper uses the ventilation acquisition process as an example. The clients are installed on a machine near the patient's bed and connected to the ventilators; the monitored data are sent to a multithreaded server, which records them in the database using Health Level Seven (HL7) protocols. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant mechanism that ensures data are stored in the database even if the agents become disconnected. The gateway is pervasive, universal and interoperable, and it can adapt to any service that uses streaming data.
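
A collection agent with the buffer-while-disconnected behaviour described above can be sketched as follows. The host/port, JSON message format and method names are hypothetical, for illustration only; the real gateway exchanges HL7 messages with the multithreaded server rather than the plain JSON used here.

    import json, socket, time
    from collections import deque

    class VentilationAgent:
        """Collects ventilator readings and forwards them to the gateway server;
        readings are buffered locally while the connection is down."""
        def __init__(self, server_host, server_port):
            self.server = (server_host, server_port)
            self.buffer = deque()                # local store while disconnected

        def collect(self, reading):
            """reading: dict of monitored variables, e.g. {'peep': 6, 'rr': 18}."""
            self.buffer.append({"ts": time.time(), "data": reading})
            self._flush()

        def _flush(self):
            while self.buffer:
                item = self.buffer[0]
                try:
                    with socket.create_connection(self.server, timeout=2) as s:
                        s.sendall((json.dumps(item) + "\n").encode())
                    self.buffer.popleft()        # sent successfully, drop from buffer
                except OSError:
                    return                       # server unreachable: keep buffering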

Relevance:

10.00%

Publisher:

Abstract:

Hospitals have multiple data sources, such as embedded systems, monitors and sensors. The amount of available data is increasing, and the information is used not only to care for the patient but also to support decision processes. Intelligent environments have been adopted in health care institutions because of their ability to provide useful information to health professionals, whether in helping to identify a prognosis or in understanding the patient's condition. From this concept arises an intelligent system to track patient condition (e.g. critical events) in health care. This system has the great advantage of being adaptable to the environment and to user needs. It focuses on identifying critical events from streaming data (e.g. vital signs and ventilation), which is particularly valuable for understanding the patient's condition. This work aims to demonstrate the process of creating an intelligent system capable of operating in a real environment using streaming data provided by ventilators and vital signs monitors. Its development matters to physicians because it becomes possible to cross multiple variables in real time, analysing whether a value is critical and whether its variation is clinically relevant.
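
Crossing multiple variables in real time amounts to keeping the latest value of each stream and re-evaluating cross-variable rules on every update. The sketch below uses hypothetical variable names, thresholds and rule names chosen for illustration, not the rules of the system described.

    # Minimal sketch: latest value per monitored stream plus cross-variable rules.
    latest = {}

    def rule_hypoxia_with_high_pressure(v):
        # Both conditions must hold at the same time (thresholds are illustrative).
        return v.get("spo2", 100) < 90 and v.get("peak_pressure", 0) > 35

    RULES = {"hypoxia + high airway pressure": rule_hypoxia_with_high_pressure}

    def on_update(variable, value):
        latest[variable] = value
        return [name for name, rule in RULES.items() if rule(latest)]

    on_update("spo2", 88)                     # -> []  (only one condition met)
    alerts = on_update("peak_pressure", 40)   # -> ['hypoxia + high airway pressure']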