798 results for Data-Intensive Science


Relevance:

30.00%

Publisher:

Abstract:

This article revisits Michel Chevalier's work and his discussions of tariffs. Chevalier shifted from Saint-Simonism to economic liberalism over the course of the 19th century. His influence was soon felt in the political world and in economic debates, mainly because of his discussion of tariffs as instruments of efficient transport policies. This work examines Chevalier's thinking on tariffs by revisiting his masterpiece, Le Cours d'Économie Politique. Data Envelopment Analysis (DEA) was conducted to test Chevalier's hypothesis on the inefficiency of French tariffs. The analysis shows that Chevalier's claims about French tariffs are not validated by DEA.
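
For readers unfamiliar with DEA, the sketch below shows an input-oriented CCR efficiency model solved as a linear program; the decision-making units, inputs and outputs are illustrative placeholders, not the French tariff data analysed in the article.

    import numpy as np
    from scipy.optimize import linprog

    def dea_ccr_input(X, Y):
        """Input-oriented CCR DEA efficiency scores.
        X: inputs, shape (n_dmus, n_inputs); Y: outputs, shape (n_dmus, n_outputs)."""
        n, m = X.shape
        s = Y.shape[1]
        scores = []
        for o in range(n):
            # Variables: [theta, lambda_1 .. lambda_n]
            c = np.zeros(n + 1)
            c[0] = 1.0                                   # minimise theta
            A_ub = np.zeros((m + s, n + 1))
            b_ub = np.zeros(m + s)
            A_ub[:m, 0] = -X[o]                          # sum_j lambda_j x_ij <= theta * x_io
            A_ub[:m, 1:] = X.T
            A_ub[m:, 1:] = -Y.T                          # sum_j lambda_j y_rj >= y_ro
            b_ub[m:] = -Y[o]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(0, None)] * (n + 1), method="highs")
            scores.append(res.x[0])
        return np.array(scores)

    # Three illustrative units with one input and one output each.
    X = np.array([[2.0], [4.0], [8.0]])
    Y = np.array([[1.0], [2.0], [3.0]])
    print(dea_ccr_input(X, Y))   # a score of 1.0 marks units on the efficient frontier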

Relevance:

30.00%

Publisher:

Abstract:

FOSTER aims to support different stakeholders, especially young researchers, in adopting open access in the context of the European Research Area (ERA) and in complying with the open access policies and rules of participation set out for Horizon 2020 (H2020). FOSTER establishes a Europe-wide training programme on open access and open data, consolidating training activities at the downstream level and reaching diverse disciplinary communities and countries in the ERA. The training programme includes different approaches and delivery options: e-learning, blended learning, self-learning, dissemination of training materials and contents, a helpdesk, face-to-face training (especially training-the-trainers), summer schools, seminars, etc.

Relevance:

30.00%

Publisher:

Abstract:

Integrated master's dissertation in Information Systems Engineering and Management

Relevance:

30.00%

Publisher:

Abstract:

Integrated master's dissertation in Information Systems Engineering and Management

Relevance:

30.00%

Publisher:

Abstract:

Integrated master's dissertation in Information Systems Engineering and Management

Relevance:

30.00%

Publisher:

Abstract:

Large-scale distributed data stores rely on optimistic replication to scale and remain highly available in the face of network partitions. Managing data without coordination results in eventually consistent data stores that allow for concurrent data updates. These systems often use anti-entropy mechanisms (like Merkle Trees) to detect and repair divergent data versions across nodes. However, in practice hash-based data structures are too expensive for large amounts of data and create too many false conflicts. Another aspect of eventual consistency is detecting write conflicts. Logical clocks are often used to track data causality, necessary to detect causally concurrent writes on the same key. However, there is a non-negligible metadata overhead per key, which also keeps growing with time, proportional to the node churn rate. Another challenge is deleting keys while respecting causality: while the values can be deleted, per-key metadata cannot be permanently removed without coordination. We introduce a new causality management framework for eventually consistent data stores that leverages node logical clocks (Bitmapped Version Vectors) and a new key logical clock (Dotted Causal Container) to provide advantages on multiple fronts: 1) a new efficient and lightweight anti-entropy mechanism; 2) greatly reduced per-key causality metadata size; 3) accurate key deletes without permanent metadata.
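
As a minimal illustration of the causality check that such key logical clocks support (not the paper's Bitmapped Version Vectors or Dotted Causal Containers, just plain per-key version vectors), the sketch below shows how two writes are classified as causally concurrent when neither clock dominates the other; the node names and counters are made up for the example.

    # Version vectors represented as {node_id: counter} maps (illustrative only).
    def dominates(a, b):
        """True if clock a has seen every event recorded in clock b."""
        return all(a.get(node, 0) >= count for node, count in b.items())

    def concurrent(a, b):
        """Two writes on the same key conflict when neither clock dominates the other."""
        return not dominates(a, b) and not dominates(b, a)

    v1 = {"nodeA": 2, "nodeB": 1}
    v2 = {"nodeA": 1, "nodeB": 2}
    print(concurrent(v1, v2))  # True: causally concurrent writes, kept as siblings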

Relevance:

30.00%

Publisher:

Abstract:

Genome-scale metabolic models are valuable tools in the metabolic engineering process, owing to their ability to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of a biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of a core metabolic model; (7) generation of a biomass composition reaction; (8) completion of the draft metabolic model; (9) curation of the metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
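
Once a model of this kind has been reconstructed, flux balance analysis is a common way to turn it into predictions of growth or pathway utilization. The sketch below is a minimal, assumed example with a toy three-reaction network, not any specific published reconstruction.

    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix: rows = metabolites (A, B),
    # columns = reactions (uptake of A, A -> B, biomass drain of B).
    S = np.array([
        [1, -1,  0],
        [0,  1, -1],
    ])
    bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake capped at 10 flux units
    c = np.array([0, 0, -1.0])                 # linprog minimises, so negate biomass flux

    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    print("maximum biomass flux:", -res.fun)   # 10.0 under these toy bounds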

Relevance:

30.00%

Publisher:

Abstract:

Mechanical ventilation is an artificial way to help a patient breathe. This procedure is used to support patients with respiratory diseases; however, in many cases it can provoke lung damage, acute respiratory diseases or organ failure. With the goal of detecting possible patient breathing problems early, a set of limit values was defined for some of the variables monitored by the ventilator (Average Ventilation Pressure, Dynamic Compliance, Flow, Peak, Plateau and Support Pressure, Positive End-Expiratory Pressure, Respiratory Rate) in order to create critical events. A critical event is registered when a patient has a value higher or lower than the normal range for a certain period of time. The values were defined after a literature review and meetings with physicians specialized in the area. This work uses data streaming and intelligent agents to process the values collected in real time and classify them as critical or not. Real data provided by an Intensive Care Unit were used to design and test the solution. In this study it was possible to understand the importance of introducing critical events for mechanically ventilated patients. In some cases a value is considered critical (and can trigger an alarm) even though it is a single, instantaneous event with no clinical significance for the patient. The introduction of critical events, which combine a range of values with a pre-defined duration, improves the decision-making process by decreasing the number of false positives and giving a better comprehension of the patient's condition.
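
A minimal sketch of the critical-event idea described above: a monitored value only becomes a critical event when it stays outside its normal range for a minimum duration. The respiratory-rate limits and duration below are illustrative assumptions, not the thresholds defined in the study.

    def critical_events(readings, low, high, min_duration):
        """readings: list of (timestamp_in_seconds, value); returns (start, end) spans."""
        events, start, prev_t = [], None, None
        for t, value in readings:
            out_of_range = value < low or value > high
            if out_of_range and start is None:
                start = t                                  # excursion begins
            elif not out_of_range and start is not None:
                if prev_t - start >= min_duration:
                    events.append((start, prev_t))         # sustained: keep it
                start = None                               # instantaneous: discard
            prev_t = t
        if start is not None and prev_t - start >= min_duration:
            events.append((start, prev_t))
        return events

    # Respiratory rate sampled every 60 s; only the sustained excursion is reported.
    rr = [(0, 16), (60, 35), (120, 36), (180, 37), (240, 15)]
    print(critical_events(rr, low=8, high=30, min_duration=120))   # [(60, 180)]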

Relevance:

30.00%

Publisher:

Abstract:

This research work explores a new way of presenting and representing information about patients in critical care: the use of a timeline to display information. This is accomplished with the development of an interactive Pervasive Patient Timeline able to give intensivists real-time access to an environment containing patients' clinical information from the moment they are admitted to the Intensive Care Unit (ICU) until their discharge. This solution allows the intensivists to analyse data regarding vital signs, medication, exams, data mining predictions, among others. Due to its pervasive features, intensivists can access the timeline anywhere and at any time, allowing them to make decisions when they need to be made. This platform is patient-centred and is prepared to support the decision process, allowing the intensivists to provide better care to patients thanks to the inclusion of clinical forecasts.

Relevance:

30.00%

Publisher:

Abstract:

The data acquisition process in real time is fundamental to provide appropriate services and to improve health professionals' decision making. In this paper a pervasive, adaptive data acquisition architecture for medical devices (e.g. vital signs monitors, ventilators and sensors) is presented. The architecture was deployed in a real context in an Intensive Care Unit, where it provides clinical data in real time to the INTCare system. The gateway is composed of several agents able to collect a set of patient variables (vital signs, ventilation) across the network. The paper shows the ventilation acquisition process as an example. The clients are installed in a machine near the patient's bed. They are connected to the ventilators, and the monitored data are sent to a multithreading server which, using Health Level Seven (HL7) protocols, records the data in the database. The agents associated with the gateway are able to collect, analyse, interpret and store the data in the repository. The gateway includes a fault-tolerant system that ensures data are stored in the database even if the agents are disconnected. The gateway is pervasive, universal and interoperable, and it is able to adapt to any service using streaming data.
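
As a rough, assumed sketch of the acquisition pattern described above (bedside clients pushing monitored values to a multithreaded server that stores them), the example below uses plain sockets and JSON lines; the port, message format and in-memory store are placeholders, and the HL7 encoding and fault-tolerance layer of the real gateway are omitted.

    import json
    import socket
    import threading

    HOST, PORT = "127.0.0.1", 9010
    store, store_lock = [], threading.Lock()      # stand-in for the database
    ready = threading.Event()

    def handle_client(conn):
        with conn, conn.makefile() as stream:
            reading = json.loads(stream.readline())   # one JSON reading per line
            with store_lock:
                store.append(reading)

    def serve():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.bind((HOST, PORT))
            srv.listen()
            ready.set()
            conn, _ = srv.accept()                    # a single connection for the demo
            handle_client(conn)

    server = threading.Thread(target=serve)
    server.start()
    ready.wait()

    with socket.create_connection((HOST, PORT)) as client:
        client.sendall(b'{"bed": 3, "variable": "peak_pressure", "value": 24}\n')

    server.join()
    print(store)   # [{'bed': 3, 'variable': 'peak_pressure', 'value': 24}]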

Relevance:

30.00%

Publisher:

Abstract:

Decision support models in intensive care units are developed to support medical staff in their decision-making process. However, the optimization of these models is particularly difficult due to the dynamic, complex and multidisciplinary nature of the domain. Thus, there is constant research and development of new algorithms capable of extracting knowledge from large volumes of data, in order to obtain better predictive results than the current algorithms. To test the optimization techniques, a case study with real data provided by the INTCare project was explored. The data concern extubation cases. On this dataset, several models such as Evolutionary Fuzzy Rule Learning, Lazy Learning, Decision Trees and many others were analysed in order to detect early extubation. The hybrid Decision Trees Genetic Algorithm, Supervised Classifier System and KNNAdaptive models obtained the highest accuracy rates, 93.2%, 93.1% and 92.97% respectively, thus showing their feasibility to work in a real environment.
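
As a hedged illustration of the kind of model evaluation described above, the sketch below trains a plain decision tree with cross-validation on synthetic data; it is not the INTCare extubation dataset and not the hybrid Decision Trees Genetic Algorithm, Supervised Classifier System or KNNAdaptive models whose accuracies are reported.

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                     # e.g. vital-sign / ventilation features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # 1 = early extubation (synthetic rule)

    clf = DecisionTreeClassifier(max_depth=3, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print("cross-validated accuracy:", scores.mean())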

Relevance:

30.00%

Publisher:

Abstract:

Driven by concerns about rising energy costs, security of supply and climate change, a new wave of Sustainable Energy Technologies (SETs) has been embraced by the Irish consumer. Systems such as solar collectors, heat pumps and biomass boilers have become common due to government-backed financial incentives and revisions of the building regulations. However, there is a deficit of knowledge and understanding of how these technologies operate and perform under Ireland's maritime climate. The AQ-WBL project was designed to address both of these needs by developing a Data Acquisition (DAQ) system to monitor the performance of such technologies and a web-based learning environment to disseminate performance characteristics and supplementary information about these systems. A DAQ system consisting of 108 sensors was developed as part of Galway-Mayo Institute of Technology's (GMIT's) Centre for the Integration of Sustainable Energy Technologies (CiSET) in an effort to benchmark the performance of solar thermal collectors and Ground Source Heat Pumps (GSHPs) under the Irish maritime climate, research new methods of integrating these systems within the built environment and raise awareness of SETs. It has operated reliably for over 2 years and has acquired over 25 million data points. Raising awareness of these SETs is carried out through the dissemination of the performance data in an online learning environment. This learning environment was created to provide different user groups with a basic understanding of SETs with the support of performance data, through a novel 5-step learning process, and two examples were developed for the solar thermal collectors and the weather station, which can be viewed at http://www.kdp1.aquaculture.ie/index.aspx. The online learning environment has been demonstrated to and well received by different groups of GMIT's undergraduate students, and plans have been made to develop it further to support education, awareness, research and regional development.

Relevance:

30.00%

Publisher:

Abstract:

Almost half of Ireland's commercial stocks face overexploitation. As traditional species decrease in abundance and become less profitable, the industry is increasingly turning to alternative species. Atlantic saury (Scomberesox saurus saurus (Walbaum)) has been identified as a potential species for exploitation. Very little information is available on its biology or population dynamics, especially for Irish waters. This thesis aims to obtain sound scientific data, which will help to ensure that a future Atlantic saury fishery can be sustainably managed. The research has produced valuable data, some of which contradicts previous studies. Growth of Atlantic saury measured using otolith microstructure is found to be more than twice that previously calculated from annual structures on scales and otoliths. This results in a significant reduction of the expected life span from five to about two years. Investigation of maturity stage at age indicates that Atlantic saury will reproduce for the first time at age one and will survive for one or at most two reproduction seasons. It is concluded that a future Irish fishery will target mostly fish prior to their first reproduction. Finally, the thesis gives some insights into the population structure of Atlantic saury through analysis of otolith morphometrics. Significant differences are detected between Northeastern Atlantic and western Mediterranean Sea specimens of the 0+ age class (less than one year old). The implications of these results for the management of an emerging fishery are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Visualistics, computer science, picture syntax, picture semantics, picture pragmatics, interactive pictures