956 results for Event data recorders.


Relevance:

30.00%

Publisher:

Abstract:

This report compares the legal status of research data in the four KE partner countries. It also addresses where European copyright and database law creates obstacles to access to research data, and singles out pre-conditions for openly available data. Background of the study: intellectual property regulations regarding primary research data are a recurrent topic in the discussion on improving access to research data. Indeed, the final report of the High Level Expert Group on Scientific Data, ‘Riding the Wave’, considered clarity on this point very important for improving awareness among all parties involved. According to the recommendations of that report, legal issues should be “worked out so that they encourage, and not impede, global data sharing” (http://cordis.europa.eu/fp7/ict/e-infrastructure/docs/hlg-sdi-report.pdf). While open access to research data is a widely recognised goal, achieving it remains a challenge. As European national laws still diverge and sometimes remain unclear, it can be difficult for interested parties to fully comprehend how open access to research data can be legally obtained. Based on these discussions, the Knowledge Exchange working group on primary research data commissioned a comparative report on the legal status of research data in the four KE partner countries. The study was conducted by the Centre for Intellectual Property Law (CIER) at Utrecht University. The report aims to inform Knowledge Exchange and associated stakeholders about the state of the law concerning access to research data in the KE partner countries (Germany, Denmark, the Netherlands, and the United Kingdom) and to give an insight into how these laws work in practice, illustrated through several characteristic situations pertaining to open access to research data. The purpose of the report is to identify flaws and obstacles to access to research data and to single out pre-conditions for openly available data, in view of the current discussions concerning open access to research data, especially data originating from publicly funded research. The report is intended both as a description of the status quo of the legislation and as a practical instrument to prepare further activities in raising awareness of the potential benefits of improved access to research data and in developing means to support improved access for research purposes.

Relevance:

30.00%

Publisher:

Abstract:

The Alliance for Coastal Technologies (ACT) Workshop on Towed Vehicles: Undulating Platforms as Tools for Mapping Coastal Processes and Water Quality Assessment was convened February 5-7, 2007 at the Embassy Suites Hotel, Seaside, California, and sponsored by the ACT-Pacific Coast partnership at the Moss Landing Marine Laboratories (MLML). The TUV workshop was co-chaired by Richard Burt (Chelsea Technology Group) and Stewart Lamerdin (MLML Marine Operations). Invited participants were selected to provide a balanced representation of academic researchers, private sector product developers, and existing and potential data product users from the resource management community, to enable development of broad consensus opinions on the application of TUV platforms in coastal resource assessment and management. The workshop was organized to address recognized limitations of point-based monitoring programs, which, while providing valuable data, are incapable of describing the spatial heterogeneity and extent of features distributed in the bulk solution. This is particularly true as surveys approach the coastal zone, where tidal and estuarine influences result in spatially and temporally heterogeneous water masses and entrained biological components. Aerial or satellite-based remote sensing can provide an assessment of the areal extent of plumes and blooms, yet provides no information on the third dimension of these features. Towed vehicles offer a cost-effective solution to this problem by providing platforms that can sample in the horizontal, vertical, and temporal domains. Towed undulating vehicles (henceforth TUVs) represent useful platforms for event-response characterization. This workshop reviewed the current status of towed vehicle technology, focusing on limitations of depth, data telemetry, instrument power demands, and ship requirements, in an attempt to identify means to incorporate such technology more routinely into monitoring and event-response programs. Specifically, the participants were charged to address the following: (1) Summarize the state of the art in TUV technologies; (2) Identify how TUV platforms are used and how they can assist coastal managers in fulfilling their regulatory and management responsibilities; (3) Identify barriers and challenges to the application of TUV technologies in management and research activities; and (4) Recommend a series of community actions to overcome identified barriers and challenges. A series of plenary presentations was given to inform the subsequent breakout discussions. Dave Nelson (University of Rhode Island) provided extensive summaries and a real-world assessment of the operational features of a variety of TUV platforms available in the UNOLS scientific fleet. Dr. Burke Hales (Oregon State University) described the modification of a TUV to provide a novel sampling platform for high-resolution mapping of chemical distributions in near real time. Dr. Sonia Batten (Sir Alister Hardy Foundation for Ocean Sciences) provided an overview of the deployment of specialized towed vehicles equipped with rugged continuous plankton recorders on ships of opportunity to obtain long-term, basin-wide surveys of zooplankton community structure, enhancing our understanding of trends in secondary production in the upper ocean. [PDF contains 32 pages]

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents theories, analyses, and algorithms for detecting and estimating parameters of geospatial events with today's large, noisy sensor networks. A geospatial event is initiated by a significant change in the state of points in a region in a 3-D space over an interval of time. After the event is initiated, it may change the state of points over larger regions and longer periods of time. Networked sensing is a typical approach for geospatial event detection. In contrast to traditional sensor networks composed of a small number of high-quality (and expensive) sensors, trends in personal computing devices and consumer electronics have made it possible to build large, dense networks at a low cost. The changes in sensor capability, network composition, and system constraints call for new models and algorithms suited to the opportunities and challenges of the new generation of sensor networks. This thesis offers a single unifying model and a Bayesian framework for analyzing different types of geospatial events in such noisy sensor networks. It presents algorithms and theories for estimating the speed and accuracy of detecting geospatial events as a function of parameters from both the underlying geospatial system and the sensor network. Furthermore, the thesis addresses network scalability issues by presenting rigorous scalable algorithms for data aggregation for detection. These studies provide insights into the design of networked sensing systems for detecting geospatial events. In addition to providing an overarching framework, this thesis presents theories and experimental results for two very different geospatial problems: detecting earthquakes and hazardous radiation. The general framework is applied to these specific problems, and predictions based on the theories are validated against measurements of systems in the laboratory and in the field.
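The thesis's Bayesian framework is not reproduced in this listing. Purely as a minimal sketch of the kind of inference it describes, the snippet below fuses noisy binary sensor reports into a posterior event probability; the per-sensor detection rate, false-alarm rate, and prior are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def event_posterior(reports, p_detect=0.8, p_false=0.1, prior=0.01):
    """Posterior probability of an event given independent binary sensor reports.

    reports:  array of 0/1 sensor picks
    p_detect: per-sensor probability of reporting 1 when an event is present
    p_false:  per-sensor probability of reporting 1 when no event is present
    prior:    prior probability that an event is occurring
    (all three rates are assumed values for illustration)
    """
    reports = np.asarray(reports)
    # Likelihood of the observed reports under each hypothesis,
    # assuming conditionally independent sensors.
    like_event = np.prod(np.where(reports == 1, p_detect, 1 - p_detect))
    like_noise = np.prod(np.where(reports == 1, p_false, 1 - p_false))
    evidence = prior * like_event + (1 - prior) * like_noise
    return prior * like_event / evidence

# Example: 7 of 10 sensors report a pick -- the posterior rises sharply.
print(event_posterior([1, 1, 0, 1, 1, 1, 0, 1, 1, 0]))
```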

Relevance:

30.00%

Publisher:

Abstract:

Smartphones and other powerful sensor-equipped consumer devices make it possible to sense the physical world at an unprecedented scale. Nearly 2 million Android and iOS devices are activated every day, each carrying numerous sensors and a high-speed internet connection. Whereas traditional sensor networks have typically deployed a fixed number of devices to sense a particular phenomenon, community networks can grow as additional participants choose to install apps and join the network. In principle, this allows networks of thousands or millions of sensors to be created quickly and at low cost. However, making reliable inferences about the world using so many community sensors involves several challenges, including scalability, data quality, mobility, and user privacy.

This thesis focuses on how learning at both the sensor and network level can provide scalable techniques for data collection and event detection. First, this thesis considers the abstract problem of distributed algorithms for data collection, and proposes a distributed, online approach to selecting which set of sensors should be queried. In addition to providing theoretical guarantees for submodular objective functions, the approach is compatible with local rules or heuristics for detecting and transmitting potentially valuable observations. Next, the thesis presents a decentralized algorithm for spatial event detection, and describes its use in detecting strong earthquakes within the Caltech Community Seismic Network. Despite the fact that strong earthquakes are rare and complex events, and that community sensors can be very noisy, our decentralized anomaly detection approach obtains theoretical guarantees for event detection performance while simultaneously limiting the rate of false alarms.
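The distributed, online selection algorithm from the thesis is not reproduced here. As a rough illustration of the submodular-maximization idea behind this kind of sensor selection, the sketch below applies the standard greedy heuristic to a toy coverage objective; the sensor footprints and query budget are made-up example data.

```python
import numpy as np

def greedy_select(coverage, budget):
    """Greedily pick `budget` sensors to maximize the number of covered cells.

    coverage: dict mapping sensor id -> set of grid cells it observes
              (a stand-in for any monotone submodular objective)
    """
    chosen, covered = [], set()
    for _ in range(budget):
        # Marginal gain of each remaining sensor given what is already covered.
        gains = {s: len(cells - covered)
                 for s, cells in coverage.items() if s not in chosen}
        best = max(gains, key=gains.get)
        if gains[best] == 0:
            break
        chosen.append(best)
        covered |= coverage[best]
    return chosen, covered

# Toy example with hypothetical sensor footprints.
rng = np.random.default_rng(0)
coverage = {s: set(rng.integers(0, 50, size=8).tolist()) for s in range(20)}
picked, covered = greedy_select(coverage, budget=5)
print(picked, len(covered))
```

For monotone submodular objectives under a cardinality budget, this greedy rule is known to achieve at least a (1 - 1/e) fraction of the optimal value; the thesis's distributed, online setting involves its own, separate analysis.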

Relevance:

30.00%

Publisher:

Abstract:

To compare the effects of music from different cultural environments (Guqin: Chinese music; piano: Western music) on crossmodal selective attention, behavioral and event-related potential (ERP) data in a standard two-stimulus visual oddball task were recorded.

Relevance:

30.00%

Publisher:

Abstract:

After earthquakes, licensed inspectors use established codes to assess the impact of damage on structural elements, a process that typically takes days to weeks. However, emergency responders (e.g. firefighters) must act within hours of a disaster event to enter damaged structures and save lives, and therefore cannot wait until an official assessment is complete. This is a risk that firefighters have to take. Although Search and Rescue organizations offer training seminars to familiarize firefighters with structural damage assessment, the effectiveness of such training is hard to guarantee when firefighters perform life rescue and damage assessment operations simultaneously; moreover, the training is not available to every firefighter. The authors therefore propose a novel framework that can provide firefighters with a quick but crude assessment of damaged buildings by evaluating the visible damage on critical structural elements (concrete columns in this study). This paper presents the first step of the framework: automating the detection of concrete columns from visual data. To achieve this, the typical shape of columns (long vertical lines) is recognized using edge detection and the Hough transform. The bounding rectangle for each pair of long vertical lines is then formed. When the resulting rectangle resembles a column and the material within the region between the two lines is recognized as concrete, the region is marked as a concrete column surface. Real video/image data are used to test the method. The preliminary results indicate that concrete columns can be detected when they are not distant and have at least one surface visible.
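The authors' implementation is not available in this listing; the OpenCV sketch below only illustrates the pipeline as described (Canny edges, probabilistic Hough transform, pairing of long near-vertical lines into column-like rectangles). All thresholds and the input path are hypothetical, and the concrete-material classification step is left as a stub.

```python
import cv2
import numpy as np

def column_candidates(image_path, min_len=150, max_tilt_deg=10, max_gap_px=120):
    """Return tall, narrow bounding rectangles formed by pairs of long,
    near-vertical edge lines (candidate concrete column surfaces)."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=min_len, maxLineGap=10)
    verticals = []
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            tilt = np.degrees(np.arctan2(abs(x2 - x1), abs(y2 - y1)))  # 0 = vertical
            if tilt < max_tilt_deg:
                verticals.append((x1, y1, x2, y2))
    candidates = []
    for i, a in enumerate(verticals):
        for b in verticals[i + 1:]:
            xs = [a[0], a[2], b[0], b[2]]
            ys = [a[1], a[3], b[1], b[3]]
            w, h = max(xs) - min(xs), max(ys) - min(ys)
            if 0 < w < max_gap_px and h > 2 * w:   # column-like aspect ratio
                # A texture/colour test for concrete would be applied here.
                candidates.append((min(xs), min(ys), w, h))
    return candidates

# Usage on a hypothetical frame: print(column_candidates("frame.jpg"))
```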

Relevance:

30.00%

Publisher:

Abstract:

Networked control systems (NCSs) have attracted much attention in the past decade due to their many advantages and growing number of applications. Unlike classic control systems, resources in NCSs, such as network bandwidth and communication energy, are often limited, which degrades closed-loop performance and may even destabilize the system. Seeking a desired trade-off between closed-loop performance and the limited resources is thus an active area of research. In this paper, we analyze the trade-off between the sensor-to-controller communication rate and the closed-loop system performance indexed by the conventional LQG control cost. We present and compare several sensor data schedules, and demonstrate that two event-based sensor data schedules provide a better trade-off than an optimal offline schedule. Simulation examples are provided to illustrate the theories developed in the paper. © 2012 AACC (American Automatic Control Council).
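The specific schedules compared in the paper are not reproduced here. As a rough, self-contained illustration of the rate-versus-performance trade-off it studies, the sketch below simulates a simple send-on-delta (event-based) schedule for a scalar linear system; the system parameters and thresholds are arbitrary choices, and the mean squared estimation error stands in for the LQG cost.

```python
import numpy as np

def simulate(threshold, a=0.95, q=1.0, steps=5000, seed=0):
    """Event-based schedule for x_{k+1} = a*x_k + w_k: the sensor transmits x_k
    only when it differs from the estimator's prediction by more than `threshold`.
    Returns (communication rate, mean squared estimation error)."""
    rng = np.random.default_rng(seed)
    x, xhat = 0.0, 0.0
    sent, err2 = 0, 0.0
    for _ in range(steps):
        x = a * x + rng.normal(0, np.sqrt(q))    # plant state
        xhat = a * xhat                          # estimator prediction
        if abs(x - xhat) > threshold:            # event: transmit the state
            xhat = x
            sent += 1
        err2 += (x - xhat) ** 2
    return sent / steps, err2 / steps

# Sweeping the threshold traces out a communication-rate vs. error trade-off curve.
for th in (0.5, 1.0, 2.0, 4.0):
    rate, mse = simulate(th)
    print(f"threshold={th:>4}: rate={rate:.2f}, mse={mse:.2f}")
```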

Relevance:

30.00%

Publisher:

Abstract:

Multiple type I interferons (IFNs) have recently been identified in salmonids, containing two or four conserved cysteines. In this work, a novel two-cysteine containing (2C) IFN gene was identified in rainbow trout. This novel trout IFN gene (termed IFN5) formed a phylogenetic group that is distinct from the other three salmonid IFN groups sequenced to date and had a close evolutionary relationship with IFNs from advanced fish species. Our data demonstrate that two subgroups are apparent within each of the 2C and 4C type I IFNs, an evolutionary outcome possibly due to two rounds of genome duplication events that have occurred within teleosts. We have examined gene expression of the trout 2C type I IFN in cultured cells following stimulation with lipopolysaccharide, phytohaemagglutinin, polyI:C or recombinant IFN, or after transfection with polyI:C. The kinetics of gene expression was also studied after viral infection. Analysis of the regulatory elements in the IFN promoter region predicted several binding sites for key transcription factors that potentially play an important role in mediating IFN5 gene expression.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the vulnerabilities of static random access memories (SRAMs) to single event effects (SEEs), as simulated by heavy ions on the ground and observed on the SJ-5 research satellite in space. A single event upset (SEU) prediction code has been used to estimate the proton-induced upset rates based on the ground test curve of SEU cross-section versus heavy ion linear energy transfer (LET). The result agrees with the flight data.

Relevance:

30.00%

Publisher:

Abstract:

Caffeine, which specifically inhibits ATM/ATR kinases, efficiently abrogates the ionizing radiation (IR)-induced G2 arrest and increases the sensitivity of various tumor cells to IR. The mechanisms underlying this effect of caffeine remain to be elucidated. As a target of ATM/ATR kinases, BRCA1 becomes activated and phosphorylated in response to IR. In this work, we therefore investigated the possible role of BRCA1 in the effect of caffeine on the G2 checkpoint and examined how BRCA1 phosphorylation is regulated in this process. For these purposes, the BRCA1 protein level and phosphorylation states were analyzed by Western blotting, using an antibody against BRCA1 and phospho-specific antibodies against the Ser-1423 and Ser-1524 residues, in cells exposed to a combination of IR and caffeine. The results showed that caffeine down-regulated IR-induced BRCA1 expression and specifically abolished BRCA1 phosphorylation at Ser-1524, which was followed by an override of G2 arrest by caffeine. In addition, the ability of BRCA1 to transactivate p21 may be required for the MCF-7, but not the HeLa, response to caffeine. These data suggest that BRCA1 may be a potential target of caffeine, and that BRCA1 and its phosphorylation are most likely involved in the caffeine-inhibitable event upstream of G2 arrest.

Relevance:

30.00%

Publisher:

Abstract:

The cold-water event along the southeast coast of the United States in the summer of 2003 is studied using satellite data combined with in situ observations. The analysis suggests that the cooling is produced by wind-driven coastal upwelling, which breaks the thermocline barrier in the summer of 2003. The strong and persistent southwesterly winds in the summer of 2003 play an important role in lifting the bottom isotherms up to the surface and away from the coast, generating persistent surface cooling in July-August 2003. Once the thermocline barrier is broken, the stratification in the nearshore region is weakened substantially, allowing further coastal cooling of large magnitude by episodic southerly wind bursts or the passage of coastally trapped waves at periods of a few days. Without the low-frequency cooling produced by the persistent southwesterly winds, these short-period winds or waves would have no effect on the surface temperature because of the strong thermocline barrier in summer.

Relevance:

30.00%

Publisher:

Abstract:

Wireless sensor networks are characterized by limited energy resources. To conserve energy, application-specific aggregation (fusion) of data reports from multiple sensors can be beneficial in reducing the amount of data flowing over the network. Furthermore, controlling the topology by scheduling the activity of nodes between active and sleep modes has often been used to uniformly distribute the energy consumption among all nodes by de-synchronizing their activities. We present an integrated analytical model to study the joint performance of in-network aggregation and topology control. We define performance metrics that capture the tradeoffs among delay, energy, and fidelity of the aggregation. Our results indicate that to achieve high fidelity levels under medium to high event reporting load, shorter and fatter aggregation/routing trees (toward the sink) offer the best delay-energy tradeoff as long as topology control is well coordinated with routing.
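The paper's analytical model is not reproduced here. The toy sketch below only illustrates why the shape of the aggregation tree matters: with full in-network aggregation each node transmits once per round while the reporting delay grows with tree height, whereas without aggregation the transmission count grows with node depth. Tree sizes and fanouts are arbitrary example values.

```python
import random

def build_tree(n_nodes, max_children):
    """Random aggregation tree rooted at the sink (node 0); returns a parent list."""
    parent = [None] * n_nodes
    slots = [0]                        # nodes that can still accept children
    counts = {0: 0}
    for v in range(1, n_nodes):
        p = random.choice(slots)
        parent[v] = p
        counts[p] += 1
        counts[v] = 0
        if counts[p] >= max_children:
            slots.remove(p)
        slots.append(v)
    return parent

def metrics(parent):
    depth = [0] * len(parent)
    for v in range(1, len(parent)):
        depth[v] = depth[parent[v]] + 1
    tx_no_agg = sum(depth)             # each report forwarded hop-by-hop to the sink
    tx_full_agg = len(parent) - 1      # each node sends one aggregated packet per round
    delay = max(depth)                 # parents wait for children, so delay ~ tree height
    return tx_no_agg, tx_full_agg, delay

random.seed(1)
for fanout in (2, 8):                  # taller/thinner vs. shorter/fatter trees
    p = build_tree(200, fanout)
    print(f"fanout {fanout}: (tx_no_agg, tx_full_agg, delay) = {metrics(p)}")
```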

Relevance:

30.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant factor affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating such techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system, and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, it is complex and no comprehensive solution exists; secondly, it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
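The dissertation's platform is far richer than anything shown here; purely to illustrate the idea of pairing every derived result with a maintainable provenance record, the sketch below wraps an analysis step so that its inputs, parameters and outputs are hashed and timestamped. The step name, helper function and field names are invented for the example.

```python
import hashlib, json, time

def run_step(name, func, inputs, **params):
    """Apply an analysis step and return its result together with a provenance
    record describing what was run, on what, and with which parameters."""
    result = func(inputs, **params)
    record = {
        "step": name,
        "params": params,
        "input_digest": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
        "output_digest": hashlib.sha256(
            json.dumps(result, sort_keys=True).encode()).hexdigest(),
        "timestamp": time.time(),
    }
    return result, record

# Hypothetical analysis: flag samples above a threshold in a raw data stream.
def threshold_filter(data, cutoff=0.0):
    return [x for x in data if x > cutoff]

values, prov = run_step("threshold_filter", threshold_filter, [0.2, -1.0, 3.5], cutoff=1.0)
print(values)
print(json.dumps(prov, indent=2))
```

Because each record carries digests of its inputs and outputs, a third party can re-run the same step on the same raw data and check that the derived result matches, which is the verification property the abstract emphasises.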

Relevance:

30.00%

Publisher:

Abstract:

To investigate the neural systems that contribute to the formation of complex, self-relevant emotional memories, dedicated fans of rival college basketball teams watched a competitive game while undergoing functional magnetic resonance imaging (fMRI). During a subsequent recognition memory task, participants were shown video clips depicting plays of the game, stemming either from previously viewed game segments (targets) or from non-viewed portions of the same game (foils). After an old-new judgment, participants provided emotional valence and intensity ratings of the clips. A data-driven approach was first used to decompose the fMRI signal acquired during free viewing of the game into spatially independent components. Correlations were then calculated between the identified components and post-scanning emotion ratings for successfully encoded targets. Two components were correlated with intensity ratings, including temporal lobe regions implicated in memory and emotional functions, such as the hippocampus and amygdala, as well as a midline fronto-cingulo-parietal network implicated in social cognition and self-relevant processing. These data were supported by a general linear model analysis, which revealed additional valence effects in fronto-striatal-insular regions when plays were divided into positive and negative events according to the fan's perspective. Overall, these findings contribute to our understanding of how emotional factors impact distributed neural systems to successfully encode dynamic, personally relevant event sequences.
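The study's actual fMRI pipeline is not reproduced in this listing. As a minimal sketch of the data-driven step described (decomposition into spatially independent components followed by correlation with post-scan ratings), the snippet below runs FastICA on synthetic trial-by-voxel data; the data dimensions, component count and ratings are placeholders, not values from the study.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for trial-wise activity: 60 encoded trials x 500 voxels.
rng = np.random.default_rng(0)
trials = rng.normal(size=(60, 500))
ratings = rng.uniform(1, 9, size=60)        # placeholder post-scan intensity ratings

# Decompose activity into spatially independent components; fit_transform
# returns one score per trial and component.
ica = FastICA(n_components=10, random_state=0, max_iter=1000)
scores = ica.fit_transform(trials)          # shape (60, 10)

# Correlate each component's trial-wise score with the emotion ratings.
for k in range(scores.shape[1]):
    r = np.corrcoef(scores[:, k], ratings)[0, 1]
    print(f"component {k}: r = {r:+.2f}")
```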

Relevance:

30.00%

Publisher:

Abstract:

The importance of patterns in constructing complex systems has long been recognised in other disciplines. In software engineering, for example, well-crafted object-oriented architectures contain several design patterns. Focusing on mechanisms of constructing software during system development can yield an architecture that is simpler, clearer and more understandable than if design patterns were ignored or not properly applied. In this paper, we propose a model that uses object-oriented design patterns to develop a core bitemporal conceptual model. We define three core design patterns that together form a core bitemporal conceptual model of a typical bitemporal object. Our framework is known as the Bitemporal Object, State and Event Modelling Approach (BOSEMA), and the resulting core model is known as a Bitemporal Object, State and Event (BOSE) model. Using this approach, we demonstrate that data modelling can be enriched by using well-known design patterns, which help designers build complex models of bitemporal databases.
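The BOSEMA design patterns themselves are not reproduced in this listing. Purely as an illustration of the data a core bitemporal object must carry, the sketch below models one fact with both a valid-time interval (when it holds in the modelled world) and a transaction-time interval (when the database asserted it), and shows an "as of" query over a corrected record; the field names and example dates are invented.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BitemporalRecord:
    """One version of a fact with both time dimensions."""
    value: str
    valid_from: date        # when the fact holds in the modelled world
    valid_to: date
    recorded_from: date     # when the database started asserting this version
    recorded_to: date       # when it was superseded (date.max = still current)

# An address correction: the old version stays queryable "as recorded",
# while the new version revises what was true in the modelled world.
history = [
    BitemporalRecord("12 Oak Rd", date(2020, 1, 1), date(2023, 1, 1),
                     date(2020, 1, 5), date(2021, 6, 1)),
    BitemporalRecord("12 Oak Road", date(2020, 1, 1), date(2023, 1, 1),
                     date(2021, 6, 1), date.max),
]

def as_of(records, valid_on, known_on):
    """What did the database believe on `known_on` about `valid_on`?"""
    return [r for r in records
            if r.valid_from <= valid_on < r.valid_to
            and r.recorded_from <= known_on < r.recorded_to]

print(as_of(history, date(2020, 6, 1), date(2021, 1, 1)))  # old spelling
print(as_of(history, date(2020, 6, 1), date(2022, 1, 1)))  # corrected spelling
```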