972 results for Automated sorting system


Relevance:

30.00%

Publisher:

Abstract:

Sampling was conducted from March 24 to August 5, 2010, in the fjord branch Kapisigdlit, located in the inner part of the Godthåbsfjord system, West Greenland. The vessel "Lille Masik" was used on all cruises except on June 17-18, when sampling was done from RV Dana (National Institute for Aquatic Resources, Denmark). A total of 15 cruises (of 1-2 days duration), 7-10 days apart, were carried out along a transect composed of 6 stations (St.) spanning the length of the 26 km long fjord branch. St. 1 was located at the mouth of the fjord branch and St. 6 at its end, in the middle of a shallower inner creek. St. 1-4 covered the deeper parts of the fjord, and St. 5 was located on the slope leading up to the shallow inner creek. Mesozooplankton was sampled by vertical net tows using a Hydrobios Multinet (type Mini) equipped with a flow meter and 50 µm mesh nets, or a WP-2 net (50 µm mesh size) equipped with a non-filtering cod-end. Sampling was conducted at various times of day at the different stations. The nets were hauled at a speed of 0.2-0.3 m s⁻¹ from 100, 75 and 50 m depth to the surface at St. 2 and 4, St. 5, and St. 6, respectively. The contents were immediately preserved in buffered formalin (4% final concentration). All samples were analyzed at the Plankton Sorting and Identification Center in Szczecin (www.nmfri.gdynia.pl). Samples containing high numbers of zooplankton were split into subsamples. All copepods and other zooplankton were identified to the lowest possible taxonomic level (approx. 400 individuals per sample), length-measured and counted. Copepods were sorted into developmental stages (nauplius stage 1 to copepodite stage 6) using morphological features and sizes, and up to 10 individuals of each stage were length-measured.
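For readers working with these data, the arithmetic for converting such net-tow counts into abundances is straightforward. The sketch below is illustrative only (it is not the processing code used for this dataset), and all function names and numbers in it are assumptions: counts from a split sample are scaled back by the subsample fraction and divided by the volume of water filtered, here derived from the net mouth area and haul depth when no flow-meter reading is available.

```python
import math

# Illustrative only -- not the dataset's own processing code.

def tow_volume_m3(mouth_diameter_m: float, haul_depth_m: float) -> float:
    """Volume filtered by a vertical tow: net mouth area times haul depth
    (used when no flow-meter reading is available)."""
    return math.pi * (mouth_diameter_m / 2) ** 2 * haul_depth_m

def abundance_ind_m3(count: int, split_fraction: float, volume_m3: float) -> float:
    """Scale the counted individuals back by the subsample fraction,
    then divide by the filtered volume."""
    return count / split_fraction / volume_m3

# Hypothetical example: 400 copepods counted in a 1/8 subsample from a
# 100 m haul with a 0.25 m diameter net mouth.
volume = tow_volume_m3(0.25, 100.0)          # ~4.9 m**3
print(abundance_ind_m3(400, 1 / 8, volume))  # ~652 individuals per m**3
```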


Relevance:

30.00%

Publisher:

Abstract:

Although conventional sediment parameters (mean grain size, sorting, and skewness) and provenance have typically been used to infer sediment transport pathways, most freshwater, brackish, and marine environments are also characterized by abundant sediment constituents of biological, and possibly anthropogenic and volcanic, origin that can provide additional insight into local sedimentary processes. The biota will be spatially distributed according to its response to environmental parameters such as water temperature, salinity, dissolved oxygen, organic carbon content, grain size, and intensity of currents and tidal flow, whereas the presence of anthropogenic and volcanic constituents will reflect proximity to source areas and whether they are fluvially or aerially transported. Because each of these constituents has a unique environmental signature, each is a more precise proxy for its source area than the conventional sedimentary process indicators. This San Francisco Bay Coastal System study demonstrates that by applying a multi-proxy approach, the primary sites of sediment transport can be identified. Many of these sites are far from where the constituents originated, showing that sediment transport is widespread in the region. Although not often used, identifying and interpreting the distribution of naturally occurring and allochthonous biologic, anthropogenic, and volcanic sediment constituents is a powerful tool to aid in the investigation of sediment transport pathways in other coastal systems.
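The "conventional sediment parameters" mentioned above are commonly computed with the graphic measures of Folk and Ward (1957). The study does not state which formulas it used, so the sketch below is a generic illustration with hypothetical percentile values, not the study's method: mean, sorting (inclusive graphic standard deviation) and skewness from phi-scale percentiles of the cumulative grain-size curve.

```python
# Folk and Ward (1957) graphic measures from phi-scale percentiles.
# Generic illustration; the study's actual computation is not specified.

def folk_ward(phi5, phi16, phi50, phi84, phi95):
    mean = (phi16 + phi50 + phi84) / 3.0
    sorting = (phi84 - phi16) / 4.0 + (phi95 - phi5) / 6.6
    skewness = ((phi16 + phi84 - 2 * phi50) / (2 * (phi84 - phi16))
                + (phi5 + phi95 - 2 * phi50) / (2 * (phi95 - phi5)))
    return mean, sorting, skewness

# Hypothetical percentiles read off a cumulative grain-size curve:
print(folk_ward(phi5=1.0, phi16=1.5, phi50=2.2, phi84=3.0, phi95=3.6))
```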

Relevance:

30.00%

Publisher:

Abstract:

The Tara Oceans Expedition (2009-2013) sampled the world oceans on board a 36 m long schooner, collecting environmental data and organisms from viruses to planktonic metazoans for later analyses using modern sequencing and state-of-the-art imaging technologies. Tara Oceans data are particularly suited to studying the genetic, morphological and functional diversity of plankton. The present data set provides continuous measurements of the partial pressure of carbon dioxide (pCO2), using a ProOceanus CO2-Pro instrument mounted on the flow-through system. This automatic sensor is fitted with an equilibrator made of a gas-permeable silicone membrane and an internal detection loop with the non-dispersive infrared detector of a PPSystems SBA-4 CO2 analyzer. A zero-CO2 baseline for the subsequent measurements is provided by circulating the internal gas through a CO2 absorption chamber containing soda lime or Ascarite. The frequency of this automatic zero-point calibration was set to 24 hours. All data recorded during zeroing processes were discarded, together with the data from the 15 minutes following each calibration. The output of the CO2-Pro is the mole fraction of CO2 in the measured water, and the pCO2 is obtained using the measured total pressure of the internal wet gas. The fugacity of CO2 (fCO2) in the surface seawater, whose difference from the atmospheric CO2 fugacity is proportional to the air-sea CO2 flux, is obtained by correcting the pCO2 for the non-ideal behaviour of CO2 gas according to Weiss (1974). The fCO2 computed from the CO2-Pro measurements was corrected to sea surface conditions by considering the temperature effect on fCO2 (Takahashi et al., 1993). The surface seawater observations, initially recorded at 15-second intervals, were averaged over 5-minute cycles. The performance of the CO2-Pro was adjusted by comparing the sensor outputs against thermodynamic carbonate calculations of pCO2 using the carbonic acid system constants of Millero et al. (2006) from determinations of total inorganic carbon (CT) and total alkalinity (AT) in discrete samples collected at the sea surface. AT was determined using automated open-cell potentiometric titration (Haraldsson et al., 1997). CT was determined with automated coulometric titration (Johnson et al., 1985, 1987), using the MIDSOMMA system (Mintrop, 2005). fCO2 data are flagged according to WOCE guidelines following Pierrot et al. (2009), identifying recommended values and questionable measurements and giving additional information about the reasons for the questionability.
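The two corrections named in this description, Weiss (1974) for non-ideality and Takahashi et al. (1993) for temperature, are standard and can be sketched compactly. The following is a minimal illustration of those published formulas, not the expedition's processing code; the input values are hypothetical.

```python
import math

R = 82.0578  # gas constant, cm3 atm mol-1 K-1

def fco2_from_pco2(pco2_uatm: float, temp_c: float, press_atm: float = 1.0) -> float:
    """Weiss (1974): fCO2 = pCO2 * exp((B + 2*delta) * P / (R*T)),
    with B and delta the virial coefficients of CO2 in cm3 mol-1."""
    t = temp_c + 273.15
    b = -1636.75 + 12.0408 * t - 3.27957e-2 * t**2 + 3.16528e-5 * t**3
    delta = 57.7 - 0.118 * t
    return pco2_uatm * math.exp((b + 2 * delta) * press_atm / (R * t))

def fco2_at_sst(fco2_equ_uatm: float, t_equ_c: float, sst_c: float) -> float:
    """Takahashi et al. (1993) temperature correction, using
    d(ln fCO2)/dT = 0.0423 per deg C, from sensor temperature to SST."""
    return fco2_equ_uatm * math.exp(0.0423 * (sst_c - t_equ_c))

# Hypothetical reading: pCO2 = 390 uatm at 15.5 C, in-situ SST 15.0 C.
print(fco2_at_sst(fco2_from_pco2(390.0, 15.5), 15.5, 15.0))  # ~380.5 uatm
```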

Relevance:

30.00%

Publisher:

Abstract:

Exploiting the full potential of telemedical systems means using platform-based solutions: data are recovered from biomedical sensors, hospital information systems, care-givers, as well as patients themselves, and are processed and redistributed in either a centralized or, more probably, a decentralized way. The integration of all these different devices and interfaces, as well as the automated analysis and representation of all the pieces of information, are current key challenges in telemedicine. Mobile phone technology has just begun to offer great opportunities for using this diverse information to guide, warn, and educate patients, thus increasing their autonomy and adherence to their prescriptions. However, most existing mobile solutions are not based on platform systems and therefore represent limited, isolated applications. This article depicts how telemedical systems based on integrated health data platforms can maximize prescription adherence in chronic patients through mobile feedback. The application described here has been developed in an EU-funded R&D project called METABO, dedicated to patients with type 1 or type 2 diabetes mellitus.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a heterogeneous collaborative sensor network for electrical management in the residential sector. Improving demand-side management is very important in distributed energy generation applications. Sensing and control are the foundations of the “Smart Grid”, which is the future of large-scale energy management. The system presented in this paper has been developed on a self-sufficient solar house called “MagicBox”, equipped with a grid connection, PV generation, lead-acid batteries, controllable appliances and smart metering. There is therefore a large number of energy variables to be monitored, which allows us to precisely manage the energy performance of the house by means of collaborative sensors. The experimental results, obtained on a real house, demonstrate the feasibility of the proposed collaborative system for reducing electrical power consumption and increasing energy efficiency.
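As a hedged illustration of what demand-side management on such a platform can look like, the toy rule below schedules a deferrable appliance against PV surplus and battery state. It is not MagicBox's actual control logic; all thresholds and names are invented.

```python
# Toy demand-side management rule -- not MagicBox's actual control logic.

def dsm_decision(pv_w: float, load_w: float, battery_soc: float) -> str:
    """Decide what to do with a PV surplus or deficit (thresholds invented)."""
    surplus = pv_w - load_w
    if surplus > 500 and battery_soc > 0.9:
        return "start deferrable appliance"   # surplus cannot be stored
    if surplus > 0:
        return "charge batteries"
    if battery_soc > 0.3:
        return "discharge batteries"          # cover the deficit locally
    return "draw from grid"

print(dsm_decision(pv_w=2500, load_w=1200, battery_soc=0.95))
# -> "start deferrable appliance"
```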

Relevance:

30.00%

Publisher:

Abstract:

An important competence of human data analysts is interpreting and explaining the meaning of the results of data analysis to end users. However, existing automatic solutions for intelligent data analysis provide limited help in interpreting and communicating information to non-expert users. In this paper we present a general approach to generating explanatory descriptions of the meaning of quantitative sensor data. We propose a type of web application: a virtual newspaper with automatically generated news stories that describe the meaning of sensor data. This solution integrates a variety of techniques from intelligent data analysis into a web-based multimedia presentation system. We validated our approach on a real-world problem and demonstrate its generality using data sets from several domains. Our experience shows that this solution can facilitate the use of sensor data by general users and, therefore, can increase the utility of sensor network infrastructures.
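A drastically simplified impression of the data-to-text idea is sketched below: a rule-based description of a quantitative sensor series. The paper's actual pipeline integrates several intelligent data analysis techniques and is far richer; this fragment and its names are purely illustrative.

```python
# Purely illustrative rule-based data-to-text fragment; the paper's
# pipeline is far more elaborate.

def describe_series(name: str, unit: str, values: list[float]) -> str:
    first, last, peak = values[0], values[-1], max(values)
    trend = "rose" if last > first else "fell" if last < first else "held steady"
    return (f"{name} {trend} from {first:.1f} to {last:.1f} {unit}, "
            f"peaking at {peak:.1f} {unit}.")

# Hypothetical hourly water-level readings:
print(describe_series("River level", "m", [1.2, 1.4, 2.1, 1.9]))
# -> "River level rose from 1.2 to 1.9 m, peaking at 2.1 m."
```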

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the multi-agent organization of a computer system designed to assist operators in decision making in the presence of emergencies. The application was developed for the case of emergencies caused by river floods. It operates in real time on data recorded by sensors (rainfall, water levels, flows, etc.) and applies multi-agent techniques to interpret the data, predict future behavior and recommend control actions. The system includes an advanced knowledge-based architecture with multiple symbolic representations and uncertainty models (Bayesian networks). The system has been applied and validated at two particular sites in Spain (the Jucar basin and the South basin).
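To make the role of the Bayesian networks concrete, here is a toy example of the kind of inference involved (rain raises the river level, a high level raises flood risk). All probabilities are invented, and the deployed system's models are, of course, much larger.

```python
# Toy Bayesian network: rain -> high river level -> flood.
# All probabilities are invented for illustration.

P_HIGH_GIVEN_RAIN = {True: 0.7, False: 0.1}     # P(high level | rain?)
P_FLOOD_GIVEN_HIGH = {True: 0.8, False: 0.05}   # P(flood | high level?)

def p_flood_given_rain(rain: bool) -> float:
    """Marginalize over the unobserved river level by enumeration."""
    p_high = P_HIGH_GIVEN_RAIN[rain]
    return (p_high * P_FLOOD_GIVEN_HIGH[True]
            + (1 - p_high) * P_FLOOD_GIVEN_HIGH[False])

print(p_flood_given_rain(True))   # 0.575
print(p_flood_given_rain(False))  # 0.125
```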

Relevance:

30.00%

Publisher:

Abstract:

One of the advantages of social networks is the possibility to socialize and personalize the content created or shared by the users. In mobile social networks, where the devices have limited capabilities in terms of screen size and computing power, multimedia recommender systems help to present the most relevant content to the users, depending on their tastes, relationships and profile. Previous recommender systems are not able to cope with the uncertainty of automated tagging and are knowledge-domain dependent. In addition, a recommender instantiated in this domain must cope with problems arising from the inherent nature of collaborative filtering (cold start, the banana problem, the large number of users needed, etc.). The solution presented in this paper addresses the above-mentioned problems with a hybrid image recommender system that combines collaborative filtering (social techniques) with content-based techniques, leaving the user free to assign a personal weight to each of these processes. It takes into account aesthetics and the formal characteristics of the images to overcome the problems of current techniques, improving the performance of existing systems to create a mobile social network recommender with a high degree of adaptation to any kind of user.
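The combination step described above, with the user weighting the two techniques, reduces to a convex mix of the two scores. The sketch below shows just that step; the collaborative and content-based scores themselves (which in the paper draw on social data and on aesthetic and formal image features) are stand-in inputs here.

```python
# The user-weighted hybrid combination; cf/cb scores are stand-in inputs.

def hybrid_score(cf_score: float, cb_score: float, user_weight: float) -> float:
    """user_weight in [0, 1]: 1.0 = pure collaborative filtering,
    0.0 = pure content-based scoring."""
    assert 0.0 <= user_weight <= 1.0
    return user_weight * cf_score + (1.0 - user_weight) * cb_score

def rank(images: dict[str, tuple[float, float]], w: float) -> list[str]:
    """Order images by their hybrid score, best first."""
    return sorted(images, key=lambda i: hybrid_score(*images[i], w), reverse=True)

# Hypothetical (collaborative, content-based) scores per image:
print(rank({"img1": (0.9, 0.2), "img2": (0.4, 0.8)}, w=0.3))  # ['img2', 'img1']
```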

Relevance:

30.00%

Publisher:

Abstract:

This paper describes an automatic dependent surveillance-broadcast (ADS-B) implementation for air-to-air and ground-based experimental surveillance within a prototype of a fully automated air traffic management (ATM) system, under a trajectory-based-operations paradigm. The system is built using an air-inclusive implementation of system-wide information management (SWIM). This work describes the relations between airborne and ground surveillance (SURGND), the prototype surveillance systems, and their algorithms. The system's performance is analyzed with simulated and real data. Results show that the proposed ADS-B implementation can fulfill the most demanding surveillance accuracy requirements.

Relevance:

30.00%

Publisher:

Abstract:

In just a few years, cloud computing has become a very popular paradigm and a business success story, with storage being one of its key features. To achieve high data availability, cloud storage services rely on replication. In this context, one major challenge is data consistency. In contrast to traditional approaches, which are mostly based on strong consistency, many cloud storage services opt for weaker consistency models in order to achieve better availability and performance. This comes at the cost of a high probability of stale data being read, as the replicas involved in a read may not always have the most recent write. In this paper, we propose a novel approach, named Harmony, which adaptively tunes the consistency level at run time according to application requirements. The key idea behind Harmony is an intelligent estimation model of stale reads, allowing it to elastically scale up or down the number of replicas involved in read operations to maintain a low (possibly zero) tolerable fraction of stale reads. As a result, Harmony can meet the desired consistency of the applications while achieving good performance. We have implemented Harmony and performed extensive evaluations with the Cassandra cloud storage system on the Grid'5000 testbed and on Amazon EC2. The results show that Harmony can achieve good performance without exceeding the tolerated number of stale reads. For instance, in contrast to the static eventual consistency used in Cassandra, Harmony reduces the stale data being read by almost 80% while adding only minimal latency. Meanwhile, it improves the throughput of the system by 45% while maintaining the desired consistency requirements of the applications when compared to the strong consistency model in Cassandra.
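The core loop of such adaptive tuning can be caricatured in a few lines. The sketch below is a crude stand-in for Harmony's estimation model, which is considerably more sophisticated: it estimates the chance that a read returns stale data from the write rate and replica propagation delay, and raises the number of replicas read until the estimate falls below the tolerated fraction.

```python
# Crude stand-in for Harmony's stale-read estimation model.

def stale_read_probability(write_rate_hz: float, prop_delay_s: float,
                           read_replicas: int) -> float:
    # Fraction of time a single replica is behind (toy model); a read is
    # stale only if every contacted replica is behind.
    p_replica_stale = min(1.0, write_rate_hz * prop_delay_s)
    return p_replica_stale ** read_replicas

def choose_read_level(write_rate_hz: float, prop_delay_s: float,
                      total_replicas: int, tolerance: float) -> int:
    """Smallest number of replicas per read that meets the tolerance."""
    for n in range(1, total_replicas + 1):
        if stale_read_probability(write_rate_hz, prop_delay_s, n) <= tolerance:
            return n
    return total_replicas

# Hypothetical workload: 10 writes/s, 10 ms propagation, 3 replicas,
# at most 5% stale reads tolerated.
print(choose_read_level(10, 0.01, 3, 0.05))  # -> 2
```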

Relevance:

30.00%

Publisher:

Abstract:

We propose the use of the "infotaxis" search strategy as the navigation system of a robotic platform able to search for and localize infectious foci by detecting the changes in the profile of volatile organic compounds emitted by an infected plant. We built a simple and cost-effective robot platform that substitutes light sensors for odour sensors, and we studied its robustness and performance under non-ideal conditions such as the existence of obstacles due to land topology or weeds.
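For reference, the infotaxis rule (Vergassola et al., 2007) chooses each move to maximize the expected reduction in the entropy of the posterior over the source location. The compact sketch below uses a made-up detection likelihood and grid; it illustrates the decision rule only, not the robot's implementation.

```python
import math

# Infotaxis decision rule on a toy grid; the detection model is made up.

N = 20
belief = {(x, y): 1.0 / N**2 for x in range(N) for y in range(N)}

def p_detect(pos, src):
    """Toy detection likelihood, decaying with distance to the source."""
    return 0.9 * math.exp(-math.dist(pos, src) / 3.0)

def entropy(b):
    return -sum(p * math.log(p) for p in b.values() if p > 0)

def update(b, pos, detected):
    """Bayesian update of the source-location posterior."""
    post = {s: p * (p_detect(pos, s) if detected else 1 - p_detect(pos, s))
            for s, p in b.items()}
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

def infotaxis_step(b, pos):
    """Move to the neighbour with the lowest expected posterior entropy."""
    best, best_h = pos, float("inf")
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nxt = (pos[0] + dx, pos[1] + dy)
        if not (0 <= nxt[0] < N and 0 <= nxt[1] < N):
            continue
        p_hit = sum(p * p_detect(nxt, s) for s, p in b.items())
        h = (p_hit * entropy(update(b, nxt, True))
             + (1 - p_hit) * entropy(update(b, nxt, False)))
        if h < best_h:
            best, best_h = nxt, h
    return best

print(infotaxis_step(belief, (10, 10)))
```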

Relevance:

30.00%

Publisher:

Abstract:

Intelligent systems designed to reduce highway fatalities have been widely applied in the automotive sector in the last decade. Of all users of transport systems, pedestrians are the most vulnerable in crashes, as they are unprotected. This paper deals with an autonomous intelligent emergency system designed to avoid collisions with pedestrians. The system consists of a fuzzy controller based on the time-to-collision estimate – obtained via a vision-based system – and the wheel-locking probability – obtained via the vehicle’s CAN bus – that generates a safe braking action. The system has been tested in a real car – a convertible Citroën C3 Pluriel – equipped with an automated electro-hydraulic braking system capable of working in parallel with the vehicle’s original braking circuit. The system is used as a last resort in the case that an unexpected pedestrian is in the lane and all warnings have failed to produce a response from the driver.
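A minimal flavour of such a controller is given below. The membership functions and rule base are invented for illustration; they are not the ones deployed in the vehicle, only an instance of the two-input fuzzy scheme described (time-to-collision in, wheel-locking probability in, braking command out).

```python
# Two-input fuzzy braking sketch; memberships and rules are invented.

def ramp_down(x, lo, hi):  # membership: 1 below lo, falling to 0 at hi
    return max(0.0, min(1.0, (hi - x) / (hi - lo)))

def ramp_up(x, lo, hi):    # membership: 0 below lo, rising to 1 at hi
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def braking_action(ttc_s: float, p_lock: float) -> float:
    """Normalized braking command in [0, 1]."""
    ttc_low = ramp_down(ttc_s, 1.0, 3.0)    # collision imminent
    lock_high = ramp_up(p_lock, 0.4, 0.8)   # wheels close to locking
    # Rules: brake hard when TTC is low; back off when locking is likely.
    hard = min(ttc_low, 1.0 - lock_high)
    soft = min(ttc_low, lock_high)
    return min(1.0, hard + 0.5 * soft)

print(braking_action(ttc_s=1.5, p_lock=0.2))  # -> 0.75 (hard braking)
```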

Relevance:

30.00%

Publisher:

Abstract:

Applications that operate on meshes are very popular in High Performance Computing (HPC) environments. In the past, many techniques have been developed in order to optimize the memory accesses for these datasets. Different loop transformations and domain decompositions are commonly used for structured meshes. However, unstructured grids are more challenging. The memory accesses, based on the mesh connectivity, do not map well to the usual linear memory model. This work presents a method to improve the memory performance which is suitable for HPC codes that operate on meshes. We develop a method to adjust the sequence in which the data are used inside the algorithm, by means of traversing and sorting the mesh. This sorted mesh can be transferred sequentially to the lower memory levels and allows for minimum data transfer requirements. The method also reduces the lower-level memory requirements dramatically: up to 63% of the L1 cache misses are removed in a traditional cache system. We have obtained speedups of up to 2.58 on memory operations as measured on a general-purpose CPU. An improvement is also observed with sequential access memories, where we have observed reductions of up to 99% in the required low-level memory size.
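One common way to realize such a traversal-based sort is a breadth-first renumbering of the mesh connectivity, in the spirit of Cuthill-McKee, so that neighbouring nodes end up adjacent in memory. The sketch below shows that generic idea; the paper's own traversal and sorting method may differ in its details.

```python
from collections import deque

# Generic BFS renumbering of mesh nodes for memory locality
# (Cuthill-McKee-like); the paper's method may differ in detail.

def bfs_order(adjacency: dict[int, list[int]], start: int) -> list[int]:
    order, seen, queue = [], {start}, deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nb in sorted(adjacency[node]):   # deterministic tie-breaking
            if nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return order

def permute(data: list[float], order: list[int]) -> list[float]:
    """Reorder a per-node data array to match the new numbering."""
    return [data[old] for old in order]

# Tiny unstructured mesh (node -> neighbours) and one data field:
adj = {0: [3], 1: [2, 3], 2: [1], 3: [0, 1]}
order = bfs_order(adj, 0)                    # [0, 3, 1, 2]
print(permute([10.0, 11.0, 12.0, 13.0], order))  # [10.0, 13.0, 11.0, 12.0]
```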

Relevance:

30.00%

Publisher:

Abstract:

Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate terabytes of image data, whose handling and analysis become a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples of applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines.
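As a flavour of one of the surveyed tasks, heartbeat detection can be reduced to finding the dominant periodicity of pixel intensity over a heart region of interest. The sketch below is an illustrative baseline with synthetic data, not any of the surveyed tools.

```python
import numpy as np

# Illustrative heartbeat-detection baseline: dominant periodicity of the
# mean pixel intensity over a heart region of interest (ROI).

def heart_rate_bpm(roi_frames: np.ndarray, fps: float) -> float:
    """roi_frames: array of shape (n_frames, height, width), cropped to the heart."""
    signal = roi_frames.reshape(len(roi_frames), -1).mean(axis=1)
    signal = signal - signal.mean()              # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum)] * 60.0     # strongest frequency, in bpm

# Synthetic check: a 2 Hz brightness oscillation filmed at 30 fps.
t = np.arange(300) / 30.0
frames = np.sin(2 * np.pi * 2.0 * t)[:, None, None] * np.ones((300, 4, 4))
print(heart_rate_bpm(frames, fps=30.0))          # -> 120.0
```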