981 results for Hardware-in-the-Loop
Abstract:
In the primary visual cortex, neurons with similar physiological features are clustered together in columns extending through all six cortical layers. These columns form modular orientation preference maps. Long-range lateral fibers are associated with the structure of orientation maps since they do not connect columns randomly; rather, they cluster at regular intervals and interconnect predominantly columns of neurons responding to similar stimulus features. Single orientation preference maps (the joint activation of domains preferring the same orientation) were observed to emerge spontaneously, and it has been speculated that this structured ongoing activation could be caused by the underlying patchy lateral connectivity. Since long-range lateral connections share many features (e.g., clustering and orientation selectivity) with visual inter-hemispheric connections (VIC) through the corpus callosum, we used the latter as a model for long-range lateral connectivity. In order to address the question of how lateral connectivity contributes to spontaneously generated maps of one hemisphere, we investigated how these maps react to the deactivation of VICs originating from the contralateral hemisphere. To this end, we performed experiments in eight adult cats. We recorded voltage-sensitive dye (VSD) imaging and electrophysiological spiking activity in one brain hemisphere while reversibly deactivating the other hemisphere with a cooling technique. In order to compare ongoing activity with evoked activity patterns, we first presented oriented gratings as visual stimuli. Gratings had 8 different orientations distributed equally between 0° and 180°. VSD-imaged frames obtained during ongoing activity conditions were then compared to the averaged evoked single orientation maps in three different states: baseline, cooling, and recovery. Kohonen self-organizing maps were also used as a means of analyzing ongoing activity without prior assumptions (such as the averaged single-condition maps). We also evaluated whether cooling had a differential effect on evoked and ongoing spiking activity of single units. We found that deactivating VICs caused no spatial disruption of the structure of either evoked or ongoing activity maps. The frequency with which a cardinally preferring (0° or 90°) map would emerge, however, decreased significantly for ongoing but not for evoked activity. The same result was found by training self-organizing maps with recorded data as input. Spiking activity of cardinally preferring units also decreased significantly for ongoing when compared to evoked activity. Based on our results we came to the following conclusions: 1) VICs are not a determinant factor of ongoing map structure. Maps continued to be spontaneously generated with the same quality, probably by a combination of ongoing activity from local recurrent connections, the thalamocortical loop, and feedback connections. 2) VICs account for a cardinal bias in the temporal sequence of ongoing activity patterns, i.e., deactivating VICs decreases the probability that cardinal maps emerge spontaneously. 3) Inter- and intrahemispheric long-range connections might serve as a grid preparing the primary visual cortex for likely junctions in a larger visual environment encompassing the two hemifields.
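As an illustration of the self-organizing-map analysis mentioned above, the following is a minimal sketch (not the authors' actual pipeline) of training a Kohonen map on ongoing-activity frames so that the learned prototypes can afterwards be compared with the evoked single-orientation maps; the array shapes, grid size, and learning parameters are illustrative assumptions.

# Minimal Kohonen self-organizing map sketch; frames is an
# (n_frames, n_pixels) array of ongoing-activity frames.
import numpy as np

def train_som(frames, n_units=8, n_iter=5000, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    n_frames, n_pixels = frames.shape
    # Prototype vectors arranged on a 1-D grid of SOM units.
    weights = rng.standard_normal((n_units, n_pixels)) * frames.std()
    grid = np.arange(n_units)
    for t in range(n_iter):
        x = frames[rng.integers(n_frames)]
        # Best-matching unit: the prototype closest to the input frame.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Learning rate and neighbourhood width decay over iterations.
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    # Each row is a prototype "map" learned without prior assumptions;
    # these can then be correlated with the evoked single-orientation maps.
    return weights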
Abstract:
Cumulon is a system aimed at simplifying the development and deployment of statistical analysis of big data in public clouds. Cumulon allows users to program in their familiar language of matrices and linear algebra, without worrying about how to map data and computation to specific hardware and cloud software platforms. Given user-specified requirements in terms of time, monetary cost, and risk tolerance, Cumulon automatically makes intelligent decisions on implementation alternatives and execution parameters, as well as hardware provisioning and configuration settings, such as what type of machines and how many of them to acquire. Cumulon also supports clouds with auction-based markets: it effectively utilizes computing resources whose availability varies according to market conditions, and suggests the best bidding strategies for them. Cumulon explores two alternative approaches toward supporting such markets, with different trade-offs between system and optimization complexity. An experimental study is conducted to show the efficiency of Cumulon's execution engine, as well as the optimizer's effectiveness in finding the optimal plan in the vast plan space.
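For concreteness, the snippet below sketches the kind of matrix-level program a Cumulon user would express, written with NumPy as a stand-in; this is not Cumulon's actual API, and the requirements dictionary merely illustrates the time/cost/risk inputs the optimizer is described as taking.

# The user thinks in matrices; the system decides how to partition the data,
# schedule the multiplies, and provision machines. NumPy is a stand-in here.
import numpy as np

def least_squares(A, b):
    # Normal equations for ordinary least squares, expressed purely in
    # linear algebra: solve (A^T A) theta = A^T b.
    AtA = A.T @ A
    Atb = A.T @ b
    return np.linalg.solve(AtA, Atb)

rng = np.random.default_rng(0)
A, b = rng.normal(size=(1000, 10)), rng.normal(size=1000)
theta = least_squares(A, b)

# Hypothetical user-specified requirements the optimizer would take as input.
requirements = {"deadline_hours": 2.0, "budget_usd": 50.0, "risk_tolerance": 0.1}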
Abstract:
Quantitative information on metazoan meiofaunal abundance and biomass was obtained from three continental shelf stations (at 40, 100 and 200 m depth) and four deep-sea stations (at 540, 700, 940 and 1540 m depth) in the Cretan Sea (South Aegean Sea, NE Mediterranean). Samples were collected on a seasonal basis (from August 1994 to September 1995) with the use of a multiple corer. Meiofaunal abundance and biomass on the continental shelf of the Cretan Sea were high, in contrast to the extremely low values reported for the bathyal sediments, which showed values comparable to those reported for abyssal and hadal environments. In order to explain the spatial and seasonal changes in metazoan meiofauna these data were compared with: (1) the concentrations of 'food indicators' (such as proteins, lipids, soluble carbohydrates and chloroplastic pigment equivalents, CPE); (2) the bacterial biomass; and (3) the flux of labile organic compounds to the sea floor at a fixed station (D7, 1540 m depth). Highly significant relationships between meiofaunal parameters and CPE, protein and lipid concentrations and bacterial biomass were found. Most of the indicators of food quality and quantity (such as CPE, proteins and carbohydrates) showed a clear seasonality with highest values in February and lowest in September. Such changes were more evident on the continental shelf than at the deeper stations. On the continental shelf, significant seasonal changes in meiofaunal density were related to changes in the input of labile organic carbon, whereas meiofaunal assemblages at the deep-sea stations showed time-lagged changes in response to the food input recorded in February 1995. At all deep-sea stations meiofaunal density increased with a time lag of 2 months. Indications of a time-lagged meiofaunal response to the food inputs were also provided by the increase in nauplii densities during May 1995 and the increase in individual biomass of nematodes, copepods and polychaetes between February and May 1995. The lack of strong seasonal changes in deep-sea meiofaunal density suggests that the supply of organic matter below 500 m is not strong enough to support a significant meiofaunal development. Below 700 m depth, >92% of the total biomass in the sediment was represented by bacteria. The ratio of bacterial to meiofaunal biomass increased with increasing water depth, indicating that bacteria are probably more effective than meiofauna in exploiting refractory organic compounds. These data lead us to hypothesise that the deep-sea sediments of the Cretan Sea are largely dependent upon a benthic microbial loop.
Abstract:
Composition and accumulation rates of organic carbon in Holocene sediments provided data to calculate an organic carbon budget for the Laptev Sea continental margin. Mean Holocene accumulation rates in the inner Laptev Sea vary between 0.14 and 2.7 g C/cm²/ky; maximum values occur close to the Lena River delta. Seawards, the mean accumulation rates decrease from 0.43 to 0.02 g C/cm²/ky. The organic matter is predominantly of terrigenous origin. About 0.9 × 10⁶ t/year of organic carbon are buried in the Laptev Sea, and 0.25 × 10⁶ t/year on the continental slope. Between about 8.5 and 9 ka, major changes in supply of terrigenous and marine organic carbon occur, related to changes in coastal erosion, Siberian river discharge, and/or Atlantic water inflow along the Eurasian continental margin.
Abstract:
In this paper we advocate the Loop-of-stencil-reduce pattern as a way to simplify the parallel programming of heterogeneous platforms (multicore+GPUs). Loop-of-stencil-reduce is general enough to subsume map, reduce, map-reduce, stencil, stencil-reduce, and, crucially, their usage in a loop. It transparently targets (by using OpenCL) combinations of CPU cores and GPUs, and it makes it possible to simplify the deployment of a single stencil computation kernel on different GPUs. The paper discusses the implementation of Loop-of-stencil-reduce within the FastFlow parallel framework, considering a simple iterative data-parallel application as a running example (Game of Life) and a highly effective parallel filter for visual data restoration to assess performance. Thanks to the high-level design of Loop-of-stencil-reduce, it was possible to run the filter seamlessly on a multicore machine, on multiple GPUs, and on both.
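The sketch below illustrates the Loop-of-stencil-reduce structure using the paper's Game of Life running example, written in NumPy rather than FastFlow/OpenCL; what the pattern names is the combination of a stencil (map) step and a reduction inside a termination-checked loop.

# Loop-of-stencil-reduce sketch: stencil step, then a reduction, inside a
# loop whose exit condition uses the reduced value.
import numpy as np

def life_step(grid):
    # Stencil: each cell counts its 8 neighbours (toroidal boundary).
    n = sum(np.roll(np.roll(grid, i, 0), j, 1)
            for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

def loop_of_stencil_reduce(grid, max_iters=1000):
    for _ in range(max_iters):
        new_grid = life_step(grid)               # stencil (map) phase
        changed = int(np.sum(new_grid != grid))  # reduce phase
        grid = new_grid
        if changed == 0:                         # loop condition uses the reduction
            break
    return grid

grid = (np.random.default_rng(0).random((64, 64)) < 0.2).astype(np.uint8)
final = loop_of_stencil_reduce(grid)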
Abstract:
The astonishing development of diverse hardware platforms is twofold: on one side, the challenge of exascale performance for big data processing and management; on the other side, mobile and embedded devices for data collection and human-machine interaction. This has driven a highly hierarchical evolution of programming models. GVirtuS is a general virtualization system developed in 2009 and first introduced in 2010, enabling a completely transparent layer between GPUs and VMs. This paper shows the latest achievements and developments of GVirtuS, which now supports CUDA 6.5, memory management, and scheduling. Thanks to the new and improved remoting capabilities, GVirtuS now enables GPU sharing among physical and virtual machines based on x86 and ARM CPUs, on local workstations, computing clusters, and distributed cloud appliances.
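The split frontend/backend remoting idea behind GPU virtualization layers such as GVirtuS can be sketched as follows; this is not GVirtuS code, and the call names and in-process transport are illustrative stand-ins for the real communicator between guest and host.

# Conceptual sketch only: a frontend in the VM serializes API calls, a backend
# near the physical GPU executes them, and the application is unaware the GPU
# is remote. The "transport" here is a direct method call standing in for the
# real communicator (sockets, shared memory, etc.).
import json

class Backend:
    """Runs where the physical GPU is; dispatches decoded calls."""
    def handle(self, message):
        call = json.loads(message)
        if call["api"] == "cudaMalloc":          # call names are illustrative
            return json.dumps({"status": 0, "ptr": 0xDEAD0000})
        return json.dumps({"status": 1, "error": "unsupported call"})

class Frontend:
    """Runs in the VM; looks like a local GPU library to the application."""
    def __init__(self, backend):
        self.backend = backend                   # stand-in for the remoting channel
    def cudaMalloc(self, nbytes):
        reply = self.backend.handle(json.dumps({"api": "cudaMalloc", "size": nbytes}))
        return json.loads(reply)

gpu = Frontend(Backend())
print(gpu.cudaMalloc(1 << 20))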
Abstract:
Adobe's Acrobat software, released in June 1993, is based around a new Portable Document Format (PDF) which offers the possibility of viewing and exchanging electronic documents, independent of the originating software, across a wide variety of supported hardware platforms (PC, Macintosh, Sun UNIX, etc.). The fact that Acrobat's imageable objects are rendered with full use of Level 2 PostScript means that the most demanding requirements can be met in terms of high-quality typography and device-independent colour. These qualities will be very desirable components in future multimedia and hypermedia systems. The current capabilities of Acrobat and PDF are described; in particular the presence of hypertext links, bookmarks, and yellow sticker annotations in release 1.0, together with article threads and multimedia plugins in version 2.0. This article also describes the CAJUN project (CD-ROM Acrobat Journals Using Networks), which has been investigating the automated placement of PDF hypertextual features from various front-end text processing systems. CAJUN has also been experimenting with the dissemination of PDF over e-mail, via the World Wide Web, and on CD-ROM.
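As a rough illustration of the kind of hypertextual feature placement CAJUN automated, the snippet below creates an outline bookmark and a URL link in a generated PDF using the reportlab library as a modern stand-in; it is not the CAJUN toolchain, and the file name and link target are arbitrary.

# Programmatic placement of PDF bookmarks and links (reportlab as a stand-in).
from reportlab.pdfgen import canvas

c = canvas.Canvas("demo.pdf")
c.drawString(72, 720, "Section 1")
c.bookmarkPage("sec1")                                 # named destination on this page
c.addOutlineEntry("Section 1", "sec1", level=0)        # bookmark in the outline pane
c.linkURL("https://www.example.org", (72, 640, 300, 660), relative=0)  # hypertext link
c.showPage()
c.save()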
Abstract:
Selected papers from the 3rd Edition of the International Conference on Wastes: Solutions, Treatments and Opportunities
Abstract:
Turbulent fluctuations in the vicinity of the water free surface along a flat, vertically oriented surface-piercing plate are studied experimentally using a laboratory-scale experiment. In this experiment, a meter-wide stainless steel belt travels horizontally in a loop around two rollers with vertically oriented axes, which are separated by 7.5 meters. This belt device is mounted inside a large water tank with the water level set just below the top edge of the belt. The belt, rollers, and supporting frame are contained within a sheet metal box to keep the device dry except for one 6-meter-long straight test section between rollers. The belt is launched from rest with an acceleration of up to 3 g in order to quickly reach steady-state velocity. This creates a temporally evolving boundary layer analogous to the spatially evolving boundary layer created along a flat-sided ship moving at the same velocity, with a length equivalent to the length of belt that has passed the measurement region since the belt motion began. Surface profile measurements in planes normal to the belt surface are conducted using cinematic laser-induced fluorescence (LIF), and quantitative surface profiles are extracted at each instant in time. Using these measurements, free surface fluctuations are examined and the propagation behavior of these free surface ripples is studied. It is found that free surface fluctuations are generated in a region close to the belt surface, where sub-surface velocity fluctuations influence the behavior of these free surface features. These rapidly changing surface features close to the belt appear to lead to the generation of freely propagating waves far from the belt, outside the influence of the boundary layer. Sub-surface particle image velocimetry (PIV) measurements are performed in order to study the modification of the boundary layer flow field due to the effects of the water free surface. Cinematic planar PIV measurements are performed in horizontal planes parallel to the free surface by imaging the flow from underneath the tank, providing streamwise and wall-normal velocity fields. Additional planar PIV experiments are performed in vertical planes parallel to the belt surface in order to study the behavior of streamwise and vertical velocity fields. It is found that the boundary layer grows rapidly near the free surface, leading to an overall thicker boundary layer close to the surface. This rapid boundary layer growth appears to be linked to a process of free surface bursting, the sudden onset of free surface fluctuations. Cinematic white-light movies are recorded from beneath the water surface in order to determine the onset location of air entrainment. In addition, qualitative observations of these processes are made in order to determine the mechanisms leading to air entrainment present in this flow.
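The temporal-to-spatial analogy described above can be stated compactly. Assuming a belt speed U_b, kinematic viscosity \nu, and elapsed time t since the belt was launched, the equivalent downstream distance along a flat-sided ship and the corresponding Reynolds number are

\[
  x = U_b\,t, \qquad \mathrm{Re}_x = \frac{U_b\,x}{\nu} = \frac{U_b^{2}\,t}{\nu}.
\]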
Abstract:
Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management using system dynamics simulation, as well as environmental systems analysis techniques, to provide insights for water quality management in the Great Lakes basin. The traditional linear thinking paradigm lacks the mental and organizational framework for sustainable development trajectories, and may lead to quick-fix solutions that fail to address key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and its application to water resources problems by offering useful qualitative tools such as causal loop diagrams (CLD), stock-and-flow diagrams (SFD), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulations and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and the root causes of problems, and thus promote sustainable water resources decision making. Within the system dynamics framework, a growth and underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and assess a number of proposed total maximum daily load (TMDL) reduction policies, including phosphorus load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process that considers the functionality of dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale implementation of best management practices (BMPs) in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's total phosphorus (TP) concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
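As a sketch of the stock-and-flow structure underlying such a model, the following is a generic, Vollenweider-style lake phosphorus mass balance in Python: the stock is in-lake TP, the inflows are the PS and NPS loads, and a single first-order loss term lumps flushing and settling. The parameter values and the 40% reduction scenario are illustrative assumptions, not the calibrated Lake Allegan model.

# Generic stock-and-flow sketch: dTP/dt = (PS + NPS load) - loss_rate * TP,
# integrated with a simple Euler step.
def simulate_tp(tp0_kg, ps_load, nps_load, loss_rate, years, dt=0.1):
    tp = tp0_kg
    history = []
    for _ in range(int(years / dt)):
        inflow = ps_load + nps_load          # kg/yr from point and non-point sources
        outflow = loss_rate * tp             # first-order flushing + settling, 1/yr
        tp += (inflow - outflow) * dt        # update the stock
        history.append(tp)
    return history

# Example policy comparison with hypothetical numbers: baseline versus a 40%
# non-point-source load reduction.
baseline = simulate_tp(tp0_kg=5000, ps_load=2000, nps_load=6000, loss_rate=1.2, years=20)
scenario = simulate_tp(tp0_kg=5000, ps_load=2000, nps_load=0.6 * 6000, loss_rate=1.2, years=20)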
Abstract:
Despite Springer’s (1964) revision of the sharpnose sharks (genus Rhizoprionodon), the taxonomic definition and ranges of Rhizoprionodon in the western Atlantic Ocean remain problematic. In particular, the distinction between Rhizoprionodon terraenovae and R. porosus, and the occurrence of R. terraenovae in South American waters, are unresolved issues involving common and ecologically important species in need of fishery management in Caribbean and southwest Atlantic waters. In recent years, molecular markers have been used as efficient tools for the detection of cryptic species and to address controversial taxonomic issues. In this study, 415 samples of the genus Rhizoprionodon captured in the western Atlantic Ocean from Florida to southern Brazil were examined for sequences of the COI gene and the D-loop and evaluated for nucleotide differences. The results on nucleotide composition, AMOVA tests, and relationship distances obtained using a Bayesian-likelihood method and a haplotype network corroborate Springer’s (1964) morphometric and meristic findings and provide strong evidence that supports the consideration of R. terraenovae and R. porosus as distinct species.
Abstract:
The dissertation starts by describing the phenomena related to the increasing importance recently acquired by satellite applications. The spread of such technology comes with implications, such as an increase in maintenance costs, which motivates the development of advanced techniques that give spacecraft greater autonomy in health monitoring. Machine learning techniques are widely employed to lay a foundation for effective systems specialized in fault detection by examining telemetry data. Telemetry consists of a considerable amount of information; therefore, the adopted algorithms must be able to handle multivariate data while facing the limitations imposed by on-board hardware features. In the framework of outlier detection, the dissertation addresses the topic of unsupervised machine learning methods. In the unsupervised scenario, a lack of prior knowledge of the data behavior is assumed. Specifically, two models are considered, namely Local Outlier Factor and One-Class Support Vector Machines. Their performance is compared in terms of both the achieved prediction accuracy and the corresponding computational cost. Both models are trained and tested on the same sets of time series data in a variety of settings, aimed at gaining insight into the effect of increasing dimensionality. The results obtained show that both models, combined with proper tuning of their characteristic parameters, can successfully serve as outlier detectors for multivariate time series data. Nevertheless, in this specific context, Local Outlier Factor outperforms One-Class SVM, in that it proves to be more stable over a wider range of input parameter values. This property is especially valuable in unsupervised learning since it suggests that the model is able to adapt to unforeseen patterns.
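A minimal sketch of the comparison described above, using scikit-learn on synthetic multivariate data; the injected anomalies, the parameter values, and the data itself are illustrative assumptions, not the thesis's telemetry or tuning.

# Compare Local Outlier Factor and One-Class SVM on synthetic "telemetry".
import numpy as np
from sklearn.neighbors import LocalOutlierFactor
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(size=(2000, 8))                     # nominal multivariate samples
X_test = np.vstack([rng.normal(size=(450, 8)),
                    rng.normal(loc=5.0, size=(50, 8))])  # last 50 rows are anomalies

lof = LocalOutlierFactor(n_neighbors=30, novelty=True).fit(X_train)
ocsvm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_train)

for name, model in [("LOF", lof), ("One-Class SVM", ocsvm)]:
    pred = model.predict(X_test)                 # +1 = inlier, -1 = outlier
    detected = int(np.sum(pred[-50:] == -1))
    false_alarms = int(np.sum(pred[:-50] == -1))
    print(f"{name}: {detected}/50 anomalies flagged, {false_alarms} false alarms")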
Abstract:
The pervasive availability of connected devices in any industrial and societal sector is pushing for an evolution of the well-established cloud computing model. The emerging paradigm of the cloud continuum embraces this decentralization trend and envisions virtualized computing resources physically located between traditional datacenters and data sources. By executing totally or partially closer to the network edge, applications can react more quickly to events, thus enabling advanced forms of automation and intelligence. However, these applications also induce new data-intensive workloads with low-latency constraints that require the adoption of specialized resources, such as high-performance communication options (e.g., RDMA, DPDK, or XDP). Unfortunately, cloud providers still struggle to integrate these options into their infrastructures. This risks undermining the principle of generality that underlies the economies of scale of cloud computing, by forcing developers to tailor their code to low-level APIs, non-standard programming models, and static execution environments. This thesis proposes a novel system architecture to empower cloud platforms across the whole cloud continuum with Network Acceleration as a Service (NAaaS). To provide commodity yet efficient access to acceleration, this architecture defines a layer of agnostic high-performance I/O APIs, exposed to applications and clearly separated from the heterogeneous protocols, interfaces, and hardware devices that implement it. A novel system component embodies this decoupling by offering a set of agnostic OS features to applications: memory management for zero-copy transfers, asynchronous I/O processing, and efficient packet scheduling. This thesis also explores the design space of the possible implementations of this architecture by proposing two reference middleware systems and by adopting them to support interactive use cases in the cloud continuum: a serverless platform and an Industry 4.0 scenario. A detailed discussion and a thorough performance evaluation demonstrate that the proposed architecture is suitable to enable the easy-to-use, flexible integration of modern network acceleration into next-generation cloud platforms.
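The decoupling advocated above can be sketched as a backend-agnostic I/O interface that applications code against while the platform selects the transport; the interface and class names below are illustrative, not the thesis's actual middleware API.

# Applications program against AcceleratedChannel; the platform decides whether
# the concrete backend is a plain TCP stream (shown here) or an accelerated
# transport (RDMA, DPDK, XDP) exposing the same interface.
import abc
import asyncio

class AcceleratedChannel(abc.ABC):
    """Backend-agnostic send/receive API exposed to applications."""

    @abc.abstractmethod
    async def send(self, payload: bytes) -> None: ...

    @abc.abstractmethod
    async def recv(self) -> bytes: ...

class TcpChannel(AcceleratedChannel):
    """Commodity fallback backend using asyncio streams."""

    def __init__(self, reader, writer):
        self._reader, self._writer = reader, writer

    async def send(self, payload: bytes) -> None:
        # Length-prefixed framing over an ordinary TCP stream.
        self._writer.write(len(payload).to_bytes(4, "big") + payload)
        await self._writer.drain()

    async def recv(self) -> bytes:
        size = int.from_bytes(await self._reader.readexactly(4), "big")
        return await self._reader.readexactly(size)

async def open_channel(host: str, port: int) -> AcceleratedChannel:
    # The platform, not the application, chooses which backend to return.
    reader, writer = await asyncio.open_connection(host, port)
    return TcpChannel(reader, writer)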
Abstract:
Characterized for the first time in erythrocytes, phosphatidylinositol phosphate kinases (PIP kinases) belong to a family of enzymes that generate various lipid messengers and participate in several cellular processes, including gene expression regulation. Recently, the PIPKIIα gene was found to be differentially expressed in reticulocytes from two siblings with hemoglobin H disease, suggesting a possible relationship between PIPKIIα and the production of globins. Here, we investigated PIPKIIα gene and protein expression and protein localization in hematopoietic-derived cells during their differentiation, and the effects of PIPKIIα silencing on K562 cells. PIPKIIα silencing resulted in an increase in α and γ globins and a decrease in the proliferation of K562 cells without affecting cell cycle progression and apoptosis. In conclusion, using a cell line model, we showed that PIPKIIα is widely expressed in hematopoietic-derived cells, is localized in their cytoplasm and nucleus, and is upregulated during erythroid differentiation. We also showed that PIPKIIα silencing can induce α and γ globin expression and decrease cell proliferation in K562 cells.
Abstract:
Mine drainage is an important environmental disturbance that affects the chemical and biological components in natural resources. However, little is known about the effects of neutral mine drainage on the soil bacteria community. Here, a high-throughput 16S rDNA pyrosequencing approach was used to evaluate differences in composition, structure, and diversity of bacteria communities in samples from a neutral drainage channel, and soil next to the channel, at the Sossego copper mine in Brazil. Advanced statistical analyses were used to explore the relationships between the biological and chemical data. The results showed that the neutral mine drainage caused changes in the composition and structure of the microbial community, but not in its diversity. The Deinococcus/Thermus phylum, especially the Meiothermus genus, was in large part responsible for the differences between the communities, and was positively associated with the presence of copper and other heavy metals in the environmental samples. Other important parameters that influenced the bacterial diversity and composition were the elements potassium, sodium, nickel, and zinc, as well as pH. The findings contribute to the understanding of bacterial diversity in soils impacted by neutral mine drainage, and demonstrate that heavy metals play an important role in shaping the microbial population in mine environments.