Abstract:
The majority of the traffic (bytes) flowing over the Internet today has been attributed to the Transmission Control Protocol (TCP). This strong presence of TCP has recently spurred further investigations into its congestion avoidance mechanism and its effect on the performance of short and long data transfers. At the same time, the rising interest in enhancing Internet services while keeping the implementation cost low has led to several service-differentiation proposals. In such service-differentiation architectures, much of the complexity is placed only in access routers, which classify and mark packets from different flows. Core routers can then allocate enough resources to each class of packets so as to satisfy delivery requirements, such as predictable (consistent) and fair service. In this paper, we investigate the interaction among short and long TCP flows, and how TCP service can be improved by employing a low-cost service-differentiation scheme. Through control-theoretic arguments and extensive simulations, we show the utility of isolating TCP flows into two classes based on their lifetime/size, namely one class of short flows and another of long flows. With such class-based isolation, short and long TCP flows have separate service queues at routers. This protects each class of flows from the other as they possess different characteristics, such as burstiness of arrivals/departures and congestion/sending window dynamics. We show the benefits of isolation, in terms of better predictability and fairness, over traditional shared queueing systems with both tail-drop and Random-Early-Drop (RED) packet dropping policies.
The proposed class-based isolation of TCP flows has several advantages: (1) the implementation cost is low since it only requires core routers to maintain per-class (rather than per-flow) state; (2) it promises to be an effective traffic engineering tool for improved predictability and fairness for both short and long TCP flows; and (3) stringent delay requirements of short interactive transfers can be met by increasing the amount of resources allocated to the class of short flows.
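The classification step described above, in which access routers mark flows as short or long so that core routers need only per-class state, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the 100 KB cutoff and the class labels are assumptions chosen for the example.

```python
# Illustrative sketch of class-based isolation of TCP flows: an access
# router marks each flow "short" or "long", and a core router keeps one
# FIFO queue per class (per-class, not per-flow, state). The 100 KB
# cutoff is an assumed value for illustration only.

SHORT_FLOW_THRESHOLD = 100 * 1024  # bytes observed so far (assumed cutoff)

def classify_flow(bytes_sent):
    """Mark a flow by its size observed so far."""
    return "short" if bytes_sent <= SHORT_FLOW_THRESHOLD else "long"

class TwoClassQueue:
    """Per-class service queues a core router might maintain, isolating
    bursty short flows from long flows with large congestion windows."""
    def __init__(self):
        self.queues = {"short": [], "long": []}

    def enqueue(self, packet, flow_bytes):
        self.queues[classify_flow(flow_bytes)].append(packet)
```

Because the core router keeps only two queues regardless of the number of active flows, the state cost stays constant as traffic grows, which is the low-cost property the abstract emphasizes.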
Abstract:
A human-computer interface (HCI) system designed for use by people with severe disabilities is presented. People who are severely paralyzed or afflicted with diseases such as ALS (Lou Gehrig's disease) or multiple sclerosis are unable to move or control any parts of their bodies except for their eyes. The system presented here detects the user's eye blinks and analyzes the pattern and duration of the blinks, using them to provide input to the computer in the form of a mouse click. After the automatic initialization of the system occurs from the processing of the user's involuntary eye blinks in the first few seconds of use, the eye is tracked in real time using correlation with an online template. If the user's depth changes significantly or rapid head movement occurs, the system is automatically reinitialized. There are no lighting requirements or offline templates needed for the proper functioning of the system. The system works with inexpensive USB cameras and runs at a frame rate of 30 frames per second. Extensive experiments were conducted to determine both the system's accuracy in classifying voluntary and involuntary blinks and the system's fitness under varying environmental conditions, such as alternative camera placements and different lighting conditions. These experiments on eight test subjects yielded an overall detection accuracy of 95.3%.
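The final decision step, distinguishing voluntary "click" blinks from involuntary ones by duration, can be sketched as below. The 30 fps frame rate comes from the abstract; the duration cutoff is an assumption for illustration, not the paper's calibrated value.

```python
# Toy sketch of blink classification by duration, as the abstract
# describes: a blink held long enough counts as a voluntary mouse click.
# FPS matches the paper; the frame cutoff is an assumed value.

FPS = 30
VOLUNTARY_MIN_FRAMES = 10  # assumed: eye closed >= 1/3 s counts as a click

def classify_blink(eye_closed_frames):
    """Return 'voluntary' (interpreted as a mouse click) or
    'involuntary', given consecutive frames with the eye detected closed."""
    return "voluntary" if eye_closed_frames >= VOLUNTARY_MIN_FRAMES else "involuntary"

def blink_duration_seconds(eye_closed_frames):
    """Convert a frame count to seconds at the camera's frame rate."""
    return eye_closed_frames / FPS
```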
Abstract:
It is useful in systems that must support multiple applications with various temporal requirements to allow application-specific policies to manage resources accordingly. However, there is a tension between this goal and the desire to control and police possibly malicious programs. The Java-based Sensor Execution Environment (SXE) in snBench presents a situation where such considerations add value to the system. Multiple applications can be run by multiple users with varied temporal requirements, some real-time and others best-effort. This paper outlines and documents an implementation of a hierarchical and configurable scheduling system with which different applications can be executed using application-specific scheduling policies. Concurrently, the system administrator can define fairness policies between applications, which are imposed upon the system. Additionally, to ensure forward progress of system execution in the face of malicious or malformed user programs, an infrastructure for multithreaded execution is described.
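The two-level structure described above, a system-wide fairness policy choosing among applications and a per-application policy choosing the next task, can be sketched as follows. The round-robin fairness rule and the two example policies are illustrative assumptions, not the SXE implementation.

```python
# Minimal sketch of hierarchical scheduling: a top-level fairness rule
# (round-robin here, an assumption) picks the next application, and each
# application's own policy picks which of its tasks runs.

from collections import deque

class FifoPolicy:
    """Best-effort application policy: first in, first out."""
    def __init__(self):
        self.tasks = deque()
    def add(self, task):
        self.tasks.append(task)
    def pick(self):
        return self.tasks.popleft() if self.tasks else None

class PriorityPolicy:
    """Real-time-style policy: lowest priority number runs first."""
    def __init__(self):
        self.tasks = []  # (priority, task) pairs
    def add(self, task, priority=0):
        self.tasks.append((priority, task))
        self.tasks.sort(key=lambda t: t[0])
    def pick(self):
        return self.tasks.pop(0)[1] if self.tasks else None

class HierarchicalScheduler:
    """Round-robin fairness between applications; each application
    supplies its own policy for scheduling its internal tasks."""
    def __init__(self):
        self.apps = []      # (name, policy), visited round-robin
        self.next_idx = 0
    def register(self, name, policy):
        self.apps.append((name, policy))
    def pick(self):
        # Visit each application at most once per call.
        for _ in range(len(self.apps)):
            name, policy = self.apps[self.next_idx]
            self.next_idx = (self.next_idx + 1) % len(self.apps)
            task = policy.pick()
            if task is not None:
                return name, task
        return None
```

Because policies are plugged in per application, a misbehaving program can only exhaust its own queue; the top-level rule still hands the next turn to the other applications.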
Abstract:
Lehar's lively discussion builds on a critique of neural models of vision that is incorrect in its general and specific claims. He espouses a Gestalt perceptual approach, rather than one consistent with the "objective neurophysiological state of the visual system" (p. 1). Contemporary vision models realize his perceptual goals and also quantitatively explain neurophysiological and anatomical data.
Abstract:
Lewis proposes "reconceptualization" (p. 1) of how to link the psychology and neurobiology of emotion and cognitive-emotional interactions. His main proposed themes have actually been actively and quantitatively developed in the neural modeling literature for over thirty years. This commentary summarizes some of these themes and points to areas of particularly active research in this area.
Abstract:
Before choosing, it helps to know both the expected value signaled by a predictive cue and the associated uncertainty that the reward will be forthcoming. Recently, Fiorillo et al. (2003) found that the dopamine (DA) neurons of the SNc exhibit sustained responses related to the uncertainty that a cue will be followed by reward, in addition to phasic responses related to reward prediction errors (RPEs). This suggests that cue-dependent anticipations of the timing, magnitude, and uncertainty of rewards are learned and reflected in components of the DA signals broadcast by SNc neurons. What is the minimal local circuit model that can explain such multifaceted reward-related learning? A new computational model shows how learned uncertainty responses emerge robustly on single trials along with phasic RPE responses, such that both types of DA responses exhibit the empirically observed dependence on conditional probability, expected value of reward, and time since onset of the reward-predicting cue. The model includes three major pathways for computing: immediate expected values of cues, timed predictions of reward magnitudes (and RPEs), and the uncertainty associated with these predictions. The first two model pathways refine those previously modeled by Brown et al. (1999). A third, newly modeled, pathway is formed by medium spiny projection neurons (MSPNs) of the matrix compartment of the striatum, whose axons co-release GABA and a neuropeptide, substance P, both at synapses with GABAergic neurons in the SNr and with the dendrites (in SNr) of DA neurons whose somas are in ventral SNc. Co-release enables efficient computation of sustained DA uncertainty responses that are a non-monotonic function of the conditional probability that a reward will follow the cue.
The new model's incorporation of a striatal microcircuit allowed it to reveal that variability in striatal cholinergic transmission can explain observed differences between monkeys in the amplitude of the non-monotonic uncertainty function. Involvement of matriceal MSPNs and striatal cholinergic transmission implies a relation between uncertainty in the cue-reward contingency and the action-selection functions of the basal ganglia. The model synthesizes anatomical, electrophysiological, and behavioral data regarding the midbrain DA system in a novel way, by relating the ability to compute uncertainty, in parallel with other aspects of reward contingencies, to the unique distribution of SP inputs in ventral SN.
Abstract:
Financial time series convey the decisions and actions of a population of human actors over time. Econometric and regressive models have been developed in the past decades for analyzing these time series. More recently, biologically inspired artificial neural network models have been shown to overcome some of the main challenges of traditional techniques by better exploiting the non-linear, non-stationary, and oscillatory nature of noisy, chaotic human interactions. This review paper explores the options, benefits, and weaknesses of the various forms of artificial neural networks as compared with regression techniques in the field of financial time series analysis.
Abstract:
In this paper, we introduce the Generalized Equality Classifier (GEC) for use as an unsupervised clustering algorithm in categorizing analog data. GEC is based on a formal definition of inexact equality originally developed for voting in fault tolerant software applications. GEC is defined using a metric space framework. The only parameter in GEC is a scalar threshold which defines the approximate equality of two patterns. Here, we compare the characteristics of GEC to the ART2-A algorithm (Carpenter, Grossberg, and Rosen, 1991). In particular, we show that GEC with the Hamming distance performs the same optimization as ART2. Moreover, GEC has lower computational requirements than ART2 on serial machines.
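The clustering rule sketched below illustrates the idea of threshold-based inexact equality: two patterns fall in the same category when their distance is at most the scalar threshold, which plays a role analogous to ART2-A's vigilance parameter. The paper's formal metric-space definition and tie-breaking rules are simplified here; this is a schematic reading of the abstract, not the authors' algorithm.

```python
# Schematic sketch of inexact-equality clustering in the spirit of GEC:
# a pattern joins the first existing prototype within `threshold` of it,
# otherwise it founds a new category. Details beyond the abstract
# (prototype choice, tie-breaking) are simplifying assumptions.

def hamming(a, b):
    """Hamming distance between two equal-length patterns."""
    return sum(x != y for x, y in zip(a, b))

def gec_cluster(patterns, threshold, dist=hamming):
    """Return (labels, prototypes): each pattern is assigned to the
    first prototype within `threshold`, else becomes a new prototype."""
    prototypes = []
    labels = []
    for p in patterns:
        for i, proto in enumerate(prototypes):
            if dist(p, proto) <= threshold:
                labels.append(i)
                break
        else:
            prototypes.append(p)
            labels.append(len(prototypes) - 1)
    return labels, prototypes
```

One pass over the data with a single scalar parameter is what keeps the serial-machine cost low relative to ART2-A's normalization and matching computations.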
Abstract:
How do our brains transform the "blooming buzzing confusion" of daily experience into a coherent sense of self that can learn and selectively attend to important information? How do local signals at multiple processing stages, none of which has a global view of brain dynamics or behavioral outcomes, trigger learning at multiple synaptic sites when appropriate, and prevent learning when inappropriate, to achieve useful behavioral goals in a continually changing world? How does the brain allow synaptic plasticity at a remarkably rapid rate, as anyone who has gone to an exciting movie is readily aware, yet also protect useful memories from catastrophic forgetting? A neural model provides a unified answer by explaining and quantitatively simulating data about single cell biophysics and neurophysiology, laminar neuroanatomy, aggregate cell recordings (current-source densities, local field potentials), large-scale oscillations (beta, gamma), and spike-timing dependent plasticity, and functionally linking them all to cognitive information processing requirements.
Abstract:
A neural network model, called an FBF network, is proposed for automatic parallel separation of multiple image figures from each other and their backgrounds in noisy grayscale or multi-colored images. The figures can then be processed in parallel by an array of self-organizing Adaptive Resonance Theory (ART) neural networks for automatic target recognition. An FBF network can automatically separate the disconnected but interleaved spirals that Minsky and Papert introduced in their book Perceptrons. The network's design also clarifies why humans cannot rapidly separate interleaved spirals, yet can rapidly detect conjunctions of disparity and color, or of disparity and motion, that distinguish target figures from surrounding distractors. Figure-ground separation is accomplished by iterating operations of a Feature Contour System (FCS) and a Boundary Contour System (BCS) in the order FCS-BCS-FCS, hence the term FBF, that have been derived from an analysis of biological vision. The FCS operations include the use of nonlinear shunting networks to compensate for variable illumination and nonlinear diffusion networks to control filling-in. A key new feature of an FBF network is the use of filling-in for figure-ground separation. The BCS operations include oriented filters joined to competitive and cooperative interactions designed to detect, regularize, and complete boundaries in up to 50 percent noise, while suppressing the noise. A modified CORT-X filter is described which uses both on-cells and off-cells to generate a boundary segmentation from a noisy image.
Abstract:
A dynamic distributed model is presented that reproduces the dynamics of a wide range of varied battle scenarios with a general and abstract representation. The model illustrates the rich dynamic behavior that can be achieved from a simple generic model.
Abstract:
Blanket bog lakes are a characteristic feature of blanket bog habitats and harbour many rare and threatened invertebrate species. Despite their potential conservation value, however, very little is known about their physico-chemical or biological characteristics in western Europe, and their reference conditions are still unknown in Ireland. Furthermore, they are under considerable threat in Ireland from a number of sources, particularly afforestation of their catchments by exotic conifers. Plantation forestry can potentially lead to the increased input of substances including hydrogen ions (H+), plant nutrients, dissolved organic carbon (DOC), heavy metals and sediment. The aim of this study was to investigate the effects of conifer plantation forestry on the hydrochemistry and ecology of blanket bog lakes in western Ireland. Lake hydrochemistry, littoral Chydoridae (Cladocera) and littoral macroinvertebrate communities were compared among replicate lakes selected from three distinct catchment land use categories: i) unplanted blanket bog only present in the catchment, ii) mature (closed-canopy) conifer plantation forests only present in the catchment and iii) catchments containing mature conifer plantation forests with recently clearfelled areas. All three catchment land uses were replicated across two geologies: sandstone and granite. Lakes with afforested catchments across both geologies had elevated concentrations of phosphorus (P), nitrogen (N), total dissolved organic carbon (TDOC), aluminium (Al) and iron (Fe), with the highest concentrations of each parameter recorded from lakes with catchment clearfelling. Dissolved oxygen concentrations were also significantly reduced in the afforested lakes, particularly the clearfell lakes. This change in lake hydrochemistry was associated with profound changes in lake invertebrate communities.
Within the chydorid communities, the dominance of Alonopsis elongata in the unplanted blanket bog lakes shifted to dominance by the smaller bodied Chydorus sphaericus, along with Alonella nana, Alonella excisa and Alonella exigua, in the plantation forestry-affected lakes, consistent with a shift in lake trophy. Similarly, there were marked changes in the macroinvertebrate communities, especially for the Coleoptera and Heteroptera assemblages, which revealed increased taxon richness and abundance in the nutrient-enriched lakes. In terms of conservation status, despite having the greatest species-quality scores (SQS) and species richness, three of the four International Union for the Conservation of Nature (IUCN) red-listed species of Coleoptera and Odonata recorded during the study were absent from lakes subject to catchment clearfelling. The relative strengths of bottom-up (forestry-mediated nutrient enrichment) and top-down (fish) forces in structuring littoral macroinvertebrate communities were investigated in a separate study. Nutrient enrichment was shown to be the dominant force acting on communities, with fish having a lesser influence. These results confirmed that plantation forestry poses the single greatest threat to the conservation status of blanket bog lakes in western Ireland. The findings of this study have major implications for the management of afforested peatlands. Further research is required on blanket bog lakes to prevent any further plantation forestry-mediated habitat deterioration of this rare and protected habitat.
Abstract:
PURPOSE: Evaluating genetic susceptibility may clarify effects of known environmental factors and also identify individuals at high risk. We evaluated the association of four insulin-related pathway gene polymorphisms in insulin-like growth factor-1 (IGF-I) (CA)(n) repeat, insulin-like growth factor-2 (IGF-II) (rs680), insulin-like growth factor-binding protein-3 (IGFBP-3) (rs2854744), and adiponectin (APM1 rs1501299) with colon cancer risk, as well as relationships with circulating IGF-I, IGF-II, IGFBP-3, and C-peptide in a population-based study. METHODS: Participants were African Americans (231 cases and 306 controls) and Whites (297 cases and 530 controls). Consenting subjects provided blood specimens and lifestyle/diet information. Genotyping for all genes except IGF-I was performed by the 5'-exonuclease (TaqMan) assay. The IGF-I (CA)(n) repeat was assayed by PCR and fragment analysis. Circulating proteins were measured by enzyme immunoassays. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated by logistic regression. RESULTS: The IGF-I (CA)(19) repeat was higher in White controls (50%) than African American controls (31%). Whites homozygous for the IGF-I (CA)(19) repeat had a nearly twofold increase in risk of colon cancer (OR = 1.77; 95% CI = 1.15-2.73), but not African Americans (OR = 0.73, 95% CI 0.50-1.51). We observed an inverse association between the IGF-II Apa1 A-variant and colon cancer risk (OR = 0.49, 95% CI 0.28-0.88) in Whites only. Carrying the IGFBP-3 variant alleles was associated with lower IGFBP-3 protein levels, a difference most pronounced in Whites (p-trend <0.05). CONCLUSIONS: These results support an association between insulin pathway-related genes and elevated colon cancer risk in Whites but not in African Americans.
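The odds ratios and 95% confidence intervals reported above come from logistic regression, where the standard back-transformation is OR = exp(β) with CI limits exp(β ± 1.96·SE). The coefficient and standard error below are illustrative values chosen for the example, not numbers from the study.

```python
# How an odds ratio and its Wald 95% confidence interval are recovered
# from a logistic-regression coefficient (log odds) and its standard
# error. The beta and SE here are illustrative assumptions.

import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower, upper) for a log-odds coefficient `beta`
    with standard error `se`, using the normal critical value `z`."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Example: beta = 0.571, SE = 0.22 yields OR ~ 1.77 with CI ~ (1.15, 2.72),
# similar in scale to the association reported for Whites homozygous for
# the IGF-I (CA)(19) repeat.
or_, lo, hi = odds_ratio_ci(0.571, 0.22)
```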
Abstract:
Addressing global fisheries overexploitation requires a better understanding of how small-scale fishing communities in developing countries limit access to fishing grounds. We analyze the performance of a system based on individual licenses and of a common property-rights regime in their ability to generate incentives for self-governance and conservation of fishery resources. Using a qualitative before-after-control-impact approach, we compare two neighbouring fishing communities in the Gulf of California, Mexico. Both were initially governed by the same permit system, are situated in the same ecosystem, use similar harvesting technology, and have overharvested similar species. One community changed to a common property-rights regime, enabling the emergence of access controls and avoiding overexploitation of benthic resources, while the other community still relies on the permit system. We discuss the roles played by power, institutions, and socio-historical and biophysical factors in the development of access controls. © 2012 The Author(s).