962 results for Effective Attention Allocation
Abstract:
Reduction of carbon emissions is of paramount importance in the context of global warming. Countries and global companies are now engaged in understanding systematic ways of achieving well-defined emission targets. In fact, carbon credits have become significant and strategic financial instruments for countries and global companies. In this paper, we formulate and suggest a solution to the carbon allocation problem, which involves determining a cost-minimizing allocation of carbon credits among different emitting agents. We address this challenge in the context of a global company that must allocate carbon credit caps among its divisions in a cost-effective way. The problem is formulated as a reverse auction in which the company plays the role of buyer or carbon planning authority and the divisions within the company are the emitting agents, each specifying a cost curve for carbon credit reductions. Two natural variants of the problem are considered: (a) with unlimited budget and (b) with limited budget. Under suitable assumptions on the cost curves, we show that in each case the resulting formulation is a knapsack problem that can be solved optimally using a greedy algorithm. The solution of the allocation problem provides critical decision support to global companies seriously engaged in green programs.
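The greedy allocation described above can be sketched as follows. This is an illustrative reading of the approach, assuming each division quotes a flat per-credit cost and a maximum reduction capacity; the paper's actual cost curves may be more general, and the division names and numbers are hypothetical:

```python
# Illustrative sketch of the greedy allocation, assuming each division
# quotes a flat per-credit cost and a maximum reduction capacity
# (division names and numbers are hypothetical).

def allocate(bids, target, budget=None):
    """Buy reductions cheapest-first until `target` credits are reached
    or the optional `budget` runs out."""
    allocation, spent, remaining = {}, 0.0, target
    for division, unit_cost, capacity in sorted(bids, key=lambda b: b[1]):
        if remaining <= 0:
            break
        take = min(capacity, remaining)
        if budget is not None:
            take = min(take, int((budget - spent) // unit_cost))
        if take > 0:
            allocation[division] = take
            spent += take * unit_cost
            remaining -= take
    return allocation, spent

bids = [("steel", 12.0, 40), ("logistics", 5.0, 30), ("power", 8.0, 50)]
plan, cost = allocate(bids, target=70)                 # unlimited budget
capped, capped_cost = allocate(bids, target=70, budget=300.0)
```

Sorting bids by unit cost and buying cheapest-first is the greedy rule that is optimal for this divisible-knapsack structure: with the budget cap, the cheapest divisions are exhausted before any costlier ones are touched.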
Abstract:
Mobile ad-hoc networks (MANETs) have recently drawn significant research attention since they offer unique benefits and versatility with respect to bandwidth spatial reuse, intrinsic fault tolerance, and low-cost rapid deployment. This paper addresses the issue of delay-sensitive real-time data transport in this type of network, for which an effective QoS mechanism is required. QoS in MANETs remains an open problem. Various QoS measures have been incorporated in the upper layers of the network stack, but few techniques address QoS in the MAC layer, and most existing MAC-layer QoS techniques target infrastructure-based wireless networks. The goal, and the challenge, is to achieve QoS delivery and priority access for real-time traffic in an ad-hoc wireless environment while maintaining fairness in resource allocation. We propose a MAC-layer protocol called "FCP-based FAMA", which allocates channel resources to the neediest stations in a more equitable way by examining the requirements, potential malicious behavior, and genuineness of each request. We have simulated both FAMA and FCP-based FAMA and tested them under various MANET conditions. The simulation results clearly show improved channel utilization and reduced delay in the latter case. Our new protocol outperforms other QoS-aware MAC-layer protocols.
Abstract:
In this paper, we present a novel approach that uses topic models based on Latent Dirichlet allocation (LDA) to generate single-document summaries. Our approach is distinguished from other LDA-based approaches in that we identify the summary topics which best describe a given document and extract sentences only from those paragraphs within the document which are highly correlated with the summary topics. This ensures that our summaries always highlight the crux of the document without depending on the grammar or structure of the document. Finally, we evaluate our summaries on the DUC 2002 single-document summarization corpus using ROUGE measures. Our summaries had higher ROUGE scores and better semantic similarity with the source documents than the DUC summaries.
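A minimal sketch of this style of LDA-based extractive summarization, using scikit-learn rather than the authors' implementation; the sentence-level topic modelling, the scoring rule, and the example sentences are all assumptions for illustration:

```python
# Illustrative sketch (not the authors' code): fit LDA over the sentences
# of one document, aggregate a document-level topic distribution, and
# extract the sentences that load most heavily on the dominant topics.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def lda_summary(sentences, n_topics=2, n_sentences=2, seed=0):
    counts = CountVectorizer(stop_words="english").fit_transform(sentences)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=seed)
    sent_topics = lda.fit_transform(counts)       # sentence-by-topic weights
    doc_topics = sent_topics.sum(axis=0)
    doc_topics /= doc_topics.sum()                # document's topic mixture
    scores = sent_topics @ doc_topics             # alignment with summary topics
    chosen = sorted(np.argsort(scores)[::-1][:n_sentences])
    return [sentences[i] for i in chosen]         # keep original sentence order

document = [
    "The cat sat on the warm mat near the window.",
    "Stock markets rallied as interest rates fell sharply.",
    "Investors cheered the central bank rate decision.",
    "The cat chased a toy mouse across the floor.",
]
summary = lda_summary(document)
```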
Abstract:
The use of circular hexagonal honeycomb structures and tube assemblies in energy absorption systems has attracted a large body of literature on their characterization under crushing and impact loads. Notwithstanding this, the effective shear moduli (G*) required for complete transverse elastic characterization and for the analysis of hierarchical structures have received scant attention. To fill this void, the present study evaluates G* of generalized circular honeycomb structures and tube assemblies in a diamond array structure (DAS) with no restriction on their thickness. These structures offer the potential to realize a spectrum of moduli with minimal modifications, a point of relevance for manufacturers and designers. To evaluate G*, models based on technical theories (thin ring theory and curved beam theory) and on the rigorous theory of elasticity are investigated and corroborated with FEA employing contact elements. The technical theories, which give a good match for thin honeycomb structures, offer compact expressions for the moduli that can be used to study the sensitivity of the moduli to topology. The elasticity model, on the other hand, offers a very good match over a large range of thicknesses, along with an exact analysis of stresses via computationally efficient expressions. (C) 2015 Elsevier Ltd. All rights reserved.
Abstract:
In metropolitan cities, public transportation plays a vital role in the mobility of people, and new routes must be introduced frequently because of rapid growth in population and city size. Whenever a new route is introduced or bus frequency is increased, the non-revenue kilometers covered by buses increase, since depots and route starting/ending points are at different places. These non-revenue, or dead, kilometers depend on the distance between the depot and the route starting/ending point. Dead kilometers not only cause revenue loss but also increase operating cost because of the extra kilometers covered by buses, so reducing them is necessary for the economic health of the public transportation system. In this study, therefore, attention is focused on minimizing dead kilometers by optimizing the allocation of buses to depots based on the shortest distance between depots and route starting/ending points. We also consider depot capacity and the time period of operation when allocating buses, to ensure parking safety and proper maintenance. A mathematical model, a mixed integer program, is developed from these considerations and applied to the routes currently operated by the Bangalore Metropolitan Transport Corporation (BMTC) in order to obtain an optimal allocation of buses to depots. A database of dead kilometers for all schedules is generated from the Form-4 (trip sheet) of each schedule to analyze depot-wise and division-wise dead kilometers. The study also suggests alternative locations for depots that would reduce dead kilometers. Copyright (C) 2015 John Wiley & Sons, Ltd.
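The core allocation step can be illustrated as an assignment problem. This sketch simplifies the paper's mixed integer program (it ignores operating time periods, and the distances and capacities are invented), expanding each depot into capacity-many parking slots:

```python
# Simplified sketch of the bus-to-depot allocation (not the BMTC model
# itself): each depot is expanded into capacity-many parking slots and
# the route-to-slot assignment minimising total dead kilometres is solved.
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_routes(dead_km, capacities):
    """dead_km[r][d] = dead kilometres if route r is served from depot d."""
    dead_km = np.asarray(dead_km, dtype=float)
    slots = [d for d, cap in enumerate(capacities) for _ in range(cap)]
    cost = dead_km[:, slots]                      # route x depot-slot matrix
    rows, cols = linear_sum_assignment(cost)      # optimal assignment
    assignment = {int(r): slots[c] for r, c in zip(rows, cols)}
    return assignment, float(cost[rows, cols].sum())

# 3 routes, 2 depots, invented distances; each depot can park 2 buses.
dead_km = [[2.0, 5.0], [3.0, 1.0], [4.0, 2.5]]
assignment, total = assign_routes(dead_km, capacities=[2, 2])
```

Duplicating depot columns is a standard trick for encoding capacity limits in a linear assignment solver; the full model in the paper additionally handles time periods and other operational constraints.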
Abstract:
This paper sets out to assess the workability of the regulation currently in force in the European anchovy fishery of division VIII. Particular attention is paid to the importance of the institutional regime in the allocation of natural resources. The study uses a bio-economic approach and takes into account the fact that not only the European Union and the individual countries involved, but also some of the resource users or appropriators, intervene in its management. In order to compare the effectiveness of the rules which, at the various levels, have been set up to restrict exploitation of the resource, the anchovy fishery is simulated in two extreme situations: open access and sole ownership. The results obtained under the actual management regime are then contrasted with those obtained from the maximum-profit and zero-profit objectives associated with the two above-mentioned scenarios. Thus, if the real data come close to those derived from the sole-ownership model, it will have to be acknowledged that the rules at present in force are optimal. If, on the other hand, the situation more closely approaches the results obtained from the open-access model, we endeavour in our conclusions to suggest economic policy measures that might improve the situation in the fishery.
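The open-access versus sole-ownership comparison is commonly formalised with the Gordon-Schaefer bio-economic model. The sketch below uses that textbook benchmark, which is not necessarily the paper's exact model, and all parameter values are invented:

```python
# Classic Gordon-Schaefer benchmark, used only to illustrate the
# open-access vs sole-ownership comparison; all parameters are invented.
# Stock grows logistically (rate r, carrying capacity K); harvest is
# q*E*x for effort E; price is p per unit harvest, cost c per unit effort.

def equilibrium_effort(r, q, K, p, c):
    """Return (open_access, sole_owner) equilibrium effort levels.
    Open access expands effort until rent is fully dissipated (profit = 0);
    the sustained-profit function is quadratic in effort, so the sole
    owner's profit-maximising effort is exactly half the open-access level."""
    open_access = (r / q) * (1 - c / (p * q * K))
    return open_access, open_access / 2

e_open, e_sole = equilibrium_effort(r=0.5, q=0.01, K=1000.0, p=2.0, c=10.0)
```

Comparing observed effort in the fishery against these two benchmark levels is the kind of diagnosis the abstract describes: effort near the sole-owner level suggests the rules are working, effort near the open-access level suggests rent dissipation.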
Abstract:
The increasingly intense competition between commercial and recreational fishermen for access to fish stocks has focused attention on the economic implications of fishery allocations. Indeed, one can scarcely find a management plan or amendment that does not at least refer to the relative food and sport values of fish and to how expenditures by commercial and recreational fishermen on equipment and supplies stimulate the economy. However, many of the arguments raised by constituents to influence such allocations, while having a seemingly "economics" ring to them, are usually incomplete, distorted, or even incorrect. This report offers fishery managers and other interested parties a guide to correct notions of economic value and to the appropriate ways to characterize, estimate, and compare value. In particular, introductory material from benefit-cost analysis and input-output analysis is described and illustrated. In the process, several familiar specious arguments are exposed. (PDF file contains 34 pages.)
Abstract:
This paper describes an experiment developed to study the performance of animated virtual-agent cues within digital interfaces. Increasingly, agents are used in virtual environments as part of the branding process and to guide user interaction. However, the level of agent detail required to establish and enhance efficient allocation of attention remains unclear. Although complex agent motion is now possible, it is costly to implement and so should only be routinely implemented if a clear benefit can be shown. Previous methods of assessing the effect of gaze-cueing as a solution to scene complexity have relied principally on two-dimensional static scenes and manual peripheral inputs. Two experiments were run to address the question of agent cues in human-computer interfaces, measuring the efficiency of agent cues through participant responses by gaze and by touch, respectively. In the first experiment, an eye-movement recorder was used to directly assess the immediate overt allocation of attention by capturing the participant's eye fixations following presentation of a cueing stimulus. We found that a fully animated agent could speed up user interaction with the interface. When user attention was directed using a fully animated agent cue, users responded 35% faster than with stepped two-image agent cues, and 42% faster than with a static one-image cue. The second experiment recorded participant responses on a touch screen using the same agent cues. Analysis of the touch inputs confirmed the results of the gaze experiment: the fully animated agent again produced the fastest responses, though with somewhat smaller differences between conditions. Responses to the fully animated agent were 17% and 20% faster than to the two-image and one-image cues, respectively.
These results inform techniques aimed at engaging users’ attention in complex scenes such as computer games and digital transactions within public or social interaction contexts by demonstrating the benefits of dynamic gaze and head cueing directly on the users’ eye movements and touch responses.
Abstract:
A neural model is proposed of how laminar interactions in the visual cortex may learn and recognize object texture and form boundaries. The model brings together five interacting processes: region-based texture classification, contour-based boundary grouping, surface filling-in, spatial attention, and object attention. The model shows how form boundaries can determine regions in which surface filling-in occurs; how surface filling-in interacts with spatial attention to generate a form-fitting distribution of spatial attention, or attentional shroud; how the strongest shroud can inhibit weaker shrouds; and how the winning shroud regulates learning of texture categories, and thus the allocation of object attention. The model can discriminate abutted textures with blurred boundaries and is sensitive to texture boundary attributes like discontinuities in orientation and texture flow curvature as well as to relative orientations of texture elements. The model quantitatively fits a large set of human psychophysical data on orientation-based textures. Object boundary output of the model is compared to computer vision algorithms using a set of human-segmented photographic images. The model classifies textures and suppresses noise using a multiple-scale oriented filterbank and a distributed Adaptive Resonance Theory (dART) classifier. The matched signal between the bottom-up texture inputs and top-down learned texture categories is utilized by oriented competitive and cooperative grouping processes to generate texture boundaries that control surface filling-in and spatial attention. Top-down modulatory attentional feedback from boundary and surface representations to early filtering stages results in enhanced texture boundaries and more efficient learning of texture within attended surface regions. Surface-based attention also provides a self-supervising training signal for learning new textures.
The importance of surface-based attentional feedback in texture learning and classification is tested using a set of textured images from the Brodatz micro-texture album. Benchmark classification accuracies range from 95.1% to 98.6% with attention, and from 90.6% to 93.2% without attention.
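The multiple-scale oriented filterbank stage can be illustrated with a small Gabor bank. This is a generic stand-in: the model's actual filters and parameters are not reproduced here, and all sizes and wavelengths below are illustrative:

```python
# Minimal sketch of a multi-scale oriented filterbank of the kind used
# for texture classification; a Gabor bank stands in here, with all
# sizes and wavelengths chosen for illustration only.
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Cosine Gabor kernel at orientation `theta` (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # coordinate rotated to theta
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def filterbank(orientations=4, scales=(4.0, 8.0), size=15):
    """One kernel per (scale, orientation) pair."""
    return [gabor_kernel(size, wl, k * np.pi / orientations, sigma=wl / 2)
            for wl in scales for k in range(orientations)]

bank = filterbank()    # 2 scales x 4 orientations = 8 oriented kernels
```

Convolving an image with each kernel and pooling the responses yields the multi-scale, multi-orientation feature vector that a classifier such as dART can then categorize.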
Abstract:
We present an analysis of high-resolution VLT-FLAMES spectra of 61 B-type stars with relatively narrow-lined spectra, located in four fields centred on the Milky Way clusters NGC 3293 and NGC 4755 and the Large and Small Magellanic Cloud clusters NGC 2004 and NGC 330. For each object a quantitative analysis was carried out using the non-LTE model atmosphere code TLUSTY, resulting in the determination of their atmospheric parameters and photospheric abundances of the dominant metal species (C, N, O, Mg, Si, Fe). The results are discussed in relation to our earlier work on three younger clusters in these galaxies (NGC 6611, N11, and NGC 346), paying particular attention to the nitrogen abundances, which are an important probe of the role of rotation in the evolution of stars. Together with the younger clusters, this work provides a consistent dataset of abundances and atmospheric parameters for over 100 B-type stars in the three galaxies. We provide effective temperature scales for B-type dwarfs in all three galaxies and for giants and supergiants in the SMC and LMC. In each galaxy a dependence on luminosity is found between the three classes, with the unevolved dwarf objects having significantly higher effective temperatures. A metallicity dependence is present between the SMC and Galactic dwarf objects; whilst the LMC stars are only slightly cooler than the SMC stars, they are significantly hotter than their Galactic counterparts.
Abstract:
Context. Considerable demand exists for electron excitation data for Ni II, since lines from this abundant ion are observed in a wide variety of laboratory and astrophysical spectra. The accurate theoretical determination of these data can present a significant challenge, however, due to complications arising from the presence of an open 3d-shell in the description of the target ion. Aims. In this work we present collision strengths and Maxwellian-averaged effective collision strengths for the electron-impact excitation of Ni II. Attention is concentrated on the 153 forbidden fine-structure transitions between the energetically lowest 18 levels of Ni II. Effective collision strengths have been evaluated at 27 individual electron temperatures ranging from 30 to 100 000 K. To our knowledge this is the most extensive theoretical collisional study carried out on this ion to date. Methods. The parallel R-matrix package RMATRX II has recently been extended to allow for the inclusion of relativistic effects. This suite of codes has been utilised in the present work in conjunction with PSTGF to evaluate collision strengths and effective collision strengths for all of the low-lying forbidden fine-structure transitions. The following basis configurations were included in the target model - 3d9, 3d8 4s, 3d8 4p, 3d7 4s2 and 3d7 4s4p - giving rise to a sophisticated 295 jj-level, 1930 coupled-channel scattering problem. Results. Comprehensive comparisons are made between the present collisional data and those obtained from earlier theoretical evaluations. While the effective collision strengths agree well for some transitions, significant discrepancies exist for others.
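For reference, the Maxwellian-averaged effective collision strength referred to above is the standard thermal average of the collision strength $\Omega_{ij}$ over a Maxwellian distribution of electron energies:

```latex
\Upsilon_{ij}(T_e) = \int_0^{\infty} \Omega_{ij}(E_j)\,
  \exp\!\left(-\frac{E_j}{kT_e}\right) d\!\left(\frac{E_j}{kT_e}\right)
```

where $E_j$ is the final (scattered) electron energy, $k$ is Boltzmann's constant, and $T_e$ is the electron temperature.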
Abstract:
In this paper, we present collision strengths and Maxwellian-averaged effective collision strengths for the electron-impact excitation of Ni II. Attention is expressly concentrated on the optically allowed fine-structure transitions between the 3d9, 3d8 4s, and 3d7 4s2 even-parity levels and the 3d8 4p and 3d7 4s4p odd-parity levels. The parallel RMATRXII R-matrix package has recently been extended to allow for the inclusion of relativistic fine-structure effects. This suite of codes has been utilized in conjunction with the parallel PSTGF and PSTGICF programs in order to compute converged total collision strengths for the allowed transitions with which this study is concerned. All 113 LS terms identified with the 3d9, 3d8 4s, 3d7 4s2, 3d8 4p, and 3d7 4s4p basis configurations were included in the target wavefunction representation, giving rise to a sophisticated 295 jj-level, 1930 coupled-channel scattering complex. Maxwellian-averaged effective collision strengths have been computed at 30 individual electron temperatures ranging from 30 to 1,000,000 K. This range comfortably encompasses all temperatures significant to astrophysical and plasma applications. The convergence of the collision strengths is exhaustively investigated, and comparisons are made with previous theoretical works, from which significant discrepancies exist for the majority of transitions. We conclude that achieving converged collision strengths, and thus effective collision strengths, for the allowed transitions requires the combined inclusion of contributions from the (N + 1)-electron partial waves extending to a total angular momentum value of L = 50 and further contributions from even higher partial waves, accomplished by employing a "top-up" procedure.
Abstract:
Fixed and wireless networks are increasingly converging towards common connectivity with IP-based core networks. Providing effective end-to-end resource and QoS management in such complex heterogeneous converged network scenarios requires unified, adaptive, and scalable solutions that integrate and coordinate the diverse QoS mechanisms of different access technologies with IP-based QoS. Policy-Based Network Management (PBNM) is one approach that can be employed to address this challenge. Hence, a policy-based framework for end-to-end QoS management in converged networks, CNQF (Converged Networks QoS Management Framework), has been proposed within our project. In this paper, we discuss the CNQF architecture, a Java implementation of its prototype, and the experimental validation of its key elements. We then present a fuzzy-based CNQF resource management approach and study the performance of our implementation with real traffic flows on an experimental testbed. The results demonstrate the efficacy of our resource-adaptive approach for practical PBNM systems.
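The fuzzy-based resource management idea can be illustrated with a toy Mamdani-style controller. The membership functions, rules, and output levels below are invented for illustration and are not CNQF's actual rule base:

```python
# Toy illustration of fuzzy-based resource management; the membership
# functions, rules, and output levels are invented, not CNQF's rule base.

def tri(x, a, b, c):
    """Triangular membership: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def bandwidth_share(load, priority):
    """Two-rule Mamdani-style inference, defuzzified by a weighted
    average of each rule's crisp output level."""
    congested = tri(load, 0.4, 1.0, 1.6)        # degree the link is congested
    important = tri(priority, 0.4, 1.0, 1.6)    # degree the flow is important
    rules = [
        (min(important, 1 - congested), 0.9),   # important flow, light load -> big share
        (max(congested, 1 - important), 0.3),   # congested or unimportant -> small share
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

share = bandwidth_share(load=0.5, priority=0.9)
```

Because inputs are mapped to graded memberships rather than hard thresholds, the allocated share adapts smoothly as load and flow priority change, which is the property a policy-based controller exploits.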
Abstract:
Studies of urban metabolism provide important insights for the environmental management of cities, but are not widely used in planning practice due to a mismatch of data scale and coverage. This paper introduces the Spatial Allocation of Material Flow Analysis (SAMFA) model as a decision support tool intended to help overcome some of these difficulties, and describes its pilot use at the county level in the Republic of Ireland. The results suggest that SAMFA is capable of identifying hotspots of higher material and energy use to support targeted planning initiatives, while its ability to visualise different policy scenarios supports more effective multi-stakeholder engagement. The paper evaluates this pilot use and sets out how the model can act as an analytical platform for the industrial ecology-spatial planning nexus.
Abstract:
Background The use of technology in healthcare settings is on the increase and may represent a cost-effective means of delivering rehabilitation. Reductions in treatment time, and delivery in the home, are also thought to be benefits of this approach. Children and adolescents with brain injury often experience deficits in memory and executive functioning that can negatively affect their school work, social lives, and future occupations. Effective interventions that can be delivered at home, without the need for high-cost clinical involvement, could provide a means to address a current lack of provision. We have systematically reviewed studies examining the effects of technology-based interventions for the rehabilitation of deficits in memory and executive functioning in children and adolescents with acquired brain injury. Objectives To assess the effects of technology-based interventions compared to placebo intervention, no treatment, or other types of intervention, on the executive functioning and memory of children and adolescents with acquired brain injury. Search methods We ran the search on 30 September 2015. We searched the Cochrane Injuries Group Specialised Register, the Cochrane Central Register of Controlled Trials (CENTRAL), Ovid MEDLINE(R), Ovid MEDLINE(R) In-Process & Other Non-Indexed Citations, Ovid MEDLINE(R) Daily and Ovid OLDMEDLINE(R), EMBASE Classic + EMBASE (OvidSP), ISI Web of Science (SCI-EXPANDED, SSCI, CPCI-S, and CPSI-SSH), CINAHL Plus (EBSCO), two other databases, and clinical trials registers. We also searched the internet, screened reference lists, and contacted authors of included studies. Selection criteria Randomised controlled trials comparing the use of a technological aid for the rehabilitation of children and adolescents with memory or executive-functioning deficits with placebo, no treatment, or another intervention.
Data collection and analysis Two review authors independently reviewed titles and abstracts identified by the search strategy. Following retrieval of full-text manuscripts, two review authors independently performed data extraction and assessed the risk of bias. Main results Four studies (involving 206 participants) met the inclusion criteria for this review. Three studies, involving 194 participants, assessed the effects of online interventions to target executive functioning (that is, monitoring and changing behaviour, problem solving, planning, etc.). These studies, which were all conducted by the same research team, compared online interventions against a 'placebo' (participants were given internet resources on brain injury). The interventions were delivered in the family home with additional support or training, or both, from a psychologist or doctoral student. The fourth study investigated the use of a computer program to target memory in addition to components of executive functioning (that is, attention, organisation, and problem solving). No information on the study setting was provided; however, a speech-language pathologist, teacher, or occupational therapist accompanied participants. Two studies assessed adolescents and young adults with mild to severe traumatic brain injury (TBI), while the remaining two studies assessed children and adolescents with moderate to severe TBI. Risk of bias We assessed the risk of selection bias as low for three studies and unclear for one study. Allocation bias was high in two studies, unclear in one study, and low in one study. Only one study (n = 120) was able to conceal allocation from participants; therefore, overall selection bias was assessed as high. One study took steps to blind assessors to allocation (low risk of detection bias), while the other three did not do so (high risk of detection bias).
Primary outcome 1: Executive functioning: Technology-based intervention versus placebo Results from meta-analysis of three studies (n = 194) comparing online interventions with a placebo for children and adolescents with TBI, favoured the intervention immediately post-treatment (standardised mean difference (SMD) -0.37, 95% confidence interval (CI) -0.66 to -0.09; P = 0.62; I2 = 0%). (As there is no 'gold standard' measure in the field, we have not translated the SMD back to any particular scale.) This result is thought to represent only a small to medium effect size (using Cohen’s rule of thumb, where 0.2 is a small effect, 0.5 a medium one, and 0.8 or above is a large effect); this is unlikely to have a clinically important effect on the participant. The fourth study (n = 12) reported differences between the intervention and control groups on problem solving (an important component of executive functioning). No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. The quality of evidence for this outcome according to GRADE was very low. This means future research is highly likely to change the estimate of effect. Primary outcome 2: Memory One small study (n = 12) reported a statistically significant difference in improvement in sentence recall between the intervention and control group following an eight-week remediation programme. No means or standard deviations were presented for this outcome, therefore an effect size could not be calculated. Secondary outcomes Two studies (n = 158) reported on anxiety/depression as measured by the Child Behavior Checklist (CBCL) and were included in a meta-analysis. We found no evidence of an effect with the intervention (mean difference -5.59, 95% CI -11.46 to 0.28; I2 = 53%). The GRADE quality of evidence for this outcome was very low, meaning future research is likely to change the estimate of effect. A single study sought to record adverse events and reported none. 
Two studies reported on use of the intervention (range 0 to 13 and 1 to 24 sessions). One study reported on social functioning/social competence and found no effect. The included studies reported no data for other secondary outcomes (that is quality of life and academic achievement). Authors' conclusions This review provides low-quality evidence for the use of technology-based interventions in the rehabilitation of executive functions and memory for children and adolescents with TBI. As all of the included studies contained relatively small numbers of participants (12 to 120), our findings should be interpreted with caution. The involvement of a clinician or therapist, rather than use of the technology, may have led to the success of these interventions. Future research should seek to replicate these findings with larger samples, in other regions, using ecologically valid outcome measures, and reduced clinician involvement.
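The pooled SMD figures reported in reviews like this one come from inverse-variance weighting of the individual study estimates. The generic fixed-effect sketch below uses invented study values, not the review's data:

```python
# Generic inverse-variance (fixed-effect) pooling of standardised mean
# differences, the mechanism behind pooled SMDs in meta-analyses; the
# study estimates and standard errors below are invented for illustration.
import math

def pool_fixed_effect(estimates, std_errors):
    """Weight each study's SMD by 1/SE^2 and return the pooled
    estimate, its standard error, and a 95% confidence interval."""
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * d for w, d in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se, (pooled - 1.96 * se, pooled + 1.96 * se)

pooled, se, ci = pool_fixed_effect([-0.5, -0.3, -0.2], [0.25, 0.20, 0.30])
```

Larger studies (smaller standard errors) dominate the pooled estimate; a random-effects model, as typically used when heterogeneity is present, would additionally widen the weights by a between-study variance term.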