942 results for real-effort task


Relevance: 20.00%

Publisher:

Abstract:

A real-time operational methodology has been developed for multipurpose reservoir operation for irrigation and hydropower generation with application to the Bhadra reservoir system in the state of Karnataka, India. The methodology consists of three phases of computer modelling. In the first phase, the optimal release policy for a given initial storage and inflow is determined using a stochastic dynamic programming (SDP) model. Streamflow forecasting using an adaptive AutoRegressive Integrated Moving Average (ARIMA) model constitutes the second phase. A real-time simulation model is developed in the third phase using the forecast inflows of phase 2 and the operating policy of phase 1. A comparison of the optimal monthly real-time operation with the historical operation demonstrates the relevance, applicability and the relative advantage of the proposed methodology.
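
The abstract describes the three-phase structure but not the model equations. As orientation only, the following Python sketch mimics that structure with placeholder components: a hypothetical lookup-table release policy standing in for the SDP policy of phase 1, a one-step AR(1) forecast standing in for the adaptive ARIMA model of phase 2, and a simple mass-balance update for the phase-3 simulation. All numbers, thresholds, and function names are illustrative assumptions, not values from the study.

```python
import numpy as np

# Hypothetical stand-in for the SDP release policy: a lookup table indexed by
# discretized storage and forecast-inflow classes (illustrative values only).
release_policy = np.array([
    [10.0, 15.0, 20.0],   # low storage
    [15.0, 20.0, 25.0],   # medium storage
    [20.0, 25.0, 30.0],   # high storage
])

def storage_class(s, s_max):
    """Map current storage to a coarse class index (0 = low, 2 = high)."""
    return min(2, int(3 * s / s_max))

def inflow_class(q):
    """Map forecast inflow to a coarse class index."""
    return 0 if q < 20 else (1 if q < 40 else 2)

def ar1_forecast(q_prev, mean=30.0, phi=0.6):
    """One-step-ahead AR(1) forecast, a simplified stand-in for adaptive ARIMA."""
    return mean + phi * (q_prev - mean)

def simulate(inflows, s0=100.0, s_max=200.0):
    """Real-time simulation loop: forecast, look up release, update storage."""
    s, q_prev, trace = s0, inflows[0], []
    for q_obs in inflows:
        q_hat = ar1_forecast(q_prev)                                      # phase 2
        r = release_policy[storage_class(s, s_max), inflow_class(q_hat)]  # phase 1
        r = min(r, s + q_obs)              # cannot release more than is available
        s = min(s_max, s + q_obs - r)      # phase 3: mass balance (spill clipped)
        trace.append((round(q_hat, 1), r, round(s, 1)))
        q_prev = q_obs
    return trace

print(simulate([25.0, 40.0, 55.0, 30.0, 18.0]))
```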

Relevance: 20.00%

Publisher:

Abstract:

The open development model of software production has been characterized as the future model of knowledge production and distributed work. The open development model refers to publicly available source code, ensured by an open source license, and to the extensive and varied distributed participation of volunteers enabled by the Internet. Contemporary spokesmen of open source communities and academics view open source development as a new form of volunteer work activity characterized by a "hacker ethic" and "bazaar governance". The development of the Linux operating system is perhaps the best-known example of such an open source project. It started as an effort by a user-developer and grew quickly into a large project with hundreds of user-developers as contributors. However, in "hybrids", in which firms participate in open source projects oriented towards end-users, it seems that most users do not write code. The OpenOffice.org project, initiated by Sun Microsystems, represents such a project in this study. In addition, Finnish public sector ICT decision-making concerning open source use is studied. The purpose is to explore the assumptions, theories and myths related to the open development model by analysing the discursive construction of the OpenOffice.org community: its developers, users and management. The qualitative study aims at shedding light on the dynamics and challenges of community construction and maintenance, and the related power relations in hybrid open source, by asking two main research questions: How are the structure and membership constellation of the community, specifically the relation between developers and users, linguistically constructed in hybrid open development? What characterizes Internet-mediated virtual communities and how can they be defined? How do they differ from hierarchical forms of knowledge production on the one hand and from traditional volunteer communities on the other? The study utilizes sociological, psychological and anthropological concepts of community for understanding the connection between the real and the imaginary in so-called virtual open source communities. Intermediary methodological and analytical concepts are borrowed from discourse and rhetorical theories. A discursive-rhetorical approach is offered as a methodological toolkit for studying texts and writing in Internet communities. The empirical chapters approach the problem of community and its membership from four complementary points of view. The data comprise mailing list discussions, personal interviews, web page writings, email exchanges, field notes and other historical documents. The four viewpoints are: 1) the community as conceived by volunteers; 2) the individual contributor's attachment to the project; 3) public sector organizations as users of open source; and 4) the community as articulated by the community manager. I arrive at four conclusions concerning my empirical studies (1-4) and two general conclusions (5-6). 1) Sun Microsystems and the OpenOffice.org Groupware volunteers failed to develop the necessary and sufficient open code and open dialogue to ensure collaboration, thus splitting the Groupware community into the volunteers ("we") and the firm ("them"). 2) Instead of separating intrinsic and extrinsic motivations, I find that volunteers' unique patterns of motivation are tied to changing objects and personal histories prior to and during participation in the OpenOffice.org Lingucomponent project.
Rather than being seen as a unified community, volunteers can be better understood as independent entrepreneurs in search of a "collaborative community". The boundaries between work and hobby are blurred and shifting, thus questioning the usefulness of the concept of "volunteer". 3) The public sector ICT discourse portrays a dilemma and tension between the freedom to choose, use and develop one's desktop in the spirit of open source, on the one hand, and the striving for better desktop control and maintenance by IT staff and user advocates, on the other. The link between the global OpenOffice.org community and local end-user practices is weak and mediated by the problematic IT staff-(end)user relationship. 4) Authoring the community can be seen as a new type of managerial practice in hybrid open source communities. The ambiguous concept of community is a powerful strategic tool for orienting towards multiple real and imaginary audiences, as evidenced in the global membership rhetoric. 5) The changing and contradictory discourses of this study show a change in the conceptual system and the developer-user relationship of the open development model. This change is characterized as a movement from a "hacker ethic" and "bazaar governance" to a more professionally and strategically regulated community. 6) The community is simultaneously real and imagined, and can be characterized as a "runaway community". Discursive action can be seen as a specific type of online open source engagement. Hierarchies and structures are created through discursive acts. Keywords: Open Source Software, open development model, community, motivation, discourse, rhetoric, developer, user, end-user

Relevance: 20.00%

Publisher:

Abstract:

We propose to compress weighted graphs (networks), motivated by the observation that large networks of social, biological, or other relations can be complex to handle and visualize. In the process, also known as graph simplification, nodes and (unweighted) edges are grouped into supernodes and superedges, respectively, to obtain a smaller graph. We propose models and algorithms for weighted graphs. The interpretation (i.e., decompression) of a compressed, weighted graph is that a pair of original nodes is connected by an edge if their supernodes are connected by one, and that the weight of an edge is approximated by the weight of the superedge. The compression problem now consists of choosing supernodes, superedges, and superedge weights so that the approximation error is minimized while the amount of compression is maximized. In this paper, we formulate this task as the 'simple weighted graph compression problem'. We then propose a much wider class of tasks under the name of the 'generalized weighted graph compression problem'. The generalized task extends the optimization to preserve longer-range connectivities between nodes, not just individual edge weights. We study the properties of these problems and propose a range of algorithms to solve them, with different balances between complexity and quality of the result. We evaluate the problems and algorithms experimentally on real networks. The results indicate that weighted graphs can be compressed efficiently with relatively little compression error.
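
As a rough illustration of the simple weighted graph compression problem described above (not the authors' algorithms), the Python sketch below greedily merges supernodes, sets each superedge weight to the mean of the original weights it covers, and measures the approximation error as the summed absolute weight difference over all original node pairs. The toy weight matrix, the greedy strategy, and the L1 error measure are assumptions made for the example.

```python
import itertools

def compression_error(W, groups):
    """Sum of |original weight - superedge weight| over all original node pairs,
    where each superedge weight is the mean of the original weights it covers."""
    label = {v: g for g, members in enumerate(groups) for v in members}
    sums, counts = {}, {}
    pairs = list(itertools.combinations(range(len(W)), 2))
    for i, j in pairs:
        key = tuple(sorted((label[i], label[j])))
        sums[key] = sums.get(key, 0.0) + W[i][j]
        counts[key] = counts.get(key, 0) + 1
    return sum(abs(W[i][j] - sums[tuple(sorted((label[i], label[j])))] /
                   counts[tuple(sorted((label[i], label[j])))])
               for i, j in pairs)

def greedy_compress(W, target_supernodes):
    """Repeatedly merge the pair of supernodes whose merge increases the error least."""
    groups = [[v] for v in range(len(W))]
    while len(groups) > target_supernodes:
        best = None
        for a, b in itertools.combinations(range(len(groups)), 2):
            merged = [g for k, g in enumerate(groups) if k not in (a, b)]
            merged.append(groups[a] + groups[b])
            err = compression_error(W, merged)
            if best is None or err < best[0]:
                best = (err, merged)
        groups = best[1]
    return groups

# 4-node toy weighted graph given as a symmetric weight matrix (0 = no edge).
W = [[0, 5, 5, 1],
     [5, 0, 4, 1],
     [5, 4, 0, 1],
     [1, 1, 1, 0]]
print(greedy_compress(W, 2))   # groups the three strongly linked nodes together
```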

Relevance: 20.00%

Publisher:

Abstract:

Fallibility is inherent in human cognition, and so a system that monitors performance is indispensable. While behavioral evidence for such a system derives from the finding that subjects slow down after trials that are likely to produce errors, the neural and behavioral characterization of the system that enables such control is incomplete. Here, we report a specific role for dopamine/basal ganglia in response conflict by assessing deficits in performance monitoring in patients with Parkinson's disease. To characterize such a deficit, we used a modification of the oculomotor countermanding task to show that slowing of responses that generate robust response conflict, and not post-error slowing per se, is deficient in Parkinson's disease patients. Poor performance adjustment could be due either to an impaired ability to slow reaction times (RTs) following conflict or to impaired recognition of response conflict. If the latter hypothesis were true, then PD subjects should show evidence of impaired error detection/correction, which was found to be the case. These results make a strong case for impaired performance monitoring in Parkinson's patients.

Relevance: 20.00%

Publisher:

Abstract:

In spite of the numerous research advances made in recent years in the area of formal techniques, the specification of real-time systems is still proving to be a very challenging and difficult problem. In this context, this paper critically examines state-of-the-art specification techniques for real-time systems and analyzes the emerging trends.

Relevance: 20.00%

Publisher:

Abstract:

Our study concerns an important current problem, that of diffusion of information in social networks. This problem has received significant attention from the Internet research community in recent times, driven by many potential applications such as viral marketing and sales promotions. In this paper, we focus on the target set selection problem, which involves discovering a small subset of influential players in a given social network to perform a certain task of information diffusion. The target set selection problem manifests in two forms: 1) the top-k nodes problem and 2) the lambda-coverage problem. In the top-k nodes problem, we are required to find a set of k key nodes that would maximize the number of nodes being influenced in the network. The lambda-coverage problem is concerned with finding a minimal-size set of key nodes that can influence a given percentage lambda of the nodes in the entire network. We propose a new way of solving these problems using the concept of the Shapley value, a well-known solution concept in cooperative game theory. Our approach leads to algorithms which we call the ShaPley value-based Influential Nodes (SPIN) algorithms for solving the top-k nodes problem and the lambda-coverage problem. We compare the performance of the proposed SPIN algorithms with well-known algorithms in the literature. Through extensive experimentation on four synthetically generated random graphs and six real-world data sets (Celegans, Jazz, the NIPS coauthorship data set, the Netscience data set, the High-Energy Physics data set, and the Political Books data set), we show that the proposed SPIN approach is more powerful and computationally efficient. Note to Practitioners: In recent times, social networks have received a high level of attention due to their proven ability to improve the performance of web search, recommendations in collaborative filtering systems, the spreading of a technology in the market using viral marketing techniques, etc. It is well known that the interpersonal relationships (or ties or links) between individuals cause change or improvement in the social system, because the decisions made by individuals are influenced heavily by the behavior of their neighbors. An interesting and key problem in social networks is to discover the most influential nodes in the social network, which can influence other nodes in a strong and deep way. This problem is called the target set selection problem and has two variants: 1) the top-k nodes problem, where we are required to identify a set of k influential nodes that maximize the number of nodes being influenced in the network, and 2) the lambda-coverage problem, which involves finding a set of influential nodes of minimum size that can influence a given percentage lambda of the nodes in the entire network. There are many existing algorithms in the literature for solving these problems. In this paper, we propose a new algorithm which is based on a novel interpretation of information diffusion in a social network as a cooperative game. Using this analogy, we develop an algorithm based on the Shapley value of the underlying cooperative game. The proposed algorithm outperforms the existing algorithms in terms of generality or computational complexity or both. Our results are validated through extensive experimentation on both synthetically generated and real-world data sets.
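
The Python sketch below illustrates only the general idea of ranking nodes by approximate Shapley values computed from random permutations; the coalition value used here is a simple one-hop coverage proxy rather than the diffusion-model game of the paper, and the graph, sample count, and function names are hypothetical.

```python
import random

# Toy undirected graph as an adjacency list (made-up data).
GRAPH = {
    0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3],
    3: [1, 2, 4], 4: [3, 5], 5: [4],
}

def coverage(coalition):
    """Coalition value: number of nodes reached within one hop of the coalition.
    A simple influence proxy, not the diffusion model used in the paper."""
    reached = set(coalition)
    for v in coalition:
        reached.update(GRAPH[v])
    return len(reached)

def monte_carlo_shapley(num_samples=2000, seed=0):
    """Approximate Shapley values by averaging marginal contributions
    over random permutations of the nodes."""
    rng = random.Random(seed)
    nodes = list(GRAPH)
    shapley = {v: 0.0 for v in GRAPH}
    for _ in range(num_samples):
        rng.shuffle(nodes)
        coalition, prev_value = [], 0
        for v in nodes:
            coalition.append(v)
            value = coverage(coalition)
            shapley[v] += value - prev_value
            prev_value = value
    return {v: s / num_samples for v, s in shapley.items()}

def top_k(k):
    """Rank nodes by approximate Shapley value and return the k highest."""
    sv = monte_carlo_shapley()
    return sorted(sv, key=sv.get, reverse=True)[:k]

print(top_k(2))
```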

Relevance: 20.00%

Publisher:

Abstract:

Fiber Bragg grating (FBG) sensors have been widely used for a number of sensing applications, such as temperature, pressure, acousto-ultrasonic, static and dynamic strain, and refractive index change measurements. The present work demonstrates the use of FBG sensors for in-situ monitoring of a vacuum process with simultaneous leak detection capability. Experiments were conducted in a bell jar vacuum chamber fitted with a conventional Pirani gauge for vacuum measurement. Three different experiments were conducted to validate the performance of the FBG sensor in monitoring the evacuation process and air bleeding. The preliminary results of the FBG sensors in vacuum monitoring have been compared with those of a commercial Pirani gauge sensor. This novel technique offers a simple alternative to the conventional method for real-time monitoring of the evacuation process. The proposed FBG-based vacuum sensor has potential applications in vacuum systems involving hazardous environments, such as chemical and gas plants, automobile industries, aeronautical establishments, and leak monitoring in process industries, where electrical or MEMS-based sensors are prone to explosion and corrosion.
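
For background on how an FBG transduces its environment (standard fiber-optics theory, not specific to the vacuum experiments above): the grating reflects a narrow band centered at the Bragg wavelength, and strain and temperature shift that wavelength approximately linearly.

```latex
% Bragg condition and first-order strain/temperature sensitivity of an FBG
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda ,
\qquad
\frac{\Delta\lambda_B}{\lambda_B} \approx (1 - p_e)\,\varepsilon + (\alpha + \xi)\,\Delta T
```

Here n_eff is the effective refractive index of the fiber core, Λ the grating period, p_e the effective photo-elastic coefficient, ε the axial strain, α the thermal expansion coefficient, and ξ the thermo-optic coefficient. In the vacuum-monitoring experiments above, the FBG presumably responds through such strain and temperature changes during evacuation; the abstract does not detail the transduction mechanism.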

Relevance: 20.00%

Publisher:

Abstract:

We consider the following question: Let S_1 and S_2 be two smooth, totally real surfaces in C^2 that contain the origin. If the union of their tangent planes is locally polynomially convex at the origin, then is S_1 ∪ S_2 locally polynomially convex at the origin? If T_0S_1 ∩ T_0S_2 = {0}, then it is a folk result that the answer is yes. We discuss an obstruction to the presumed proof, and provide a different approach. When dim_R(T_0S_1 ∩ T_0S_2) = 1, we present a geometric condition under which no consistent answer to the above question exists. We then discuss conditions under which we can expect local polynomial convexity.

Relevance: 20.00%

Publisher:

Abstract:

A new class of nets, called S-nets, is introduced for the performance analysis of scheduling algorithms used in real-time systems. Deterministic timed Petri nets do not adequately model the scheduling of resources encountered in real-time systems, and need to be augmented with resource places, signal places, and a scheduler block to facilitate the modeling of scheduling algorithms. The tokens are colored, and the transition firing rules are suitably modified. Further, the concept of transition folding is used to obtain intuitively simple models of multiframe real-time systems. Two generic performance measures, called the "load index" and the "balance index," which characterize the resource utilization and the uniformity of workload distribution, respectively, are defined. The utility of S-nets for evaluating heuristic-based scheduling schemes is illustrated by considering three heuristics for real-time scheduling. S-nets are useful in tuning the hardware configuration and the underlying scheduling policy, so that the system utilization is maximized and the workload distribution among the computing resources is balanced.
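
The abstract does not reproduce the paper's definitions of the load and balance indices. Purely as a hypothetical illustration of such measures, the sketch below computes per-resource utilizations from busy times and derives a load figure (mean utilization) and a balance figure (one minus the coefficient of variation of the utilizations); these formulas and names are assumptions, not the S-net definitions.

```python
import statistics

def utilizations(busy_times, horizon):
    """Fraction of the scheduling horizon each resource was busy."""
    return [b / horizon for b in busy_times]

def load_index(utils):
    """Hypothetical load measure: mean utilization across resources."""
    return sum(utils) / len(utils)

def balance_index(utils):
    """Hypothetical balance measure: 1 - coefficient of variation of the
    per-resource utilizations (1.0 means a perfectly even workload)."""
    mean = sum(utils) / len(utils)
    if mean == 0:
        return 1.0
    return max(0.0, 1.0 - statistics.pstdev(utils) / mean)

# Busy times (ms) of three processors over a 100 ms frame (made-up numbers).
utils = utilizations([80.0, 55.0, 70.0], horizon=100.0)
print(f"load index = {load_index(utils):.2f}, balance index = {balance_index(utils):.2f}")
```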

Relevance: 20.00%

Publisher:

Abstract:

Biomedical engineering solutions such as surgical simulators need High Performance Computing (HPC) to achieve real-time performance. Graphics Processing Units (GPUs) offer HPC capabilities at low cost and low power consumption. In this work, it is demonstrated that a liver discretized by about 2500 finite element nodes can be graphically simulated in real time using a GPU. The present work takes into consideration the time needed for the data transfer from CPU to GPU and back from GPU to CPU. Although the behaviour of the liver is very complicated, the present computer simulation assumes linear elastostatics. The commercial software ANSYS is used to obtain the global stiffness matrix of the liver. Results show that GPUs are useful for the real-time graphical simulation of the liver, which in turn is needed in simulators used for training surgeons in laparoscopic surgery. Although the computer simulation should also involve rendering, neither rendering nor the time needed for rendering and displaying the liver on a screen is considered in the present work. The present work is a demonstration of a concept; the concept is not fully implemented and validated. Future work is to develop software that can accomplish real-time and very realistic graphical simulation of the liver, with the rendered image of the liver on the screen changing in real time according to the position of the surgical tool tip, approximated as the mouse cursor in 3D.
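
The abstract describes the pipeline (assemble the stiffness matrix offline with ANSYS, then compute displacements on the GPU each frame, counting CPU-GPU transfer time). The CuPy sketch below shows one minimal way to organize that split under linear elastostatics: invert a precomputed stiffness matrix once, then reduce the per-frame work to a GPU matrix-vector product plus the host transfers. The random placeholder matrix, the reduced node count, and the use of CuPy are assumptions, not the authors' implementation.

```python
import numpy as np
import cupy as cp  # assumes a CUDA-capable GPU with CuPy installed

n_nodes = 500                 # the paper's model uses ~2500 nodes; reduced here
n_dof = 3 * n_nodes           # 3 displacement degrees of freedom per node

# Placeholder for the global stiffness matrix K (in practice exported from an
# FE package such as ANSYS); a random SPD matrix is used purely as a stand-in.
A = np.random.rand(n_dof, n_dof).astype(np.float32)
K = A @ A.T + n_dof * np.eye(n_dof, dtype=np.float32)

# One-time (offline) work: move K to the GPU and invert it.
K_inv_gpu = cp.linalg.inv(cp.asarray(K))

def solve_displacements(forces):
    """Per-frame (online) work: transfer the load vector to the GPU,
    do a single matrix-vector product, and copy the displacements back."""
    f_gpu = cp.asarray(forces.astype(np.float32))
    u_gpu = K_inv_gpu @ f_gpu
    return cp.asnumpy(u_gpu)

# Example frame: a unit point load applied to one degree of freedom.
f = np.zeros(n_dof, dtype=np.float32)
f[123] = 1.0
print(solve_displacements(f)[:6])
```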

Relevance: 20.00%

Publisher:

Abstract:

Real-time simulation of deformable solids is essential for applications such as biological organ simulation in surgical simulators. In this work, deformable solids are approximated as linear elastic, and an easy and straightforward numerical technique, the Finite Point Method (FPM), is used to model three-dimensional linear elastostatics. A Graphics Processing Unit (GPU) is used to accelerate the computations. Results show that the Finite Point Method, together with a GPU, can compute three-dimensional linear elastostatic responses of solids at rates suitable for real-time graphics, for solids represented by a reasonable number of points.

Relevance: 20.00%

Publisher:

Abstract:

Let D denote the open unit disk in C centered at 0. Let H_R^infinity denote the set of all bounded holomorphic functions f on D that also satisfy f(z) = conj(f(z̄)) for all z in D (equivalently, all Taylor coefficients of f at 0 are real). It is shown that H_R^infinity is a coherent ring.

Relevance: 20.00%

Publisher:

Abstract:

A scheme is proposed for applying rate-1 real orthogonal designs (RODs) in relay networks, with single real-symbol decodability of the symbols at the destination, for any arbitrary number of relays. In the case where the relays do not have any information about the channel gains from the source to themselves, the best known distributed space-time block codes (DSTBCs) for k relays with single real-symbol decodability offer an overall rate of complex symbols per channel use. The scheme proposed in this paper offers an overall rate of 2/(2+k) complex symbols per channel use, which is independent of the number of relays. Furthermore, in the scenario where the relays have partial channel information in the form of channel phase knowledge, the best known DSTBCs with single real-symbol decodability offer an overall rate of 1/3 complex symbols per channel use. In this paper, making use of RODs, a scheme is presented which achieves the same overall rate of 1/3 complex symbols per channel use but with a decoding delay that is 50 percent of that of the best known DSTBCs. Simulation results of the symbol error rate performance for 10 relays, which show the superiority of the proposed scheme over the best known DSTBC for 10 relays with single real-symbol decodability, are provided.
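
For reference, a rate-1 real orthogonal design in four real symbols is shown below with a short NumPy check of the property that makes each real symbol separately decodable, namely X^T X = (x1^2 + x2^2 + x3^2 + x4^2) I. How such a design is distributed across the relays, and the resulting overall rates, are specific to the paper and not sketched here.

```python
import numpy as np

def rod4(x):
    """Rate-1 real orthogonal design in four real symbols (4 x 4)."""
    x1, x2, x3, x4 = x
    return np.array([
        [ x1,  x2,  x3,  x4],
        [-x2,  x1, -x4,  x3],
        [-x3,  x4,  x1, -x2],
        [-x4, -x3,  x2,  x1],
    ])

x = np.array([0.7, -1.2, 0.3, 2.0])
X = rod4(x)

# Column orthogonality: X^T X = (sum of x_i^2) * I, the algebraic property
# behind single real-symbol decodability.
print(np.allclose(X.T @ X, np.sum(x**2) * np.eye(4)))
```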

Relevance: 20.00%

Publisher:

Abstract:

A framework based on the notion of "conflict-tolerance" was proposed as a compositional methodology for developing and reasoning about systems that comprise multiple independent controllers. A central notion in this framework is that of a "conflict-tolerant" specification for a controller. In this work we propose a way of defining conflict-tolerant real-time specifications in Metric Interval Temporal Logic (MITL). We call our logic CT-MITL, for Conflict-Tolerant MITL. We then give a clock-optimal "delay-then-extend" construction for building a timed transition system for monitoring past-MITL formulas. We show how this monitoring transition system can be used to solve the associated verification and synthesis problems for CT-MITL.
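
The delay-then-extend construction itself is not reproduced in the abstract. As a much simpler illustration of monitoring a single bounded past-time operator over a timed trace, the sketch below checks at each observation whether a proposition p has held within the last d time units; the trace format and the operator choice are assumptions made for the example, not CT-MITL monitoring.

```python
def monitor_once_within(trace, d):
    """For each (timestamp, p) observation, report whether p was true at some
    earlier or current observation within the last d time units
    (a bounded 'once in the past' operator)."""
    last_p_time = None
    verdicts = []
    for t, p in trace:
        if p:
            last_p_time = t
        verdicts.append(last_p_time is not None and t - last_p_time <= d)
    return verdicts

# Timed trace: (time in seconds, truth value of proposition p).
trace = [(0.0, False), (1.2, True), (2.5, False), (6.0, False), (6.8, True)]
print(monitor_once_within(trace, d=3.0))   # [False, True, True, False, True]
```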

Relevance: 20.00%

Publisher:

Abstract:

The real-time kinetics of ligand-ligate interactions have predominantly been studied by either fluorescence- or surface plasmon resonance-based methods. Almost all such studies are based on association between the ligand and the ligate. This paper reports our analysis of dissociation data for a monoclonal antibody-antigen (hCG) system, using radio-iodinated hCG as a probe and nitrocellulose as a solid support to immobilize the mAb. The data were analyzed quantitatively for a one-step and a two-step model, and they fit the two-step model well. We also found that a fraction of what is bound is non-dissociable (the tight-binding portion, TBP). The TBP was neither an artifact of immobilization nor did it interfere with the analysis. It was also present when the reaction was carried out in homogeneous solution in the liquid phase. The rate constants obtained from the two methods were comparable. The work reported here shows that the real-time kinetics of other ligand-ligate interactions can be studied using nitrocellulose as a solid support.
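
To make the two-step analysis concrete, the sketch below fits a double-exponential dissociation curve with a constant plateau (standing in for the non-dissociable tight-binding portion) to synthetic data using SciPy. The model form, parameter values, and data are assumptions for illustration, not the paper's model or measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_step_dissociation(t, a1, k1, a2, k2, c):
    """Bound signal versus time: two dissociating components with rate
    constants k1 and k2, plus a non-dissociable plateau c (the TBP)."""
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t) + c

# Synthetic dissociation data (arbitrary units, not the paper's measurements).
t = np.linspace(0, 120, 25)                       # minutes
rng = np.random.default_rng(1)
y = two_step_dissociation(t, 0.40, 0.15, 0.35, 0.02, 0.25)
y += rng.normal(0.0, 0.01, t.size)

p0 = [0.5, 0.1, 0.3, 0.01, 0.2]                   # initial parameter guesses
popt, _ = curve_fit(two_step_dissociation, t, y, p0=p0, bounds=(0, np.inf))
a1, k1, a2, k2, c = popt
print(f"k1 = {k1:.3f}/min, k2 = {k2:.3f}/min, tight-binding fraction = {c:.2f}")
```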