17 results for Licensing Agreements
at Indian Institute of Science - Bangalore - India
Abstract:
During the last decade, developing countries such as India have exhibited a rapid increase in human population and vehicle numbers, together with a rise in road accidents. Inappropriate driving behaviour is considered one of the major causes of road accidents in India, compared to defective geometric design of pavement or mechanical defects in vehicles. It can result in conditions such as lack of lane discipline, disregard for traffic laws, frequent traffic violations, and an increase in crashes due to self-centred driving. It also demotivates educated drivers from following good driving practices. Hence, improved driver behaviour can be an effective countermeasure to reduce the vulnerability of road users and inhibit crash risks. This article highlights improved driver behaviour, achieved through better driver education, driver training and licensing procedures along with good on-road enforcement, as an effective countermeasure to ensure road safety in India. Based on the review and analysis, the article also recommends measures pertaining to driver licensing and traffic law enforcement in India aimed at improving road safety.
Abstract:
The paper presents a method for transmission loss charge allocation in deregulated power systems based on the Relative Electrical Distance (RED) concept. Charge evaluation is carried out based on the RED between generator and load nodes and the predefined bilateral power contracts. Generally, a set of bilateral contracts facilitating agreements between generation and distribution entities is determined through some power exchange mechanism. In this paper, the charges incurred in meeting loads, namely the generation charge, the transmission charge and the charge due to losses, are evaluated. Case studies have been carried out on a few practical equivalent systems. Owing to space limitations, results for a sample 5-bus system are presented, considering both ideal and deviated load/generation power contracts. Extensive numerical testing indicates that the proposed allocation scheme produces loss allocations that are appropriate and behave in a physically reasonable manner.
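To make the allocation idea concrete, the following is a minimal, hypothetical sketch of apportioning a total loss charge across bilateral contracts, with each contract weighted by its contracted power and an assumed RED-derived factor. It is illustrative only and does not reproduce the paper's actual RED formulation; all names and numbers are made up.

```python
# Illustrative sketch only: split a total transmission-loss charge across
# bilateral contracts, weighting each contract by its power and an assumed
# relative-electrical-distance (RED) factor. Data and weighting are hypothetical.

def allocate_loss_charge(contracts, red, total_loss_charge):
    """contracts: list of (gen, load, power_mw); red[(gen, load)]: assumed RED factor."""
    weights = {c: c[2] * red[(c[0], c[1])] for c in contracts}
    total = sum(weights.values())
    return {c: total_loss_charge * w / total for c, w in weights.items()}

contracts = [("G1", "L1", 100.0), ("G2", "L1", 50.0), ("G1", "L2", 80.0)]
red = {("G1", "L1"): 0.2, ("G2", "L1"): 0.5, ("G1", "L2"): 0.4}
print(allocate_loss_charge(contracts, red, total_loss_charge=1000.0))
```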
Abstract:
Active regions on the solar surface are known to possess magnetic helicity, which is predominantly negative in the northern hemisphere and positive in the southern hemisphere. Choudhuri et al. [Choudhuri, A.R. On the connection between mean field dynamo theory and flux tubes. Solar Phys. 215, 31–55, 2003] proposed that the magnetic helicity arises due to the wrapping of the poloidal field of the convection zone around rising flux tubes which form active regions. Choudhuri et al. [Choudhuri, A.R., Chatterjee, P., Nandy, D. Helicity of solar active regions from a dynamo model. ApJ 615, L57–L60, 2004] used this idea to calculate magnetic helicity from their solar dynamo model. Apart from obtaining broad agreement with observational data, they also predict that the hemispheric helicity rule may be violated at the beginning of a solar cycle. Chatterjee et al. [Chatterjee, P., Choudhuri, A.R., Petrovay, K. Development of twist in an emerging magnetic flux tube by poloidal field accretion. A&A 449, 781–789, 2006] study the penetration of the wrapped poloidal field into the rising flux tube due to turbulent diffusion using a simple 1-D model. They find that the extent of penetration of the wrapped field depends on how weak the magnetic field inside the rising flux tube becomes before its emergence. They conclude that more detailed observational data will throw light on the physical conditions of flux tubes just before their emergence at the photosphere.
Abstract:
Business processes and application functionality are becoming available as internal web services inside enterprise boundaries, as well as commercial web services from enterprise solution vendors and web services marketplaces. Typically, multiple web service providers offer services capable of fulfilling a particular functionality, although with different Quality of Service (QoS). Dynamic creation of business processes requires composing an appropriate set of web services that best suits the current need. This paper presents a novel combinatorial auction approach to QoS-aware dynamic web services composition. Such an approach enables not only stand-alone web services but also composite web services to be part of a business process. The combinatorial auction leads to an integer programming formulation for the web services composition problem. An important feature of the model is the incorporation of service level agreements. We describe QWESC, a software tool for QoS-aware web services composition based on the proposed approach.
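As a rough illustration of the winner-determination problem behind such a combinatorial auction, the sketch below selects one provider bid per required task so that total cost is minimized while an assumed end-to-end response-time budget (standing in for the service level agreement) is respected. The tasks, bids and budget are hypothetical, and exhaustive search stands in for the integer-programming solver used in the paper.

```python
# Illustrative sketch (not the QWESC formulation): choose one bid per task,
# minimizing total cost subject to an assumed end-to-end latency budget.
from itertools import product

tasks = ["book_flight", "book_hotel"]
# bids[task] = list of (provider, cost, response_time_ms) -- hypothetical data
bids = {
    "book_flight": [("A", 5.0, 300), ("B", 4.0, 700)],
    "book_hotel":  [("C", 3.0, 400), ("D", 2.5, 900)],
}
SLA_BUDGET_MS = 1200  # assumed budget on total response time

best = None
for combo in product(*(bids[t] for t in tasks)):
    cost = sum(bid[1] for bid in combo)
    latency = sum(bid[2] for bid in combo)
    if latency <= SLA_BUDGET_MS and (best is None or cost < best[0]):
        best = (cost, combo)

print(best)  # cheapest feasible combination of provider bids
```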
Abstract:
The move towards IT outsourcing is the first step towards an environment where compute infrastructure is treated as a service. In utility computing, this IT service has to honor Service Level Agreements (SLAs) in order to meet the desired Quality of Service (QoS) guarantees. Such an environment requires reliable services in order to maximize the utilization of resources and to decrease the Total Cost of Ownership (TCO). This reliability cannot come at the cost of resource duplication, since duplication increases the TCO of the data center and hence the cost per compute unit. In this paper, we look into projecting the impact of hardware failures on SLAs and the techniques required to take proactive recovery steps in case of a predicted failure. By maintaining health vectors of all hardware and system resources, we predict the failure probability of resources at runtime, based on observed hardware errors and failure events. This in turn drives an availability-aware middleware to take proactive action, even before the application is affected, in case the system and the application have low recoverability. The proposed framework has been prototyped on a system running HP-UX. Our offline analysis of the prediction system on hardware error logs indicates no more than 10% false positives. To the best of our knowledge, this work is the first of its kind to perform an end-to-end analysis of the impact of a hardware fault on application SLAs in a live system.
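As a rough, hypothetical sketch of the health-vector idea (not the HP-UX prototype itself), the snippet below accumulates per-resource error counts and maps them to a failure probability with an assumed logistic score, flagging a resource for proactive action when the probability crosses a threshold. The error types, weights and calibration constants are invented for illustration.

```python
# Minimal sketch: per-resource "health vectors" of error counts mapped to an
# assumed failure probability; all weights and thresholds are hypothetical.
import math
from collections import defaultdict

weights = {"corrected_mem_err": 0.4, "disk_retry": 0.3, "fan_warning": 0.8}
health = defaultdict(lambda: defaultdict(int))  # resource -> error type -> count

def record_event(resource, error_type):
    health[resource][error_type] += 1

def failure_probability(resource):
    score = sum(weights.get(e, 0.1) * n for e, n in health[resource].items())
    return 1.0 / (1.0 + math.exp(-(score - 3.0)))  # assumed calibration

record_event("node7", "corrected_mem_err")
record_event("node7", "corrected_mem_err")
record_event("node7", "fan_warning")
p = failure_probability("node7")
if p > 0.5:
    print(f"node7: predicted failure (p={p:.2f}); trigger proactive migration")
else:
    print(f"node7: no proactive action needed (p={p:.2f})")
```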
Abstract:
Analysts have identified four related questions that need to be asked and answered before agreements to respond to global warming will be possible [1]. Which countries bear responsibility for causing the problem? What quantities and mix of greenhouse gases should each country be allowed to emit? Which countries have the resources to do something about the problem? Where are the best opportunities for undertaking projects to respond to the problem? Failure to distinguish among these four questions, or willingness to accept superficial answers, promotes unnecessary controversy.
Abstract:
A simple yet accurate equivalent circuit model was developed for the analysis of the slow-wave properties (dispersion and interaction impedance characteristics) of a rectangular folded-waveguide slow-wave structure. The present formulation includes the effects of the beam-hole in the circuit, which were ignored in existing approaches. The analysis was benchmarked against measurement as well as against 3D electromagnetic modeling using MAFIA for two typical slow-wave structures operating in the Ka- and Q-bands, and close agreement was observed. The analysis was extended to demonstrate the effect of variation of the beam-hole radius on the RF interaction efficiency of the device. (C) 2009 Elsevier GmbH. All rights reserved.
Abstract:
Electronic, magnetic, and structural properties of graphene flakes depend sensitively on the type of edge atoms. We present a simple software tool for determining the type of edge atoms in a honeycomb lattice. The algorithm is based on nearest-neighbor counting. Whether an edge atom is of armchair or zigzag type is decided by the unique pattern of its nearest neighbors. Particular attention is paid to the practical aspects of using the tool, as additional features such as extracting the edges from the lattice could help in analyzing images from transmission microscopy or other experimental probes. Ultimately, the tool, in combination with density-functional theory or tight-binding methods, can also be helpful in correlating the properties of graphene flakes with different armchair-to-zigzag ratios.
Program summary
Program title: edgecount
Catalogue identifier: AEIA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIA_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 66685
No. of bytes in distributed program, including test data, etc.: 485381
Distribution format: tar.gz
Programming language: FORTRAN 90/95
Computer: Most UNIX-based platforms
Operating system: Linux, Mac OS
Classification: 16.1, 7.8
Nature of problem: Detection and classification of edge atoms in a finite patch of honeycomb lattice.
Solution method: Build nearest neighbor (NN) list; assign types to edge atoms on the basis of their NN pattern.
Running time: Typically of the order of seconds for all examples.
(C) 2010 Elsevier B.V. All rights reserved.
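A minimal sketch of nearest-neighbor-based edge classification in the spirit of the rule described above (though not necessarily identical to edgecount's pattern matching): atoms with three neighbors are bulk, and a two-coordinated atom is labeled armchair if one of its neighbors is also two-coordinated, otherwise zigzag. The coordinates, cutoff and bond length are assumed for illustration.

```python
# Hedged sketch of NN-counting edge classification on a honeycomb patch.
# Bond length is taken as 1.0; dangling (0- or 1-coordinated) atoms are not
# treated specially and fall into the "zigzag" branch here.
from math import dist, cos, sin, pi

def classify_edges(coords, cutoff=1.2):
    nn = [[j for j in range(len(coords))
           if j != i and dist(coords[i], coords[j]) < cutoff]
          for i in range(len(coords))]
    labels = {}
    for i, neigh in enumerate(nn):
        if len(neigh) >= 3:
            labels[i] = "bulk"
        elif any(len(nn[j]) == 2 for j in neigh):
            labels[i] = "armchair"
        else:
            labels[i] = "zigzag"
    return labels

# Toy example: a single hexagon (every atom is 2-coordinated).
hexagon = [(cos(k * pi / 3), sin(k * pi / 3)) for k in range(6)]
print(classify_edges(hexagon))
```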
Abstract:
The amount of reactive power margin available in a system determines its proximity to voltage instability under normal and emergency conditions. The greater the reactive power margin, the better the system's security, and vice versa. A hypothetical way of improving the reactive margin of a synchronous generator is to reduce its real power generation within its mega volt-ampere (MVA) rating. This reduction in real power generation will, however, affect the power contract agreements it has entered into in the electricity market. Owing to this, the benefit that the generator forgoes will have to be compensated by paying it a lost opportunity cost. The objective of this study is threefold. Firstly, the reactive power margins of the generators are evaluated. Secondly, they are improved using a reactive power optimization technique and optimally placed unified power flow controllers. Thirdly, the reactive power capacity exchanges along the tie-lines are evaluated under base-case and improved conditions. A detailed analysis of all the reactive power sources and sinks scattered throughout the network is carried out in the study. Studies are conducted on a real-life, three-zone, 72-bus equivalent of the Indian southern grid, considering normal and contingency conditions, with the base-case operating point and optimised results presented.
Abstract:
In this paper we develop an analytical heat transfer model capable of analyzing cyclic melting and solidification processes of a phase change material used in the context of electronics cooling systems. The model is essentially based on conduction heat transfer, with treatments for convection and radiation embedded within it. The whole solution domain is first divided into two main sub-domains, namely the melting sub-domain and the solidification sub-domain. Each sub-domain is then analyzed over a number of temporal regimes. Accordingly, analytical solutions for the temperature distribution within each sub-domain are formulated either using a semi-infinite-medium consideration or employing a quasi-steady-state method, depending on applicability. The solution modules are subsequently united, leading to a closed-form solution for the entire problem. The analytical solutions are then compared with experimental and numerical solutions for a benchmark problem quoted in the literature, and excellent agreement is observed.
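For orientation, the classical quasi-steady estimate of a melt-front position in a slab held at a constant wall temperature follows from a linear temperature profile in the liquid layer and the Stefan condition, giving s(t) = sqrt(2 k (Tw - Tm) t / (rho L)). The snippet below evaluates this textbook limit with assumed property values; it is illustrative only and is far simpler than the cyclic multi-regime model developed in the paper.

```python
# Classical quasi-steady melt-front estimate (illustrative; assumed properties).
from math import sqrt

k = 0.2             # liquid PCM thermal conductivity, W/(m K)   (assumed)
rho = 800.0         # density, kg/m^3                            (assumed)
L = 2.0e5           # latent heat of fusion, J/kg                (assumed)
Tw, Tm = 60.0, 45.0 # wall and melting temperatures, deg C       (assumed)

def melt_front(t_seconds):
    # Stefan condition with a linear liquid-layer profile:
    # rho * L * ds/dt = k * (Tw - Tm) / s  ->  s(t) = sqrt(2 k (Tw - Tm) t / (rho L))
    return sqrt(2 * k * (Tw - Tm) * t_seconds / (rho * L))

for t in (60, 600, 3600):
    print(f"t = {t:5d} s  ->  melt front ~ {melt_front(t) * 1000:.2f} mm")
```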
Abstract:
We consider a setting in which several operators offer downlink wireless data access services in a certain geographical region. Each operator deploys several base stations or access points and registers some subscribers. In such a situation, if operators pool their infrastructure and permit subscribers to be served by any of the cooperating operators, then there can be better overall user satisfaction and increased operator revenue. We use coalitional game theory to investigate such resource pooling and cooperation between operators. We use utility functions to model user satisfaction, and show that the resulting coalitional game has the property that if all operators cooperate (i.e., form a grand coalition), then there is an operating point that maximizes the sum utility over the operators while providing the operators revenues such that no subset of operators has an incentive to break away from the coalition. We investigate whether such operating points can result in utility unfairness between users of the various operators. We also study other revenue sharing concepts, namely the nucleolus and the Shapley value. Such investigations throw light on criteria for operators to accept or reject subscribers, based on the service level agreements proposed by them. We also investigate the situation in which only certain subsets of operators may be willing to cooperate.
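As a small worked example of one of the revenue-sharing concepts mentioned above, the snippet below computes the Shapley value of a three-operator coalitional game by averaging each operator's marginal contribution over all orderings in which the grand coalition can form. The value function is hypothetical and is not taken from the paper.

```python
# Shapley value by enumeration for a small, hypothetical 3-operator game.
from itertools import permutations

players = ["Op1", "Op2", "Op3"]
# v(S): assumed sum utility achievable by a coalition S of cooperating operators
v = {frozenset(): 0, frozenset({"Op1"}): 4, frozenset({"Op2"}): 3,
     frozenset({"Op3"}): 2, frozenset({"Op1", "Op2"}): 9,
     frozenset({"Op1", "Op3"}): 7, frozenset({"Op2", "Op3"}): 6,
     frozenset({"Op1", "Op2", "Op3"}): 12}

shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    coalition = frozenset()
    for p in order:
        shapley[p] += v[coalition | {p}] - v[coalition]  # marginal contribution
        coalition = coalition | {p}
for p in shapley:
    shapley[p] /= len(orders)
print(shapley)  # each operator's average marginal contribution (revenue share)
```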
Abstract:
This paper presents an intelligent procurement marketplace for finding the best mix of web services to dynamically compose the business process desired by a web service requester. We develop a combinatorial auction approach that leads to an integer programming formulation for the web services composition problem. The model takes into account the Quality of Service (QoS) and Service Level Agreements (SLAs) to differentiate among multiple service providers capable of fulfilling a given functionality. An important feature of the model is interface-aware composition.
Abstract:
Together with 106 farmers who started growing Jatropha (Jatropha curcas L.) in 2004-2006, this research sought to increase knowledge of the real-life experience of Jatropha farming in the southern Indian states of Tamil Nadu and Andhra Pradesh. Launched as an alternative to diesel in India, Jatropha has been promoted as a non-edible plant that could grow on poor soils, yield oil-rich seeds for the production of biodiesel, and not compete directly with food production. Through interviews with the farmers, information was gathered regarding their socio-economic situation, the implementation and performance of their Jatropha plantations, and their reasons for continuing or discontinuing Jatropha cultivation. Results reveal that 82% of the farmers had converted former cropland to Jatropha cultivation. By 2010, 85% (n = 90) of the farmers who cultivated Jatropha in 2004 had stopped. Cultivating the crop did not give the economic returns the farmers anticipated, mainly due to a lack of information about the crop and its maintenance during cultivation, and due to water scarcity. A majority of the farmers irrigated and applied fertilizer, and even pesticides. Many problems experienced by the farmers were due to limited knowledge about cultivating Jatropha, caused by poor planning and implementation of the national Jatropha program. Extension services, subsidies, and other support were not provided as promised. The farmers who continued cultivation had means of income other than Jatropha and held hopes of a future Jatropha market. The lack of market structures, such as purchase agreements and buyers, as well as a low retail price for the seeds, were frequently stated as barriers to Jatropha cultivation. For Jatropha biodiesel to perform well, efforts are needed to improve yield levels and stability through genetic improvement and drought tolerance, as well as agricultural extension services to support adoption of the crop. Government programs will probably be more effective if implementing biodiesel production is conjoined with stimulating the demand for Jatropha biodiesel. To avoid food-biofuel competition, additional measures may be needed, such as land-use restrictions for Jatropha producers and taxes on biofuels or biofuel feedstocks to improve the competitiveness of the food sector compared to the bioenergy sector. (c) 2012 Society of Chemical Industry and John Wiley & Sons, Ltd
Abstract:
We consider the problem of secure communication in mobile Wireless Sensor Networks (WSNs). Achieving security in WSNs requires robust encryption and authentication standards among the sensor nodes. Severe resource constraints in typical wireless sensor nodes hinder them from achieving key agreements. Past studies have shown that many notable key management schemes do not work well in sensor networks due to their limited capacities. The idea of key predistribution is not feasible considering that the network could scale to millions of nodes. We propose a novel algorithm that provides a robust and secure communication channel in WSNs. Our Double Encryption with Validation Time (DEV) key management protocol works on the basis of timed sessions within which a secure secret key remains valid. A mobile node is used to bootstrap and exchange secure keys among communicating pairs of nodes. Analysis and simulation results show that the DEV key management protocol performs better than the SEV scheme and other related work.
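A highly simplified sketch of the timed-session idea is given below; it is not the DEV protocol itself, and the key derivation, master secret and node names are placeholders. Each pairwise session key carries a validity window, and a key is only usable while that window is open; an expired key must be re-established (in the paper, with the help of the mobile bootstrap node).

```python
# Placeholder sketch of timed session keys (not the DEV protocol).
import time, hmac, hashlib

SESSION_LIFETIME_S = 60.0  # assumed validity window

def establish_session_key(master_secret: bytes, node_a: str, node_b: str):
    now = time.time()
    key = hmac.new(master_secret, f"{node_a}|{node_b}|{int(now)}".encode(),
                   hashlib.sha256).digest()
    return {"key": key, "expires_at": now + SESSION_LIFETIME_S}

def key_is_valid(session) -> bool:
    return time.time() < session["expires_at"]

master = b"placeholder-master-secret-from-mobile-node"
session = establish_session_key(master, "sensor12", "sensor34")
print("session key valid:", key_is_valid(session))
```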
Abstract:
Accurate supersymmetric spectra are required to confront data from direct and indirect searches for supersymmetry. SuSeFLAV is a numerical tool capable of computing supersymmetric spectra precisely for various supersymmetry-breaking scenarios, applicable even in the presence of flavor violation. The program solves the MSSM RGEs with complete 3 x 3 flavor mixing at the 2-loop level and one-loop finite threshold corrections to all MSSM parameters, incorporating the radiative electroweak symmetry breaking conditions. The program also incorporates the Type-I seesaw mechanism with three massive right-handed neutrinos at user-defined mass scales and mixing. It also computes branching ratios of flavor-violating processes such as l_j -> l_i gamma, l_j -> 3 l_i and b -> s gamma, and supersymmetric contributions to flavor-conserving quantities such as (g_mu - 2). A large choice of executables suitable for various operations of the program is provided.
Program summary
Program title: SuSeFLAV
Catalogue identifier: AEOD_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOD_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License
No. of lines in distributed program, including test data, etc.: 76552
No. of bytes in distributed program, including test data, etc.: 582787
Distribution format: tar.gz
Programming language: Fortran 95
Computer: Personal Computer, Work-Station
Operating system: Linux, Unix
Classification: 11.6
Nature of problem: Determination of the masses and mixing of supersymmetric particles within the context of the MSSM with conserved R-parity, with and without the Type-I seesaw. Inter-generational mixing is considered while calculating the mass spectrum. Supersymmetry-breaking parameters are taken as inputs at a high scale specified by the mechanism of supersymmetry breaking. RG equations including full inter-generational mixing are then used to evolve these parameters down to the electroweak breaking scale. The low-energy supersymmetric spectrum is calculated at the scale where successful radiative electroweak symmetry breaking occurs. At the weak scale, the standard model fermion masses and gauge couplings are determined including the supersymmetric radiative corrections. Once the spectrum is computed, the program proceeds to compute various lepton flavor violating observables (e.g., BR(mu -> e gamma), BR(tau -> mu gamma), etc.) at the weak scale.
Solution method: Two-loop RGEs with full 3 x 3 flavor mixing for all supersymmetry-breaking parameters are used to compute the low-energy supersymmetric mass spectrum. An adaptive step-size Runge-Kutta method is used to solve the RGEs numerically between the high scale and the electroweak breaking scale. An iterative procedure is employed to obtain a consistent radiative electroweak symmetry breaking condition. The masses of the supersymmetric particles are computed at 1-loop order. The third-generation SM particles and the gauge couplings are evaluated at 1-loop order including supersymmetric corrections. A further iteration of the full program is employed so that the SM masses and couplings are consistent with the supersymmetric particle spectrum.
Additional comments: Several executables are presented for the user.
Running time: 0.2 s on an Intel(R) Core(TM) i5 CPU 650 with 3.20 GHz.
(c) 2012 Elsevier B.V. All rights reserved.
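As a toy illustration of the adaptive-step Runge-Kutta integration mentioned in the solution method (and not of SuSeFLAV itself), the snippet below runs a schematic one-loop gauge-coupling RGE, dg/dt = b g^3 / (16 pi^2) with t = ln(mu), from a low scale to a high scale using SciPy's adaptive RK45 integrator. The beta coefficient and boundary value are assumed.

```python
# Toy one-loop RGE integrated with an adaptive-step Runge-Kutta method.
import numpy as np
from scipy.integrate import solve_ivp

b = 41.0 / 10.0   # assumed one-loop beta coefficient (U(1)-like, GUT normalization)
g_low = 0.46      # assumed coupling at the low scale

def rge(t, g):
    return b * g**3 / (16 * np.pi**2)

t_low, t_high = np.log(91.19), np.log(2.0e16)  # ln(M_Z) to ln(high scale), in GeV
sol = solve_ivp(rge, (t_low, t_high), [g_low], method="RK45", rtol=1e-8)
print("coupling at the high scale:", sol.y[0, -1])
```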