934 results for CAPM zero-bet
Abstract:
Soil organic carbon sequestration rates over 20 years based on the Intergovernmental Panel on Climate Change (IPCC) methodology were combined with local economic data to determine the potential for soil C sequestration in wheat-based production systems on the Indo-Gangetic Plain (IGP). The C sequestration potential of rice–wheat systems of India on conversion to no-tillage is estimated to be 44.1 Mt C over 20 years. Implementing no-tillage practices in maize–wheat and cotton–wheat production systems would yield an additional 6.6 Mt C. This offset is equivalent to 9.6% of India's annual greenhouse gas emissions (519 Mt C) from all sectors (excluding land use change and forestry), or less than one percent per annum. The economic analysis was summarized as carbon supply curves expressing the total additional C accumulated over 20 years for a price per tonne of carbon sequestered ranging from zero to USD 200. At a carbon price of USD 25 Mg C−1, 3 Mt C (7% of the soil C sequestration potential) could be sequestered over 20 years through the implementation of no-till cropping practices in rice–wheat systems of the Indian States of the IGP, increasing to 7.3 Mt C (17% of the soil C sequestration potential) at USD 50 Mg C−1. Maximum levels of sequestration could be attained with carbon prices approaching USD 200 Mg C−1 for the States of Bihar and Punjab. At this carbon price, a total of 34.7 Mt C (79% of the estimated C sequestration potential) could be sequestered over 20 years across the rice–wheat region of India, with Uttar Pradesh contributing 13.9 Mt C.
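As a rough illustration of how such a carbon supply curve can be assembled, the sketch below accumulates the sequestration potential of regions whose break-even carbon price falls at or below each candidate price. The regional figures are hypothetical placeholders, not values from the study.

```python
# Illustrative sketch of building a carbon supply curve from per-region
# sequestration potentials and break-even carbon prices.
# The numbers below are hypothetical, not taken from the abstract.

def supply_curve(regions, prices):
    """For each carbon price, sum the potential of regions whose
    break-even price is at or below that price."""
    curve = []
    for p in prices:
        total = sum(potential for potential, breakeven in regions if breakeven <= p)
        curve.append((p, total))
    return curve

# (potential in Mt C over 20 years, break-even price in USD per Mg C) -- hypothetical
regions = [(3.0, 25), (4.3, 50), (13.9, 120), (13.5, 200)]

for price, mt_c in supply_curve(regions, prices=[0, 25, 50, 100, 200]):
    print(f"USD {price:>3}/Mg C -> {mt_c:.1f} Mt C sequestered over 20 years")
```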
Abstract:
In this groundbreaking book, acclaimed sociologist and Pulitzer Prize finalist Elliott Currie draws on years of interviews to offer a profound investigation of what has gone wrong for so many “mainstream” American adolescents. Rejecting such predictable answers as TV violence, permissiveness, and inherent evil, Currie links this crisis to a pervasive “culture of exclusion” fostered by a society in which medications trump guidance and a punitive “zero tolerance” approach to adolescent misbehavior has become the norm. Broadening his inquiry, he dissects the changes in middle-class life that stratify the world into "winners" and "losers," imposing an extraordinarily harsh culture—and not just on kids. Vivid, compelling, and deeply empathetic, The Road to Whatever is a stark indictment of a society that has lost the will—or the capacity—to care.
Abstract:
Secrecy of decryption keys is an important prerequisite for the security of any encryption scheme, and compromised private keys must be immediately replaced. Forward Security (FS), introduced to Public Key Encryption (PKE) by Canetti, Halevi, and Katz (Eurocrypt 2003), reduces the damage from compromised keys by guaranteeing confidentiality of messages that were encrypted prior to the compromise event. The FS property was also shown to be achievable in (Hierarchical) Identity-Based Encryption (HIBE) by Yao, Fazio, Dodis, and Lysyanskaya (ACM CCS 2004). Yet, for emerging encryption techniques offering flexible access control to encrypted data by means of functional relationships between ciphertexts and decryption keys, FS protection was not known to exist. In this paper we introduce FS to the powerful setting of Hierarchical Predicate Encryption (HPE), proposed by Okamoto and Takashima (Asiacrypt 2009). Anticipated applications of FS-HPE schemes can be found in searchable encryption and in fully private communication. Considering the dependencies amongst the concepts, our FS-HPE scheme implies forward-secure flavors of Predicate Encryption and (Hierarchical) Attribute-Based Encryption. Our FS-HPE scheme guarantees forward security for plaintexts and for attributes that are hidden in HPE ciphertexts. It further allows delegation of decrypting abilities at any point in time, independent of FS time evolution. It realizes zero-inner-product predicates and is proven adaptively secure under standard assumptions. As the "cross-product" approach taken in FS-HIBE is not directly applicable to the HPE setting, our construction resorts to techniques that are specific to existing HPE schemes and extends them with what can be seen as reminiscent of binary tree encryption from FS-PKE.
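For readers unfamiliar with zero-inner-product predicates, the minimal sketch below shows only the functionality the scheme realizes, not the cryptography: a decryption key for predicate vector v matches a ciphertext with attribute vector x exactly when their inner product is zero. The equality-test encoding is a standard illustration, not part of the paper.

```python
# Minimal sketch of the zero-inner-product predicate underlying inner-product
# predicate encryption: a key for predicate vector v "matches" a ciphertext
# with attribute vector x iff <x, v> = 0. Functionality only, no cryptography.

def inner_product(x, v):
    return sum(a * b for a, b in zip(x, v))

def predicate_matches(attribute_vec, predicate_vec):
    return inner_product(attribute_vec, predicate_vec) == 0

# Encode the equality test "attr == a" as x = (1, attr) and v = (a, -1),
# so <x, v> = a - attr, which is zero exactly when attr == a.
attr = (1, 7)          # ciphertext attribute value 7
pred_eq_7 = (7, -1)
pred_eq_5 = (5, -1)

print(predicate_matches(attr, pred_eq_7))  # True  -> decryption would succeed
print(predicate_matches(attr, pred_eq_5))  # False -> plaintext and attribute stay hidden
```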
Abstract:
Over the last twenty years, the use of open content licences has become increasingly and surprisingly popular. The use of such licences challenges the traditional incentive-based model of exclusive rights under copyright. Instead of providing a means to charge for the use of particular works, what seems important is mitigating potential personal harm to the author and, in some cases, preventing non-consensual commercial exploitation. It is interesting in this context to observe the primacy of what are essentially moral rights over the exclusionary economic rights. The core elements of common open content licences map somewhat closely to continental conceptions of the moral rights of authorship. Most obviously, almost all free software and free culture licences require attribution of authorship. More interestingly, there is a tension between social norms developed in free software communities and those that have emerged in the creative arts over integrity and commercial exploitation. For programmers interested in free software, licence terms that prohibit commercial use or modification are almost completely inconsistent with the ideological and utilitarian values that underpin the movement. For those in the creative industries, on the other hand, non-commercial terms and, to a lesser extent, terms that prohibit all but verbatim distribution continue to play an extremely important role in the sharing of copyright material. While prohibitions on commercial use often serve an economic imperative, there is also a certain personal interest for many creators in avoiding harmful exploitation of their expression – an interest that has sometimes been recognised as forming a component of the moral right of integrity. One particular continental moral right – the right of withdrawal – is present neither in Australian law nor in any of the common open content licences. Despite some marked differences, both free software and free culture participants are using contractual methods to articulate the norms of permissible sharing. Legal enforcement is rare and often prohibitively expensive, and the various communities accordingly rely upon shared understandings of acceptable behaviour. The licences that are commonly used represent a formalised expression of these community norms and provide the theoretically enforceable legal baseline that lends them legitimacy. The core terms of these licences are designed primarily to alleviate risk and minimise transaction costs in sharing and using copyright expression. Importantly, however, the range of available licences reflects different optional balances in the norms of creating and sharing material. Generally, it is possible to see that, stemming particularly from the US, open content licences are fundamentally important in providing a set of normatively accepted copyright balances that reflect the interests sought to be protected through moral rights regimes. As the cost of creation, distribution, storage, and processing of expression continues to fall towards zero, there are increasing incentives to adopt open content licences to facilitate wide distribution and reuse of creative expression. Thinking of these protocols not only as reducing transaction costs but also as setting normative principles of participation assists in conceptualising the role of open content licences and the continuing tensions that permeate modern copyright law.
Abstract:
This study proceeds from a central interest in the importance of systematically evaluating operational large-scale integrated information systems (IS) in organisations. The study is conducted within the IS-Impact Research Track at Queensland University of Technology (QUT). The goal of the IS-Impact Track is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable et al., 2009). The track espouses programmatic research having the principles of incrementalism, tenacity, holism and generalisability through replication and extension research strategies. Track efforts have yielded the bicameral IS-Impact measurement model; the ‘impact’ half includes Organisational-Impact and Individual-Impact dimensions; the ‘quality’ half includes System-Quality and Information-Quality dimensions. Akin to Gregor’s (2006) analytic theory, the IS-Impact model is conceptualised as a formative, multidimensional index and is defined as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (Gable et al., 2008, p. 381). The study adopts the IS-Impact model (Gable et al., 2008) as its core theory base. Prior work within the IS-Impact track has been consciously constrained to Financial IS for their homogeneity. This study adopts a context-extension strategy (Berthon et al., 2002) with the aim "to further validate and extend the IS-Impact measurement model in a new context - i.e. a different IS - Human Resources (HR)". The overarching research question is: "How can the impacts of large-scale integrated HR applications be effectively and efficiently benchmarked?" This managerial question (Cooper & Emory, 1995) decomposes into two more specific research questions – in the new HR context: (RQ1) "Is the IS-Impact model complete?" (RQ2) "Is the IS-Impact model valid as a 1st-order formative, 2nd-order formative multidimensional construct?" The study adhered to the two-phase approach of Gable et al. (2008) to hypothesise and validate a measurement model. The initial ‘exploratory phase’ employed a zero-base qualitative approach to re-instantiating the IS-Impact model in the HR context. The subsequent ‘confirmatory phase’ sought to validate the resultant hypothesised measurement model against newly gathered quantitative data. The unit of analysis for the study is the application, ‘ALESCO’, an integrated large-scale HR application implemented at Queensland University of Technology (QUT), a large Australian university (with approximately 40,000 students and 5,000 staff). Target respondents of both study phases were ALESCO key-user-groups: strategic users, management users, operational users and technical users, who directly use ALESCO or its outputs. An open-ended, qualitative survey was employed in the exploratory phase, with the objective of exploring the completeness and applicability of the IS-Impact model’s dimensions and measures in the new context, and of conceptualising any resultant model changes to be operationalised in the confirmatory phase. Responses from 134 ALESCO users to the main survey question, "What do you consider have been the impacts of the ALESCO (HR) system in your division/department since its implementation?"
were decomposed into 425 ‘impact citations.’ Citation mapping using a deductive (top-down) content analysis approach instantiated all dimensions and measures of the IS-Impact model, evidencing its content validity in the new context. Seeking to probe additional (perhaps negative) impacts, the survey included the additional open question "In your opinion, what can be done better to improve the ALESCO (HR) system?" Responses to this question decomposed into a further 107 citations which in the main did not map to IS-Impact, but rather coalesced around the concept of IS-Support. Deductively drawing from relevant literature, and working inductively from the unmapped citations, the new ‘IS-Support’ construct, including the four formative dimensions (i) training, (ii) documentation, (iii) assistance, and (iv) authorisation (each having reflective measures), was defined as: "a measure at a point in time, of the support, the [HR] information system key-user groups receive to increase their capabilities in utilising the system." Thus, a further goal of the study became validation of the IS-Support construct, suggesting the research question (RQ3): "Is IS-Support valid as a 1st-order reflective, 2nd-order formative multidimensional construct?" With the aim of validating IS-Impact within its nomological net (identification through structural relations), as in prior work, Satisfaction was hypothesised as its immediate consequence. The IS-Support construct, having derived from a question intended to probe IS-Impacts, was also hypothesised as an antecedent of Satisfaction, thereby suggesting the research question (RQ4): "What is the relative contribution of IS-Impact and IS-Support to Satisfaction?" With the goal of testing the above research questions, IS-Impact, IS-Support and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) structural equation modelling employing 221 valid responses largely evidenced the validity of the commencing IS-Impact model in the HR context. IS-Support too was validated as operationalised (including 11 reflective measures of its 4 formative dimensions). IS-Support alone explained 36% of Satisfaction; IS-Impact alone explained 70%; in combination the two explained 71%, with virtually all influence of IS-Support subsumed by IS-Impact. Key study contributions to research include: (1) validation of IS-Impact in the HR context, (2) validation of a newly conceptualised IS-Support construct as an important antecedent of Satisfaction, and (3) validation of the redundancy of IS-Support when gauging IS-Impact. The study also makes valuable contributions to practice, the research track and the sponsoring organisation.
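The relative-contribution comparison reported above can be illustrated with a simple variance-explained contrast. The sketch below is not the study's PLS estimation; it uses synthetic data and ordinary least squares only to show how R² from separate and combined models reveals one predictor subsuming another.

```python
# Illustrative sketch (not the study's PLS-SEM analysis) of comparing the
# relative contribution of two predictors to Satisfaction by contrasting
# R^2 from separate and combined linear models, using synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 221                                    # same sample size as the quantitative survey
is_impact = rng.normal(size=n)
is_support = 0.6 * is_impact + rng.normal(scale=0.8, size=n)   # correlated constructs
satisfaction = 0.8 * is_impact + 0.1 * is_support + rng.normal(scale=0.5, size=n)

def r2(X, y):
    return LinearRegression().fit(X, y).score(X, y)

print("IS-Support alone :", round(r2(is_support.reshape(-1, 1), satisfaction), 2))
print("IS-Impact alone  :", round(r2(is_impact.reshape(-1, 1), satisfaction), 2))
print("Both together    :", round(r2(np.column_stack([is_impact, is_support]), satisfaction), 2))
```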
Abstract:
The health effects of environmental hazards are often examined using time series of the association between a daily response variable (e.g., death) and a daily level of exposure (e.g., temperature). Exposures are usually the average from a network of stations. This gives each station equal importance, and ignores the possibility that some stations are better measures of exposure than others. We used a Bayesian hierarchical model that weighted stations using random variables between zero and one. We compared the weighted estimates to the standard model using data on health outcomes (deaths and hospital admissions) and exposures (air pollution and temperature) in Brisbane, Australia. The improvements in model fit were relatively small, and the estimated health effects of pollution were similar using either the standard or weighted estimates. Spatially weighted exposures would probably be more worthwhile when there is either greater spatial detail in the health outcome or greater spatial variation in exposure.
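The difference between the two exposure definitions can be seen in a small numerical sketch. The weights below are fixed illustrative values; in the study they were estimated as random variables within a Bayesian hierarchical model.

```python
# Minimal sketch contrasting an equal-weight daily exposure (the standard
# approach) with a station-weighted exposure; weights lie between zero and one
# and are purely illustrative, not the study's posterior estimates.
import numpy as np

rng = np.random.default_rng(1)
days, stations = 365, 4
readings = 20 + 5 * rng.standard_normal((days, stations))    # e.g., daily pollution by station

equal_weights = np.full(stations, 1 / stations)
weights = np.array([0.9, 0.7, 0.2, 0.4])                      # between zero and one
weights = weights / weights.sum()                             # normalise to a weighted mean

standard_exposure = readings @ equal_weights
weighted_exposure = readings @ weights

print("correlation between the two exposure series:",
      round(np.corrcoef(standard_exposure, weighted_exposure)[0, 1], 3))
```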
Abstract:
The identification of the primary drivers of stock returns has been of great interest to both financial practitioners and academics alike for many decades. Influenced by classical financial theories such as the CAPM (Sharpe, 1964; Lintner, 1965) and the APT (Ross, 1976), a linear relationship is conventionally assumed between company characteristics, as derived from their financial accounts, and forward returns. Whilst this assumption may be a fair approximation to the underlying structural relationship, it is often adopted for the purpose of convenience. It is actually quite rare that the assumptions of distributional normality and a linear relationship are explicitly assessed in advance, even though this information would help to inform the appropriate choice of modelling technique. Non-linear models have nevertheless been applied successfully to the task of stock selection in the past (Sorensen et al., 2000). However, their take-up by the investment community has been limited despite the fact that researchers in other fields have found them to be a useful way to express knowledge and aid decision-making...
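The modelling choice discussed above can be illustrated with a small, hedged comparison on synthetic data: when the true characteristic-return relationship is non-linear, a linear cross-sectional model leaves explanatory power on the table relative to a non-linear learner. The characteristics, returns and models below are illustrative assumptions, not the paper's data or method.

```python
# Hedged illustration of the linear vs non-linear choice for stock selection:
# fit a linear model and a tree ensemble to synthetic company characteristics
# and forward returns, then compare out-of-sample R^2.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
value, momentum = rng.normal(size=n), rng.normal(size=n)
# A deliberately non-linear "true" relationship plus noise (assumption for illustration).
fwd_return = 0.02 * value + 0.03 * np.tanh(2 * momentum) + 0.05 * rng.normal(size=n)

X = np.column_stack([value, momentum])
X_tr, X_te, y_tr, y_te = train_test_split(X, fwd_return, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("gradient boosting", GradientBoostingRegressor(random_state=0))]:
    model.fit(X_tr, y_tr)
    print(f"{name:>17} out-of-sample R^2: {model.score(X_te, y_te):.3f}")
```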
Abstract:
The mining environment, being complex, irregular and time-varying, presents a challenging prospect for stereo vision. The objective is to produce a stereo vision sensor suited to close-range scenes consisting primarily of rocks. This sensor should be able to produce a dense depth map within real-time constraints. Speed and robustness are of foremost importance for this investigation. A number of area-based matching metrics have been implemented, including the SAD, SSD, NCC, and their zero-meaned versions. The NCC and the zero-meaned SAD and SSD were found to produce the disparity maps with the highest proportion of valid matches. The plain SAD and SSD were the least computationally expensive, because all their operations take place in integer arithmetic; however, they were extremely sensitive to radiometric distortion. Non-parametric techniques for matching, in particular the rank and census transforms, have also been investigated. The rank and census transforms were found to be robust with respect to radiometric distortion, as well as able to produce disparity maps with a high proportion of valid matches. An additional advantage of both the rank and the census transform is their amenability to fast hardware implementation.
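The sketch below shows the window-based costs named in the abstract applied to a single pair of patches, and why zero-meaning matters under radiometric distortion. A real stereo matcher would evaluate these costs at every candidate disparity along each scanline; the patch values are illustrative.

```python
# SAD, SSD, NCC and a zero-meaned SAD evaluated on two small image patches that
# differ only by a radiometric offset. SAD/SSD blow up; ZSAD and NCC do not.
import numpy as np

def sad(a, b):   return np.abs(a - b).sum()
def ssd(a, b):   return ((a - b) ** 2).sum()
def zsad(a, b):  return np.abs((a - a.mean()) - (b - b.mean())).sum()
def ncc(a, b):
    a0, b0 = a - a.mean(), b - b.mean()
    return (a0 * b0).sum() / np.sqrt((a0 ** 2).sum() * (b0 ** 2).sum())

left = np.array([[10, 12, 11], [13, 15, 14], [11, 13, 12]], dtype=float)
right = left + 20                       # same texture under a radiometric offset

print("SAD :", sad(left, right))        # large: offset-sensitive
print("SSD :", ssd(left, right))        # large: offset-sensitive
print("ZSAD:", zsad(left, right))       # ~0: offset removed by zero-meaning
print("NCC :", round(ncc(left, right), 3))  # ~1: insensitive to the offset
```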
Abstract:
In this paper, we analyse the impact of a (small) heterogeneity of jump type on the simplest localized solutions of a 3-component FitzHugh–Nagumo-type system. We show that the heterogeneity can pin a 1-front solution, which travels with constant (non-zero) speed in the homogeneous setting, at a fixed, explicitly determined distance from the heterogeneity. Moreover, we establish the stability of this heterogeneous pinned 1-front solution. In addition, we analyse the pinning of 1-pulse, or 2-front, solutions. The paper is concluded with simulations in which we consider the dynamics and interactions of N-front patterns in domains with M heterogeneities of jump type (N = 3, 4, M ≥ 1).
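The pinning phenomenon can be illustrated, in a heavily simplified form, with a scalar bistable (Nagumo-type) equation rather than the paper's 3-component system: a front that travels in the homogeneous medium stalls near a jump in the threshold parameter. The equation, parameters and discretisation below are assumptions for illustration only.

```python
# Simplified scalar analogue of front pinning by a jump-type heterogeneity:
# u_t = u_xx + u(1-u)(u - a(x)), where a(x) jumps across x = 0. For a < 0.5 the
# u=1 state invades, for a > 0.5 it retreats, so the front pins near the jump.
# Explicit finite differences; not the 3-component FitzHugh-Nagumo system.
import numpy as np

L, N, dt, steps = 50.0, 256, 0.01, 20000
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]                                   # dt < dx**2 / 2 for stability
a = np.where(x < 0, 0.3, 0.7)                      # jump-type heterogeneity in the threshold
u = 0.5 * (1 - np.tanh((x + 15) / np.sqrt(2)))     # front initially well to the left of the jump

for _ in range(steps):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
    u = u + dt * (lap + u * (1 - u) * (u - a))
    u[0], u[-1] = 1.0, 0.0                         # clamp the far-field states at the ends

front = x[np.argmin(np.abs(u - 0.5))]
print(f"front position after t = {steps * dt:.0f}: x = {front:.2f} (pinned near the jump at x = 0)")
```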
Abstract:
Amongst the most prominent uses of Twitter at present is its role in the discussion of widely televised events: Twitter’s own statistics for 2011, for example, list major entertainment spectacles (the MTV Music Awards, the BET Awards) and sports matches (the UEFA Champions League final, the FIFA Women’s World Cup final) amongst the events generating the most tweets per second during the year (Twitter, 2011). User activities during such televised events constitute a specific, unique category of Twitter use, which differs clearly from use during the other major events that generate a high rate of tweets per second (such as crises and breaking news, from the Japanese earthquake and tsunami to the death of Steve Jobs), as preliminary research has shown. During such major media events, by contrast, Twitter is used predominantly as a technology of fandom: it serves in the first place as a backchannel to television and other streaming audiovisual media, enabling users to offer their own running commentary on the universally shared media text of the event broadcast as it unfolds live. Centrally, this communion of fans around the shared text is facilitated by the use of Twitter hashtags – unifying textual markers which are now often promoted to prospective audiences by the broadcasters well in advance of the live event itself. This paper examines the use of Twitter as a technology for the expression of shared fandom in the context of a major, internationally televised annual media event: the Eurovision Song Contest. It constitutes a highly publicised, highly choreographed media spectacle whose eventual outcomes are unknown ahead of time and which attracts a diverse international audience. Our analysis draws on comprehensive datasets for the ‘official’ event hashtags, #eurovision, #esc, and #sbseurovision. Using innovative methods which combine qualitative and quantitative approaches to the analysis of Twitter datasets containing several hundred thousand tweets, we examine overall patterns of participation to discover how audiences express their fandom throughout the event. Minute-by-minute tracking of Twitter activity during the live broadcasts enables us to identify the most resonant moments during each event; we also examine the networks of interaction between participants to detect thematically or geographically determined clusters of interaction, and to identify the most visible and influential participants in each network. Such analysis is able to provide a unique insight into the use of Twitter as a technology for fandom and for what in cultural studies research is called ‘audiencing’: the public performance of belonging to the distributed audience for a shared media event. Our work thus contributes to the examination of fandom practices led by Henry Jenkins (2006) and other scholars, and points to Twitter as an important new medium facilitating the connection and communion of such fans.
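The minute-by-minute tracking described above is, in essence, a resampling of timestamped tweets into one-minute bins. The sketch below assumes a hypothetical CSV export with 'created_at' and 'text' columns; the file name and column names are illustrative, not the study's dataset format.

```python
# Hedged sketch of minute-by-minute tweet tracking for the event hashtags,
# assuming a hypothetical CSV of tweets with 'created_at' and 'text' columns.
import pandas as pd

tweets = pd.read_csv("eurovision_tweets.csv", parse_dates=["created_at"])

# Keep tweets carrying any of the official event hashtags.
tags = ("#eurovision", "#esc", "#sbseurovision")
tagged = tweets[tweets["text"].str.lower().str.contains("|".join(tags))]

# Tweets per minute across the live broadcast; peaks mark the most resonant moments.
per_minute = tagged.set_index("created_at").resample("1min").size()
print(per_minute.sort_values(ascending=False).head(10))
```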
Abstract:
Stronger investor interest in commodities may create closer integration with conventional asset markets. We estimate sudden and gradual changes in correlation between stocks, bonds and commodity futures returns driven by observable financial variables and time, using double smooth transition conditional correlation (DSTCC–GARCH) models. Most correlations begin the 1990s near zero, but closer integration emerges around the early 2000s and reaches peaks during the recent crisis. Diversification benefits to investors across equity, bond and commodity markets were significantly reduced. Increases in the VIX and in financial traders’ short open interest raise futures returns volatility for many commodities. Higher VIX also increases the correlation of commodity returns with equity returns for about half the pairs, indicating closer integration.
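Smooth transition conditional correlation models move correlations between regimes via logistic transition functions of observable variables and of time. The sketch below illustrates that mechanism for a single correlation pair; the blending parameters, correlation levels and VIX-like series are illustrative assumptions, not the paper's estimates.

```python
# Hedged sketch of the double smooth transition idea: a correlation blends
# between four regime levels via a time transition and a VIX-driven transition.
import numpy as np

def G(s, gamma, c):
    """Logistic transition function in (0, 1): slope gamma, location c."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

t = np.linspace(0.0, 1.0, 1000)                                  # rescaled time over the sample
rng = np.random.default_rng(3)
vix = 15 + 20 * G(t, 20, 0.85) + rng.normal(0, 2, t.size)        # VIX-like series spiking late

# Four regime correlation levels (illustrative), blended by the two transitions.
r = {"early_lowvix": 0.0, "early_highvix": 0.2, "late_lowvix": 0.3, "late_highvix": 0.6}
G1, G2 = G(t, gamma=15, c=0.55), G(vix, gamma=0.5, c=30)
rho_t = ((1 - G1) * ((1 - G2) * r["early_lowvix"] + G2 * r["early_highvix"])
         + G1 * ((1 - G2) * r["late_lowvix"] + G2 * r["late_highvix"]))

print("implied stock-commodity correlation at start / middle / end:",
      [round(v, 2) for v in (rho_t[0], rho_t[len(t) // 2], rho_t[-1])])
```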
Abstract:
YBCO wires consisting of well-oriented, plate-like fine grains are fabricated using a moving furnace to achieve higher mechanical strength. Melt-texturing experiments have been undertaken on YBCO wires with two different compositions: YBa1.5Cu2.9O7-x and YBa1.8Cu3.0O7-x. Wires are extruded from a mixture of precursor powders (formed by a coprecipitation process) and then textured by firing in a moving furnace. The size of secondary phases such as barium cuprate and copper oxide, and the overall composition of the sample, affect the orientation of the fine grains. At zero magnetic field, the YBa1.5Cu2.9O7-x wire shows the highest critical current density: 1,450 A cm-2 and 8,770 A cm-2 at 77 K and 4.2 K, respectively. At 1 T, critical current densities of 30 A cm-2 and 200 A cm-2 are obtained at 77 K and 4.2 K, respectively. Magnetisation curves are also obtained for one sample to evaluate the critical current density using the Bean model. Analysis of the microstructure indicates that the starting composition of the green body significantly affects the achievement of grain alignment via melt-texturing processes.
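For readers unfamiliar with the magnetisation route to Jc, the sketch below applies the commonly quoted Bean critical-state relation for a cylindrical sample; the formula choice (cylinder geometry, CGS units) and all numbers are assumptions for illustration, not values or methods stated in the abstract.

```python
# Hedged sketch of a Bean-model estimate of critical current density from the
# width of a magnetisation hysteresis loop. For a long cylindrical sample in
# the commonly quoted CGS form, Jc [A/cm^2] ~ 30 * dM / d, with dM the
# hysteresis width in emu/cm^3 and d the diameter in cm. Numbers are placeholders.

def bean_jc_cylinder(delta_m_emu_per_cm3, diameter_cm):
    """Bean-model critical current density for a cylindrical wire (CGS units)."""
    return 30.0 * delta_m_emu_per_cm3 / diameter_cm

delta_m = 5.0          # emu/cm^3, hypothetical hysteresis width at 77 K
diameter = 0.1         # cm, hypothetical wire diameter

print(f"Jc ~ {bean_jc_cylinder(delta_m, diameter):.0f} A/cm^2")
```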
Abstract:
A new control method for battery storage to maintain an acceptable voltage profile in autonomous microgrids is proposed in this article. The proposed battery control ensures that the bus voltages in the microgrid are maintained during disturbances such as load changes, loss of micro-sources, or distributed generation hitting its power limit. Unlike conventional storage control based on local measurements, the proposed method is based on an advanced control technique in which the reference power is determined from the voltage-drop profile at the battery bus. An artificial neural network based controller is used to determine the reference power needed for the battery to hold the microgrid voltage within regulation limits. The pattern of drop in the local bus voltage during a power imbalance is used to train the controller off-line. During normal operation, the battery floats with the local bus voltage without any power injection. The battery is charged or discharged during transients with a high-gain feedback loop. Depending on the rate of voltage fall, it is switched to power control mode to inject the reference power determined by the proposed controller. After a defined time period, the battery power injection is reduced to zero using slow reverse-droop characteristics, ensuring a slow rate of increase in power demand from the other distributed generations. The proposed control method is simulated for various operating conditions in a microgrid with both inertial and converter-interfaced sources. The proposed battery control provides quick load pick-up and smooth load sharing with the other micro-sources during a disturbance. Across the disturbances studied, a maximum voltage drop of over 8% with conventional energy storage control is reduced to within 2.5% with the proposed control method.
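The off-line training step described above amounts to learning a map from features of the observed voltage drop to a battery reference power. The sketch below is a hedged illustration of that idea with synthetic data; the feature definitions, target relationship and network size are assumptions, not the article's design.

```python
# Hedged sketch of off-line training of a small neural network that maps
# voltage-drop features (depth and rate of fall) to a battery reference power.
# Training data, feature and target definitions are synthetic assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
n = 500
voltage_dip_pu = rng.uniform(0.0, 0.10, n)       # depth of the voltage drop (per unit)
dv_dt_pu_s = rng.uniform(0.0, 0.5, n)            # rate of voltage fall (pu/s)

# Assumed "true" relationship used only to generate training targets:
# deeper and faster dips require more battery power to hold the bus voltage.
p_ref_kw = 200 * voltage_dip_pu + 40 * dv_dt_pu_s + rng.normal(0, 1, n)

X = np.column_stack([voltage_dip_pu, dv_dt_pu_s])
controller = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                          random_state=0).fit(X, p_ref_kw)

# At run time, the trained network returns the reference power for an observed dip.
print("P_ref for a 6% dip falling at 0.3 pu/s:",
      round(float(controller.predict([[0.06, 0.3]])[0]), 1), "kW")
```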
Abstract:
Scientific and programmatic progress toward the development of a cosmic dust collection facility (CDCF) for the proposed space station is documented. Topics addressed include: trajectory sensor concepts; trajectory accuracy and orbital evolution; CDCF pointing direction; development of capture devices; analytical techniques; programmatic progress; flight opportunities; and facility development.
Abstract:
This paper describes the socio-economic and environmental impacts of battery-driven Auto Rickshaws in Rajshahi city, Bangladesh. Unemployment is one of the major problems in Bangladesh: the number of unemployed people is about 7 lakhs. Auto Rickshaws reduce this unemployment by roughly 2%. In this thesis work, Auto Rickshaw drivers were surveyed at different points in Rajshahi city, and the responses were analysed to assess their socio-economic condition. The average number of passengers per Auto Rickshaw was determined at various places in Rajshahi city (Talaimari mor, Hadir mor, Alupotti, Shaheb bazar zero point, Shodor Hospital mor, Fire brigade mor, CNB mor, Lakshipur mor, Bondo gate, Bornali, Panir tank, Rail gate, Rail Station, Bhodrar mor, Adorsha School mor). Air pollution is a great threat to human health, and one of its major causes is emission from vehicles that burn fossil fuel in internal combustion (IC) engines. Data on emissions from various power plants were collected from the internet, and the emissions (CO2, NOx and PM) attributable to different power plants were calculated in kg/km. The energy required by an Auto Rickshaw per km was also calculated, and a histogram of emissions from different vehicles in kg/km was drawn. Analysis of the data and charts indicates that battery-driven Auto Rickshaws increase income, social status and comfort, and reduce unemployment.
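The per-kilometre emission figures described above follow from a simple conversion: the electrical energy an Auto Rickshaw draws per kilometre multiplied by the emission factor of the supplying power plant. The sketch below shows that arithmetic with illustrative numbers; both inputs are assumptions, not values from the thesis.

```python
# Hedged sketch of the indirect per-km emission calculation for a battery-driven
# Auto Rickshaw: energy drawn per km times the grid/power-plant emission factor.
energy_per_km_kwh = 0.05            # assumed battery energy drawn per km
grid_factor_kg_co2_per_kwh = 0.6    # assumed emission factor of the supplying power plant

co2_per_km_kg = energy_per_km_kwh * grid_factor_kg_co2_per_kwh
print(f"indirect CO2 emissions: {co2_per_km_kg:.3f} kg/km")
```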