80 results for commodity spot


Relevance: 10.00%

Abstract:

Nitrous oxide (N2O) is a major greenhouse gas (GHG) product of intensive agriculture. Fertilizer nitrogen (N) rate is the best single predictor of N2O emissions in row-crop agriculture in the US Midwest. We use this relationship to propose a transparent, scientifically robust protocol that can be utilized by developers of agricultural offset projects for generating fungible GHG emission reduction credits for the emerging US carbon cap and trade market. By coupling predicted N2O flux with the recently developed maximum return to N (MRTN) approach for determining economically profitable N input rates for optimized crop yield, we provide the basis for incentivizing N2O reductions without affecting yields. The protocol, if widely adopted, could reduce N2O from fertilized row-crop agriculture by more than 50%. Although other management and environmental factors can influence N2O emissions, fertilizer N rate can be viewed as a single unambiguous proxy—a transparent, tangible, and readily manageable commodity. Our protocol addresses baseline establishment, additionality, permanence, variability, and leakage, and provides for producers and other stakeholders the economic and environmental incentives necessary for adoption of agricultural N2O reduction offset projects.
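
The credit arithmetic the protocol implies can be sketched in a few lines. The exponential flux model and its coefficients below are illustrative assumptions rather than the protocol's published parameters; only the unit conversions and the 100-year global warming potential of N2O (298, IPCC AR4) are standard figures.

```python
# Hedged sketch of the protocol's credit logic: N2O flux is predicted from
# fertilizer N rate alone, and credits equal baseline minus project
# emissions in CO2-equivalents. Model form and coefficients are assumed.
import math

GWP_N2O = 298.0  # 100-year global warming potential of N2O (IPCC AR4)

def n2o_flux(n_rate_kg_ha):
    """Hypothetical exponential N2O response to fertilizer N rate,
    in kg N2O-N per hectare per year; a and b are placeholders."""
    a, b = 0.5, 0.0065
    return a * math.exp(b * n_rate_kg_ha)

def credits_t_co2e(baseline_n_rate, mrtn_n_rate, area_ha):
    """CO2-equivalent credits from lowering the N rate to the MRTN rate."""
    delta_n2o_n = n2o_flux(baseline_n_rate) - n2o_flux(mrtn_n_rate)
    delta_n2o = delta_n2o_n * (44.0 / 28.0)        # N2O-N mass -> N2O mass
    return delta_n2o * GWP_N2O * area_ha / 1000.0  # kg -> tonnes CO2e

# Example: a 500 ha project reducing N rate from 200 to 160 kg N/ha.
print(f"{credits_t_co2e(200, 160, 500):.1f} t CO2e per year")
```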

Relevance: 10.00%

Abstract:

Identification of hot spots, also known as the sites with promise, black spots, accident-prone locations, or priority investigation locations, is an important and routine activity for improving the overall safety of roadway networks. Extensive literature focuses on methods for hot spot identification (HSID). A subset of this considerable literature is dedicated to conducting performance assessments of various HSID methods. A central issue in comparing HSID methods is the development and selection of quantitative and qualitative performance measures or criteria. The authors contend that currently employed HSID assessment criteria—namely false positives and false negatives—are necessary but not sufficient, and additional criteria are needed to exploit the ordinal nature of site ranking data. With the intent to equip road safety professionals and researchers with more useful tools to compare the performances of various HSID methods and to improve the level of HSID assessments, this paper proposes four quantitative HSID evaluation tests that are, to the authors’ knowledge, new and unique. These tests evaluate different aspects of HSID method performance, including reliability of results, ranking consistency, and false identification consistency and reliability. It is intended that road safety professionals apply these different evaluation tests in addition to existing tests to compare the performances of various HSID methods, and then select the most appropriate HSID method to screen road networks to identify sites that require further analysis. This work demonstrates four new criteria using 3 years of Arizona road section accident data and four commonly applied HSID methods [accident frequency ranking, accident rate ranking, accident reduction potential, and empirical Bayes (EB)]. The EB HSID method reveals itself as the superior method in most of the evaluation tests. In contrast, identifying hot spots using accident rate rankings performs the least well among the tests. The accident frequency and accident reduction potential methods perform similarly, with slight differences explained. The authors believe that the four new evaluation tests offer insight into HSID performance heretofore unavailable to analysts and researchers.
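
The four tests themselves are specified in the paper; as a flavour of how a ranking-based criterion goes beyond counting false positives and negatives, the sketch below computes a generic consistency measure, the share of one period's hotspots that a method re-identifies in the next period. The site scores and threshold are hypothetical, and this is not the authors' exact formulation.

```python
# Generic consistency-style HSID criterion: how stable is a method's
# hotspot set across two observation periods? Illustrative only; the
# paper defines its own four tests.
def top_sites(scores, n_hotspots):
    """Return the n_hotspots site IDs with the highest scores."""
    return set(sorted(scores, key=scores.get, reverse=True)[:n_hotspots])

def consistency(scores_p1, scores_p2, n_hotspots):
    """Share of period-1 hotspots re-identified in period 2."""
    shared = top_sites(scores_p1, n_hotspots) & top_sites(scores_p2, n_hotspots)
    return len(shared) / n_hotspots

# Hypothetical accident-frequency scores at five sites in two periods.
p1 = {"A": 12, "B": 9, "C": 7, "D": 3, "E": 1}
p2 = {"A": 10, "B": 4, "C": 8, "D": 6, "E": 2}
print(consistency(p1, p2, n_hotspots=2))  # 0.5: only site A persists
```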

Relevance: 10.00%

Abstract:

Identifying crash “hotspots”, “blackspots”, “sites with promise”, or “high risk” locations is standard practice in departments of transportation throughout the US. The literature is replete with the development and discussion of statistical methods for hotspot identification (HSID). Theoretical derivations and empirical studies have been used to weigh the benefits of various HSID methods; however, only a small number of studies have used controlled experiments to systematically assess various methods. Using experimentally derived simulated data (which are argued to be superior to empirical data for this purpose), three hot spot identification methods observed in practice are evaluated: simple ranking, confidence interval, and empirical Bayes. With simulated data, sites with promise are known a priori, in contrast to empirical data, where high risk sites are not known with certainty. To conduct the evaluation, properties of observed crash data are used to generate simulated crash frequency distributions at hypothetical sites. A variety of factors is manipulated to simulate a host of ‘real world’ conditions. Various levels of confidence are explored, and false positives (identifying a safe site as high risk) and false negatives (identifying a high risk site as safe) are compared across methods. Finally, the effects of crash history duration on the three HSID approaches are assessed. The results illustrate that the empirical Bayes technique significantly outperforms the ranking and confidence interval techniques (with certain caveats). As found by others, false positives and negatives are inversely related. Three years of crash history appears, in general, to provide an appropriate crash history duration.
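
A minimal version of this experimental setup can be sketched as follows. Because the simulated sites' true means are known by construction, false positives can be counted exactly; the safety performance function, dispersion value, and sample sizes are illustrative, not those of the study.

```python
# Minimal sketch of the simulated-data evaluation: true site means are
# known by construction, so false positives (flagging a truly safe site
# as high risk) can be counted exactly. The SPF form, dispersion value,
# and sample sizes are illustrative, not those of the study.
import math
import random

random.seed(7)
N_SITES, N_HOT, PHI = 200, 20, 2.0   # PHI: negative binomial dispersion

def poisson(lam):
    """Knuth's Poisson sampler (adequate for small means)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Site-specific SPF prediction mu_i (standing in for e.g. traffic-volume
# effects), a gamma site effect creating overdispersion, and one observed
# count per site.
mu = [random.uniform(1.0, 8.0) for _ in range(N_SITES)]
true_mean = [m * random.gammavariate(PHI, 1.0 / PHI) for m in mu]
counts = [poisson(t) for t in true_mean]

def eb(x, m):
    """Empirical Bayes estimate: shrink the count toward the SPF value."""
    w = 1.0 / (1.0 + m / PHI)
    return w * m + (1.0 - w) * x

truly_hot = set(sorted(range(N_SITES), key=lambda i: true_mean[i])[-N_HOT:])
for name, score in [("simple ranking", counts),
                    ("empirical Bayes", [eb(x, m) for x, m in zip(counts, mu)])]:
    flagged = set(sorted(range(N_SITES), key=lambda i: score[i])[-N_HOT:])
    print(f"{name}: {len(flagged - truly_hot)} false positives out of {N_HOT}")
```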

Relevance: 10.00%

Abstract:

Consumer personal information is now a valuable commodity for most corporations. Concomitant with its increased value is the expansion of new legal obligations to protect personal information. Mandatory data breach notification laws are an important new development in this regard. Such laws require a corporation that has suffered a data breach involving personal information, such as a computer hacking incident, to notify the persons who may have been affected by the breach. Regulators may also need to be notified. Australia currently does not have a mandatory data breach notification law, but this may be about to change. The Australian Law Reform Commission has suggested that a data breach notification scheme be implemented through the Privacy Act 1988 (Cth). However, the notification of data breaches may already be required under the continuous disclosure regime stipulated by the Corporations Act 2001 (Cth) and the Australian Stock Exchange (ASX) Listing Rules. Accordingly, this article examines whether the notification of data breaches is a statutory requirement of the existing continuous disclosure regime and whether the ASX should therefore be notified of such incidents.

Relevance: 10.00%

Abstract:

Knowledge-based urban development (KBUD), a new approach that is slowly replacing neoclassical models of economic growth and commodity-based industrial activities, aims to provide opportunities for cities to foster knowledge creation, exchange and innovation. It is based on the concepts of both sustainable urban development and economic prosperity; sustainable use and protection of natural resources are therefore integral parts of KBUD. As such, stormwater, which has been recognised as one of the main culprits of aquatic ecosystem pollution and therefore as a significant threat to the goal of sustainable urban development, needs to be managed in a manner that produces ecologically sound outcomes. Water sensitive urban design (WSUD) is one of the key responses to the need to better manage urban stormwater runoff, and it supports KBUD by providing an alternative, innovative and effective strategy to traditional stormwater management.

Relevance: 10.00%

Abstract:

Following the completion of the draft Human Genome in 2001, genomic sequence data is becoming available at an accelerating rate, fueled by advances in sequencing and computational technology. Meanwhile, large collections of astronomical and geospatial data have allowed the creation of virtual observatories, accessible throughout the world and requiring only commodity hardware. Through a combination of advances in data management, data mining and visualization, this infrastructure enables the development of new scientific and educational applications as diverse as galaxy classification and real-time tracking of earthquakes and volcanic plumes. In the present paper, we describe steps taken along a similar path towards a virtual observatory for genomes – an immersive three-dimensional visual navigation and query system for comparative genomic data.

Relevance: 10.00%

Abstract:

Many cities worldwide face the prospect of major transformation as the world moves towards a global information order. In this new era, urban economies are being radically altered by dynamic processes of economic and spatial restructuring. The result is the creation of ‘informational cities’ or, by their newer and more popular name, ‘knowledge cities’.

For the last two centuries, social production had been primarily understood and shaped by neo-classical economic thought that recognized only three factors of production: land, labor and capital. Knowledge, education, and intellectual capacity were secondary, if not incidental, factors. Human capital was assumed to be either embedded in labor or just one of numerous categories of capital. In the last decades, it has become apparent that knowledge is sufficiently important to deserve recognition as a fourth factor of production. Knowledge and information, and the social and technological settings for their production and communication, are now seen as keys to development and economic prosperity.

The rise of knowledge-based opportunity has, in many cases, been accompanied by a concomitant decline in traditional industrial activity. The replacement of physical commodity production by more abstract forms of production (e.g. information, ideas, and knowledge) has, however paradoxically, reinforced the importance of central places and led to the formation of knowledge cities. Knowledge is produced, marketed and exchanged mainly in cities. Therefore, knowledge cities aim to assist decision-makers in making their cities compatible with the knowledge economy and thus able to compete with other cities. Knowledge cities enable their citizens to foster knowledge creation, knowledge exchange and innovation. They also encourage the continuous creation, sharing, evaluation, renewal and updating of knowledge.

To compete nationally and internationally, cities need knowledge infrastructures (e.g. universities, research and development institutes); a concentration of well-educated people; technological, mainly electronic, infrastructure; and connections to the global economy (e.g. international companies and finance institutions for trade and investment). Moreover, they must possess the people and things necessary for the production of knowledge and, as importantly, function as breeding grounds for talent and innovation. The economy of a knowledge city creates high value-added products using research, technology, and brainpower. Private and public sectors value knowledge, spend money on its discovery and dissemination and, ultimately, harness it to create goods and services. Although many cities call themselves knowledge cities, currently only a few cities around the world (e.g. Barcelona, Delft, Dublin, Montreal, Munich, and Stockholm) have earned that label. Many other cities aspire to the status of knowledge city through urban development programs that target knowledge-based urban development. Examples include Copenhagen, Dubai, Manchester, Melbourne, Monterrey, Singapore, and Shanghai.

Knowledge-Based Urban Development

To date, the development of most knowledge cities has proceeded organically, as a dependent and derivative effect of global market forces. Urban and regional planning has responded slowly, and sometimes not at all, to the challenges and opportunities of the knowledge city. That is changing, however. Knowledge-based urban development potentially brings both economic prosperity and a sustainable socio-spatial order. Its goal is to produce and circulate abstract work.

The globalization of the world in the last decades of the twentieth century was a dialectical process. On one hand, as the tyranny of distance was eroded, economic networks of production and consumption were constituted at a global scale. At the same time, spatial proximity remained as important as ever, if not more so, for knowledge-based urban development. Mediated by information and communication technology, personal contact, and the medium of tacit knowledge, organizational and institutional interactions are still closely associated with spatial proximity. The clustering of knowledge production is essential for fostering innovation and wealth creation.

The social benefits of knowledge-based urban development extend beyond aggregate economic growth. On the one hand is the possibility of a particularly resilient form of urban development, secured in a network of connections anchored at local, national, and global coordinates. On the other hand, quality of place and life, defined by the level of public services (e.g. health and education) and by the conservation and development of the cultural, aesthetic and ecological values that give cities their character and attract or repel the creative class of knowledge workers, is a prerequisite for successful knowledge-based urban development. The goal is a secure economy in a human setting: in short, smart growth or sustainable urban development.

Relevance: 10.00%

Abstract:

The cellular response to radiation damage is mounted by a complex network of pathways and feedback loops whose spatiotemporal organisation is still unclear despite its decisive role in determining the fate of the damaged cell. Revealing the dynamic sequence in which repair proteins act is therefore critical to understanding how the DNA repair mechanisms work. There are also still open questions regarding the possible movement of damaged chromatin domains and its role as a trigger for lesion recognition and signalling in the DNA repair context. The single-cell approach and the high spatial resolution offered by microbeams provide the perfect tool to study and quantify the dynamic processes associated with the induction and repair of DNA damage. We have followed the development of radiation-induced foci for three DNA damage markers (i.e. γ-H2AX, 53BP1 and hSSB1) using normal fibroblasts (AG01522), human breast adenocarcinoma cells (MCF7) and human fibrosarcoma cells (HT1080) stably transfected with yellow fluorescent protein fusion proteins following irradiation with the QUB X-ray microbeam (carbon X-rays, <2 µm spot). The size and intensity of the foci have been analysed as a function of dose and time post-irradiation to investigate the dynamics of the above-mentioned DNA repair processes and to monitor the remodelling of chromatin structure that the cell undergoes to deal with DNA damage.

Relevance: 10.00%

Abstract:

Hot spot identification (HSID) plays a significant role in improving the safety of transportation networks. Numerous HSID methods have been proposed, developed, and evaluated in the literature. The vast majority of HSID methods reported and evaluated in the literature assume that crash data are complete, reliable, and accurate. Crash under-reporting, however, has long been recognized as a threat to the accuracy and completeness of historical traffic crash records. As a natural continuation of prior studies, this paper evaluates the influence that under-reported crashes exert on HSID methods. To conduct the evaluation, five groups of data gathered from the Arizona Department of Transportation (ADOT) over the course of three years are adjusted to account for fifteen different assumed levels of under-reporting. Three identification methods are evaluated: simple ranking (SR), empirical Bayes (EB) and full Bayes (FB). Various threshold levels for establishing hotspots are explored. Finally, two evaluation criteria are compared across HSID methods. The results illustrate that identification bias, that is, the ability to correctly identify at-risk sites, is influenced by the degree of under-reporting. Comparatively speaking, crash under-reporting has the largest influence on the FB method and the least influence on the SR method. Additionally, the impact is positively related to the percentage of under-reported property-damage-only (PDO) crashes and inversely related to the percentage of under-reported injury crashes. This finding is significant because it reveals that despite PDO crashes being the least severe and costly, they have the most significant influence on the accuracy of HSID.
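
The core of the adjustment can be illustrated simply: observed counts are scaled up by severity-specific reporting rates, and the resulting hotspot list is compared with the unadjusted one. The counts, rates, and simple-ranking rule below are hypothetical.

```python
# Sketch of the under-reporting adjustment: scale observed counts by an
# assumed reporting rate per severity class, then re-rank. All figures
# are hypothetical.
def adjust(observed, reporting_rate):
    """Estimate true counts given the share of crashes actually reported."""
    return {site: n / reporting_rate for site, n in observed.items()}

def hotspots(pdo, injury, n=2):
    """Simple ranking on total (PDO + injury) crash counts."""
    total = {s: pdo[s] + injury[s] for s in pdo}
    return set(sorted(total, key=total.get, reverse=True)[:n])

pdo    = {"A": 30, "B": 5,  "C": 24, "D": 8}  # property-damage-only crashes
injury = {"A": 2,  "B": 20, "C": 0,  "D": 1}

baseline = hotspots(pdo, injury)
# Suppose only 60% of PDO but 95% of injury crashes get reported.
corrected = hotspots(adjust(pdo, 0.60), adjust(injury, 0.95))
print(baseline, "->", corrected)  # {A, B} -> {A, C}: the ranking shifts
```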

Relevance: 10.00%

Abstract:

We all live in a yellow submarine… When I go to work in the morning, in the office building that hosts our BPM research group, on the way up to our level I come by this big breakout room that hosts a number of computer scientists, working away at the next generation software algorithms and iPad applications (I assume). I have never actually been in that room, but every now and then the door is left ajar for a while and I can spot couches, lots (I mean, lots!) of monitors, the odd scientist, a number of Lara Croft posters, and the usual room equipment you’d probably expect from computer scientists (and, no, it’s not like that evil Dennis guy from the Jurassic Park movie, buried in chips, coke, and flickering code screens… It’s also not like the command room from the Nebuchadnezzar, Neo’s hovercraft in the Matrix movies, although I still strongly believe these green lines of code make a good screensaver).

Relevance: 10.00%

Abstract:

Identifying, modelling and documenting business processes usually requires the collaboration of many stakeholders that may be spread across companies in inter-organizational business settings. While there are many process modelling tools available, the support they provide for remote collaboration is still limited. This demonstration showcases a novel prototype application that implements collaborative virtual environment and augmented reality technologies to improve remote collaborative process modelling, with the aim of assisting common collaboration tasks by providing an increased sense of immersion in an intuitive shared work and task space. Our tool is easily deployed using open source software and commodity hardware, and is expected to save money on travel costs for large-scale process modelling projects covering national and international centres within an enterprise.

Relevance: 10.00%

Abstract:

Accessible housing is a scarce yet much needed commodity in Australia. A national agreement between industry and advocacy groups on a voluntary approach, called the Livable Design program, aims to provide access features in all new housing by 2020. Through a range of awareness-raising initiatives, the program anticipates increased supply by builders and increased demand by home-buyers. However, the people who need accessible housing are the least likely and least able to buy it at the point of new sale, and average home-buyers do not consider access features a priority. This approach has not been successful overseas or in Australia in the past. Regulation with incentives, supported by education and awareness, has produced the best results; yet regulation typically comes with controversy and resistance from the housing industry. A study is planned to identify how effective the Livable Design program is likely to be, what is likely to hinder it, and why regulation may be needed.

Relevance: 10.00%

Abstract:

A pervasive and puzzling feature of banks’ Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks’ proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
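
The diversification effect at issue is, in the usual formulation, the gap between the sum of stand-alone VaRs across risk categories and the reported aggregate VaR. A sketch with hypothetical figures (the paper works from actual bank disclosures):

```python
# Diversification effect in the usual sense: how far the aggregate VaR
# sits below the sum of stand-alone VaRs per risk category. All figures
# are hypothetical, in millions of dollars.
var_by_category = {
    "equity": 40.0, "interest rate": 55.0, "commodity": 10.0,
    "credit spread": 20.0, "foreign exchange": 15.0,
}
aggregate_var = 95.0  # firm-wide VaR as the bank would report it

undiversified = sum(var_by_category.values())  # 140.0 if simply added up
effect = 1.0 - aggregate_var / undiversified
print(f"Diversification effect: {effect:.1%}")  # ~32.1% of the sum saved
```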

Relevance: 10.00%

Abstract:

This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge; rather, knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, grounded in an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of knowing.

Relevance: 10.00%

Abstract:

Estimates of the half-life to convergence of prices across a panel of cities are subject to bias from three potential sources: inappropriate cross-sectional aggregation of heterogeneous coefficients, presence of lagged dependent variables in a model with individual fixed effects, and time aggregation of commodity prices. This paper finds no evidence of heterogeneity bias in annual CPI data for 17 U.S. cities from 1918 to 2006, but correcting for the “Nickell bias” and time aggregation bias produces a half-life of 7.5 years, shorter than estimates from previous studies.
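
Under the standard AR(1) reading of convergence, a half-life maps one-to-one to a persistence coefficient via h = ln(0.5) / ln(rho). The snippet below checks the 7.5-year figure under that common definition; the paper's bias-corrected panel estimator is more involved than this.

```python
# Half-life <-> AR(1) persistence under the standard definition
# h = ln(0.5) / ln(rho). A sanity check of the 7.5-year figure; the
# paper's bias-corrected panel estimator is more involved than this.
import math

def half_life(rho):
    """Years for a price deviation to decay by half under AR(1)."""
    return math.log(0.5) / math.log(rho)

def persistence(h_years):
    """AR(1) coefficient implied by a given half-life in years."""
    return 0.5 ** (1.0 / h_years)

rho = persistence(7.5)
print(f"rho = {rho:.3f}; implied half-life = {half_life(rho):.1f} years")
# rho ~= 0.912: lower persistence than the values implied by earlier,
# longer half-life estimates.
```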