249 results for "key exhaustion"


Relevance: 20.00%

Publisher:

Abstract:

Patents provide monopoly rights to patent holders. There are safeguards in the patent regime to ensure that the exclusive right of the patent holder is not misused. Compulsory licensing is one of the safeguards provided under TRIPS, under which the patent-granting state may allow a third party to exploit an invention without the patent holder's consent, upon terms and conditions decided by the government. The concept has existed since 1623 and was not introduced by TRIPS for the first time, but the mechanism has undergone significant changes, especially in the post-TRIPS era. The history of the evolution of compulsory licensing is one of the least explored areas of intellectual property law. This paper analyses the different phases in the evolution of the compulsory licensing mechanism and sheds light on the reasons behind these developments, especially after TRIPS.


Background: Rupture of vulnerable atheromatous plaque in the carotid and coronary arteries often leads to stroke and heart attack respectively. The role of calcium deposition and its contribution to plaque stability is controversial. This study uses both an idealized and a patient-specific model to evaluate the effect of a calcium deposit on the stress distribution within an atheromatous plaque. Methods: Using a finite-element method, structural analysis was performed on an idealized plaque model and the location of a calcium deposit within it was varied. In addition to the idealized model, in vivo high-resolution MR imaging was performed on 3 patients with carotid atheroma and stress distributions were generated. The individual plaques were chosen because they had calcium at varying locations with respect to the lumen and the fibrous cap. Results: The predicted maximum stress was increased by 47.5% when the calcium deposit was located in the thin fibrous cap, compared with a model without a deposit. Adding a calcium deposit either to the lipid core or remote from the lumen resulted in almost no increase in maximal stress. Conclusion: Calcification at the thin fibrous cap may result in high stress concentrations, ultimately increasing the risk of plaque rupture. Assessing the location of calcification may, in the future, aid in the risk stratification of patients with carotid stenosis.


We consider the development of statistical models for predicting the constituent concentration of riverine pollutants, which is a key step in load estimation from frequent flow rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts past flux based on the time elapsed: more recent fluxes are given more weight. However, the effectiveness of the ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulating process of the concentration compounds. We propose choosing the discount factor by maximizing the adjusted R² value or the Nash-Sutcliffe model efficiency coefficient; the R² values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of the constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets from United States Geological Survey (USGS) gauging stations in the Des Plaines River and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads of -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load arises because the predictability of concentration is greatly improved by the additional predictors.
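As an illustrative sketch of the approach described above, the following Python code computes a discounted average of past flows and selects the discount factor by maximizing adjusted R² over a grid. The exponential-smoothing form of the ADF, the simple linear model, and all function names are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def average_discounted_flow(flow, delta):
    """Exponentially discounted average of past flows:
    adf[t] = delta * adf[t-1] + (1 - delta) * flow[t].
    More recent fluxes receive more weight (hypothetical formalisation)."""
    adf = np.empty(len(flow), dtype=float)
    acc = float(flow[0])
    adf[0] = acc
    for t in range(1, len(flow)):
        acc = delta * acc + (1 - delta) * flow[t]
        adf[t] = acc
    return adf

def choose_discount(flow, conc, deltas=np.linspace(0.1, 0.99, 90)):
    """Pick the discount factor maximising the adjusted R^2 of a linear
    fit of concentration on [intercept, flow, ADF(flow)]."""
    n = len(conc)
    best_delta, best_r2 = None, -np.inf
    for d in deltas:
        X = np.column_stack([np.ones(n), flow, average_discounted_flow(flow, d)])
        beta, *_ = np.linalg.lstsq(X, conc, rcond=None)
        resid = conc - X @ beta
        ss_res = resid @ resid
        ss_tot = ((conc - conc.mean()) ** 2).sum()
        p = X.shape[1] - 1  # predictors excluding the intercept
        adj_r2 = 1.0 - (ss_res / (n - p - 1)) / (ss_tot / (n - 1))
        if adj_r2 > best_r2:
            best_delta, best_r2 = d, adj_r2
    return best_delta, best_r2
```

A Nash-Sutcliffe criterion could be substituted for adjusted R² in the inner loop with the same grid-search structure.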


We consider estimating the total load from frequent flow data combined with less frequent concentration data. There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are statistically subject to large biases, and their associated uncertainties are often not reported. This makes interpretation difficult and renders the estimation of trends or the determination of optimal sampling regimes impossible. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates that minimize the biases and make use of informative predictive variables. The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized rating-curve approach with additional predictors that capture unique features in the flow data, such as the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and the discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. The method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach for two rivers delivering to the Great Barrier Reef, Queensland, Australia. One dataset, from the Burdekin River, consists of total suspended sediment (TSS) and nitrogen oxide (NO(x)) concentrations and gauged flow for 1997. The other dataset is from the Tully River, for the period July 2000 to June 2008.
For NO(x) in the Burdekin, the new estimates are very similar to the ratio estimates even when there is no relationship between concentration and flow. However, for the Tully dataset, incorporating the additional predictive variables, namely the discounted flow and the flow phases (rising or receding), substantially improved the model fit, and thus the certainty with which the load is estimated.
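A minimal Python sketch of a rating-curve model with the additional predictors mentioned (discounted flow and hydrograph phase) might look as follows. The log-log form, the rising-limb encoding, and all names are assumptions for illustration, and the naive back-transform ignores the retransformation bias that a full treatment would correct.

```python
import numpy as np

def flow_phase(flow):
    """Indicator for the rising limb of the hydrograph
    (1 = rising, 0 = receding) -- a hypothetical encoding
    of the 'rise or fall' predictor."""
    rising = np.zeros(len(flow))
    rising[1:] = (np.diff(flow) > 0).astype(float)
    return rising

def fit_rating_curve(flow, disc_flow, conc):
    """Log-log rating curve with extra predictors:
    log C = b0 + b1*log Q + b2*log(discounted flow) + b3*phase + error."""
    X = np.column_stack([
        np.ones(len(flow)),
        np.log(flow),
        np.log(disc_flow),
        flow_phase(flow),
    ])
    beta, *_ = np.linalg.lstsq(X, np.log(conc), rcond=None)
    return beta, X

def predict_conc(beta, flow, disc_flow):
    """Back-transform predictions to the concentration scale
    (naively; no retransformation-bias correction)."""
    X = np.column_stack([np.ones(len(flow)), np.log(flow),
                         np.log(disc_flow), flow_phase(flow)])
    return np.exp(X @ beta)
```

In practice the discounted-flow column would come from a smoothing of past flows, and the model errors would be checked for the autocorrelation that intensive flood sampling induces.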


There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are statistically subject to large biases, and their associated uncertainties are often not reported. This makes interpretation difficult and renders the estimation of trends or the determination of optimal sampling regimes impossible. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates by minimizing the biases and making use of possible predictive variables. The load estimation procedure can be summarized in four steps:

(i) output the flow rates at regular time intervals (e.g. 10 minutes) using a time series model that captures all the peak flows;
(ii) output the predicted flow rates as in (i) at the concentration sampling times, if the corresponding flow rates were not collected;
(iii) establish a predictive model for the concentration data that incorporates all possible predictor variables, and output the predicted concentrations at the regular time intervals as in (i); and
(iv) sum the products of the predicted flow and the predicted concentration over the regular time intervals to obtain an estimate of the load.

The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized regression (rating-curve) approach with additional predictors that capture unique features in the flow data, namely the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and cumulative discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. The model also has the capacity to accommodate autocorrelation in model errors resulting from intensive sampling during floods.
Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. The method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach using concentrations of total suspended sediment (TSS) and nitrogen oxide (NOx) and gauged flow data from the Burdekin River, a catchment delivering to the Great Barrier Reef. The sampling biases for NOx concentrations range from a factor of 2 to 10, indicating severe bias. As expected, the traditional average and extrapolation methods produce much higher estimates than those obtained when sampling bias is taken into account.
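The final summation step of the procedure reduces to a sum of products over the regular intervals. A minimal sketch, assuming flow and concentration have already been predicted on a common grid of fixed-length intervals (names and units are hypothetical; consistent flow-volume and concentration units are assumed):

```python
import numpy as np

def estimate_load(pred_flow, pred_conc, dt_seconds):
    """Total load as the sum of products of predicted flow and predicted
    concentration over regular time intervals of length dt_seconds
    (step (iv) of the procedure; units assumed mutually consistent)."""
    flux = np.asarray(pred_flow, dtype=float) * np.asarray(pred_conc, dtype=float)
    return float(np.sum(flux) * dt_seconds)
```

For example, a constant flow of 2 units and concentration of 3 units over ten 600-second intervals gives a load of 36,000 unit-products.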


The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. 
Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election, offering detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.


This thesis evaluates the effectiveness of the prescribed design and distribution requirements of the Australian Government's home loan key facts sheets (KFS), which are aimed at helping borrowers compare loan costs. The findings show that, although KFS effectively improve borrower decision-making, few borrowers were aware of their existence and function. KFS have also had limited market impact over the four-year window since their introduction, likely because lenders need not provide a KFS unless one is formally requested by the borrower. Recommendations include transferring the burden of disclosure to lenders in the first instance to address this information gap.


The Australian government has recognised the importance of early childhood education and care (ECEC) in recent years. With over one million Australian children accessing early childhood education provision every day (Productivity Commission, 2014), today’s children are a generation who spend a large part of their early years in some form of out-of-home child care. Early chapters in this text have discussed a range of people, theories and approaches that inform the development of ECEC. Early childhood pedagogical practice is an eclectic mix of these ideas. This chapter begins with an overview of the ways young children learn in early childhood education, highlighting play-based learning as a pedagogical response to our understandings about children. Next the chapter outlines areas that have more recently influenced ECEC, including international models of early childhood education, neuroscience, studies of young children, economic research and social justice principles. Drawing on the reflections of educators working in various ECEC contexts, the chapter then presents four topics encountered by educators as part of their everyday work with diverse communities. These topics include:

• the educational program for children in the early years
• relationships and partnerships with diverse families
• professional accountabilities, and
• changing constructions of childhood.


Security models for two-party authenticated key exchange (AKE) protocols have developed over time to capture the security of AKE protocols even when the adversary learns certain secret values. Increased granularity of security can be modelled by considering partial leakage of secrets in the manner of models for leakage-resilient cryptography, which are designed to capture side-channel attacks. In this work, we use the strongest known partial-leakage-based security model for key exchange protocols, namely the continuous after-the-fact leakage eCK (CAFL-eCK) model. We resolve an open problem by constructing the first concrete two-pass leakage-resilient key exchange protocol that is secure in the CAFL-eCK model.


The role of the creative industries – arts and artists – in helping to drive the changes in laws and behaviours that are necessary to tackle climate change, while not superficially obvious, is a deep one. Arts and artists of all kinds, as cultural practitioners, have been closely entwined with social change and social control since time immemorial, in large part because they help shape our understanding of the world, framing ideas, prefiguring change, and opening hearts and minds to new ways of thinking. They have played a major role in campaigns for law reform on many issues, and climate change should be no exception. Indeed, with climate change increasingly being seen as a deeply cultural issue, and its solutions as cultural ones to do with changing the way we understand our world and our place in it, the role of cultural practitioners in helping to address it should also increasingly be seen as central. It is curious, then, how comparatively little artistic engagement with climate change has taken place, how little engagement with the arts the climate movement has attempted, and how little theoretical and critical analysis has been undertaken on the role of the creative arts in climate change action. Through a literature review and a series of interviews with individuals working in relevant fields in Australia, this study examines and evaluates the role of the creative industries in climate change action and places it in a historical and theoretical context. It covers examples of the kind of artistic and activist collaborations that have been undertaken, the different roles in communication, campaigning for law reform, and deep culture change that arts and artists can play, and the risks and dangers inherent in the involvement of artists, both to climate change action and to the artist. 
It concludes that, despite the risks, a deeper and more thoughtful engagement of and by the creative industries in climate action would not only be useful but is perhaps vital to the success of the endeavours.


Older populations are more likely to have multiple co-morbid diseases requiring multiple treatments, which makes them large consumers of medications. As a person grows older, their ability to tolerate medications declines due to age-related changes in pharmacokinetics and pharmacodynamics, often setting them on a path toward frailty. Frail older persons often have multiple co-morbidities with signs of impairment in activities of daily living. Prescribing drugs for these vulnerable individuals is difficult and potentially unsafe. Inappropriate prescribing in the older population can be detected using explicit (criterion-based) or implicit (judgment-based) criteria. Unfortunately, most current therapeutic guidelines are applicable only to healthy older adults and cannot be generalized to frail patients. These discrepancies should be addressed either by developing new criteria or by refining existing tools for frail older people. The first and foremost step is to identify the frail patient in clinical practice by applying clinically validated tools. Once the frail patient has been identified, specific measures or criteria are needed to assess the appropriateness of therapy, taking into account factors such as quality of life, functional status and remaining life expectancy, and thus modified goals of care.


UNCITRAL’s operation as a subsidiary of the UN General Assembly, tasked with unifying and harmonising international trade law, is a necessary and indispensable element of the UN’s mandate to maintain international peace and security. Strong legal frameworks that are compatible with those of international trading partners often accompany accelerated growth in economic capacity and stability. Over time, access to markets and the resultant growth in economic and human development creates a disincentive for instability as incomes and standards of living rise. Human and economic development, facilitated by a modernised and just legal framework available to the broadest range of recipients, goes hand in hand with the maintenance of domestic and regional peace, particularly in regions such as ASEAN, one of the fastest growing in the world, covering approximately 30% of the global population and with a number of strong global economic neighbours including Japan, Korea and China (to the north), Australia (to the south) and Singapore (to the west). In an increasingly interconnected world, the ability of governments, enterprises and individuals to participate in the global supply chain offers opportunities for economic growth and development. Over its almost 50 years of operation, UNCITRAL has produced a range of important texts designed to underpin world trade. A key implicit assumption underpinning the development of UNCITRAL texts is that the texts, once adopted, can and will be applied in adopting states.


The concept of cloud computing services (CCS) is appealing to small and medium enterprises (SMEs). However, while various authorities are pushing SMEs strongly to adopt CCS, knowledge of the key considerations in adopting CCS is very limited. We use the technology-organization-environment (TOE) framework to suggest that a strategic and incremental intent, an understanding of the organizational structure and culture, an understanding of external factors, and consideration of human resource capacity can contribute to sustainable business value from CCS. Using survey data, we find evidence of a positive association between these considerations and CCS-related business objectives, and between the CCS-related business objectives and CCS-related financial objectives. The results suggest that the proposed considerations can help ensure sustainable business value from CCS. This study provides guidance to SMEs on a path to adopting CCS with the intention of a long-term commitment and achieving sustainable business value from these services.