Abstract:
While the implementation of the IEC 61850 standard has significantly enhanced the performance of communications in electrical substations, it has also increased the complexity of the system. These added layers of complexity have introduced new challenges in relation to the skills and tools required for the design, testing and maintenance of 61850-compatible substations. This paper describes a practical experience of testing a protection relay using non-conventional test equipment; in addition, it proposes a third-party software technique to reveal the contents of the packets transferred on the substation network. Using this approach, the standard objects can be linked to, and interpreted in terms of, what end-users normally see in the proprietary software of the IED and the test equipment.
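As an illustration of what such third-party packet inspection can look like in practice, the sketch below captures GOOSE frames (Ethertype 0x88B8) from a substation LAN with the scapy library and prints their header fields. It is a minimal, hypothetical example: the interface name is an assumption, VLAN-tagged frames are not handled, and decoding of the BER-encoded APDU (which is where standard objects would be linked to what end-users see) is left out.

```python
# A minimal, hypothetical sketch of third-party packet inspection on a
# substation LAN using scapy: capture IEC 61850 GOOSE frames (Ethertype
# 0x88B8) and print their header fields. Assumptions: the capture interface
# name is illustrative, frames are untagged (no VLAN), and the BER-encoded
# GOOSE APDU is not decoded here.
from scapy.all import Ether, raw, sniff

GOOSE_ETHERTYPE = 0x88B8

def show_goose(pkt):
    if Ether in pkt and pkt[Ether].type == GOOSE_ETHERTYPE:
        payload = raw(pkt[Ether].payload)
        appid = int.from_bytes(payload[0:2], "big")    # APPID precedes the APDU
        length = int.from_bytes(payload[2:4], "big")   # length of the GOOSE PDU block
        print(f"GOOSE from {pkt[Ether].src}: APPID=0x{appid:04x}, length={length}")

# Requires capture privileges; "eth0" is a placeholder interface name.
sniff(iface="eth0", filter="ether proto 0x88b8", prn=show_goose, store=False)
```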
Abstract:
We implemented least absolute shrinkage and selection operator (LASSO) regression to evaluate gene effects in genome-wide association studies (GWAS) of brain images, using an MRI-derived temporal lobe volume measure from 729 subjects scanned as part of the Alzheimer's Disease Neuroimaging Initiative (ADNI). Sparse groups of SNPs in individual genes were selected by LASSO, which identifies efficient sets of variants influencing the data. These SNPs were considered jointly when assessing their association with neuroimaging measures. We discovered 22 genes that passed genome-wide significance for influencing temporal lobe volume. This was a substantially greater number of significant genes compared to those found with standard, univariate GWAS. These top genes are all expressed in the brain and include genes previously related to brain function or neuropsychiatric disorders such as MACROD2, SORCS2, GRIN2B, MAGI2, NPAS3, CLSTN2, GABRG3, NRXN3, PRKAG2, GAS7, RBFOX1, ADARB2, CHD4, and CDH13. The top genes we identified with this method also displayed significant and widespread post hoc effects on voxelwise, tensor-based morphometry (TBM) maps of the temporal lobes. The most significantly associated gene was an autism susceptibility gene known as MACROD2. We were able to successfully replicate the effect of the MACROD2 gene in an independent cohort of 564 young, healthy Australian adult twins and siblings scanned with MRI (mean age: 23.8±2.2 SD years). Our approach powerfully complements univariate techniques in detecting influences of genes on the living brain.
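To make the selection step concrete, the following is a minimal sketch of gene-wise LASSO SNP selection against an imaging phenotype, using scikit-learn on synthetic data. The subject count mirrors the abstract, but the genotypes, effect sizes and phenotype are invented for illustration, and the joint significance testing of the selected set is not shown.

```python
# Minimal sketch of gene-wise LASSO SNP selection against an imaging
# phenotype, using scikit-learn on synthetic data. The subject count mirrors
# the abstract, but the genotypes, effect sizes and phenotype are invented,
# and the joint significance test of the selected SNP set is not shown.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_subjects, n_snps = 729, 40                       # SNPs within one gene
X = rng.integers(0, 3, size=(n_subjects, n_snps)).astype(float)  # 0/1/2 allele counts
beta = np.zeros(n_snps)
beta[:3] = [0.4, -0.3, 0.2]                        # a few truly associated SNPs
y = X @ beta + rng.normal(size=n_subjects)         # stand-in for temporal lobe volume

# LASSO with a cross-validated penalty keeps a sparse subset of the gene's SNPs;
# that subset is then assessed jointly for association with the phenotype.
model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_ != 0)
print("selected SNP indices:", selected)
```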
No specific role for the manual motor system in processing the meanings of words related to the hand
Abstract:
The present study explored whether semantic and motor systems are functionally interwoven via the use of a dual-task paradigm. According to embodied language accounts that propose an automatic and necessary involvement of the motor system in conceptual processing, concurrent processing of hand-related information should interfere more with hand movements than processing of unrelated body-part (i.e., foot, mouth) information. Across three experiments, 100 right-handed participants performed left- or right-hand tapping movements while repeatedly reading action words related to different body-parts, or different body-part names, in both aloud and silent conditions. Concurrent reading of single words related to specific body-parts, or the same words embedded in sentences differing in syntactic and phonological complexity (to manipulate context-relevant processing), and reading while viewing videos of the actions and body-parts described by the target words (to elicit visuomotor associations) all interfered with right-hand but not left-hand tapping rate. However, this motor interference was not affected differentially by hand-related stimuli. Thus, the results provide no support for proposals that body-part specific resources in cortical motor systems are shared between overt manual movements and meaning-related processing of words related to the hand.
Abstract:
Exotic species dominate many communities; however, the functional significance of species’ biogeographic origin remains highly contentious. This debate is fuelled in part by the lack of globally replicated, systematic data assessing the relationship between species provenance, function and response to perturbations. We examined the abundance of native and exotic plant species at 64 grasslands in 13 countries, and at a subset of the sites we experimentally tested native and exotic species responses to two fundamental drivers of invasion, mineral nutrient supplies and vertebrate herbivory. Exotic species are six times more likely to dominate communities than native species. Furthermore, while experimental nutrient addition increases the cover and richness of exotic species, nutrients decrease native diversity and cover. Native and exotic species also differ in their response to vertebrate consumer exclusion. These results suggest that species origin has functional significance, and that eutrophication will lead to increased exotic dominance in grasslands.
Abstract:
For the first decade of its existence, the concept of citizen journalism has described an approach which was seen as a broadening of the participant base in journalistic processes, but still involved only a comparatively small subset of overall society – for the most part, citizen journalists were news enthusiasts and “political junkies” (Coleman, 2006) who, as some exasperated professional journalists put it, “wouldn’t get a job at a real newspaper” (The Australian, 2007), but nonetheless followed many of the same journalistic principles. The investment – if not of money, then at least of time and effort – involved in setting up a blog or participating in a citizen journalism Website remained substantial enough to prevent the majority of Internet users from engaging in citizen journalist activities to any significant extent; what emerged in the form of news blogs and citizen journalism sites was a new online elite which for some time challenged the hegemony of the existing journalistic elite, but gradually also merged with it. The mass adoption of next-generation social media platforms such as Facebook and Twitter, however, has led to the emergence of a new wave of quasi-journalistic user activities which now much more closely resemble the “random acts of journalism” which JD Lasica envisaged in 2003. Social media are not exclusively or even predominantly used for citizen journalism; instead, citizen journalism is now simply a by-product of user communities engaging in exchanges about the topics which interest them, or tracking emerging stories and events as they happen. Such platforms – and especially Twitter with its system of ad hoc hashtags that enable the rapid exchange of information about issues of interest – provide spaces for users to come together to “work the story” through a process of collaborative gatewatching (Bruns, 2005), content curation, and information evaluation which takes place in real time and brings together everyday users, domain experts, journalists, and potentially even the subjects of the story themselves. Compared to the spaces of news blogs and citizen journalism sites, but also of conventional online news Websites, which are controlled by their respective operators and inherently position user engagement as a secondary activity to content publication, these social media spaces are centred around user interaction, providing a third-party space in which everyday as well as institutional users, laypeople as well as experts converge without being able to control the exchange. Drawing on a number of recent examples, this article will argue that this results in a new dynamic of interaction and enables the emergence of a more broadly-based, decentralised, second wave of citizen engagement in journalistic processes.
Abstract:
Developing nano/micro-structures that can effectively enhance the intriguing properties of electrode materials for energy storage devices is always a key research topic. Ultrathin nanosheets have proved to be one of the most promising nanostructures due to their high specific surface area, good active contact areas and porous channels. Herein, we report a unique hierarchical micro-spherical morphology of well-stacked and completely miscible molybdenum disulfide (MoS2) nanosheets and graphene sheets, successfully synthesized via a simple, industrial-scale spray-drying technique to take advantage of both MoS2 and graphene in terms of their high practical capacity and high electronic conductivity, respectively. Computational studies were performed to understand the interfacial behaviour of MoS2 and graphene, and indicate high stability of the composite, with a high interfacial binding energy (−2.02 eV) between the two components. Further, the lithium and sodium storage properties were tested and reveal excellent cyclic stability over 250 and 500 cycles, respectively, with initial capacities as high as 1300 mAh g−1 and 640 mAh g−1 at 0.1 A g−1.
Abstract:
This article examines the emerging area of civic crowdfunding, a subset of crowdfunding, as a means of financing public interest environmental litigation. The literature surrounding civic crowdfunding and third party litigation funding is currently underdeveloped. The link between those areas and public interest environmental litigation takes a further step into the unknown. As a case study, the Sea Dumping Case presents exciting opportunities for civil society and access to justice, but further research is needed before any firm conclusions can be drawn.
Abstract:
Developed economies are moving from an economy of corporations to an economy of people. More than ever, people produce and share value amongst themselves, and create value for corporations through co-creation and by sharing their data. This data remains in the hands of corporations and governments, but people want to regain control. Digital identity 3.0 gives people that control, and much more. In this paper we describe a concept for a digital identity platform that goes substantially beyond common concepts that merely provide authentication services. Instead, the notion of digital identity 3.0 empowers people to decide who creates, updates, reads and deletes their data, and to bring their own data into interactions with organisations, governments and peers. To the extent that the user allows, this data is updated and expanded through automatic, integrated and predictive learning, enabling trusted third-party providers (e.g., retailers, banks, the public sector) to proactively provide services. Consumers can also add desired meta-data and attribute values to their digital identity, allowing them to design their own personal data record and to facilitate individualised experiences. We discuss the essential features of digital identity 3.0, reflect on relevant stakeholders and outline possible usage scenarios in selected industries.
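The core idea that the person, not the provider, decides who may create, read, update or delete each attribute can be illustrated with a small sketch. This is purely hypothetical code, not part of the paper; party names such as acme-bank and the attribute keys are invented, and a production platform would of course add authentication, auditing and revocation on top of this.

```python
# Purely illustrative sketch (not from the paper): a personal data record in
# which the owner, not the provider, grants create/read/update/delete rights
# per third party. Party names and attribute keys are invented; a real
# platform would add authentication, auditing and revocation.
from dataclasses import dataclass, field

@dataclass
class IdentityRecord:
    owner: str
    attributes: dict = field(default_factory=dict)
    grants: dict = field(default_factory=dict)      # party -> set of allowed operations

    def grant(self, party: str, operations: set) -> None:
        self.grants.setdefault(party, set()).update(operations)

    def read(self, party: str, key: str):
        if "read" not in self.grants.get(party, set()):
            raise PermissionError(f"{party} may not read '{key}'")
        return self.attributes[key]

    def update(self, party: str, key: str, value) -> None:
        if "update" not in self.grants.get(party, set()):
            raise PermissionError(f"{party} may not update '{key}'")
        self.attributes[key] = value

record = IdentityRecord(owner="alice", attributes={"postcode": "4000"})
record.grant("acme-bank", {"read"})                 # the owner decides who sees what
print(record.read("acme-bank", "postcode"))         # allowed: read was granted
try:
    record.update("acme-bank", "postcode", "4001")  # refused: update was never granted
except PermissionError as err:
    print(err)
```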
Abstract:
In the mining optimisation literature, most researchers have focused on two strategic- and tactical-level open-pit mine optimisation problems, termed the ultimate pit limit (UPIT) and constrained pit limit (CPIT) problems, respectively. However, many researchers indicate that the substantial numbers of variables and constraints in real-world instances (e.g., with 50-1000 thousand blocks) make the CPIT mixed integer programming (MIP) model intractable in practice. Thus, it becomes a considerable challenge to solve large-scale CPIT instances without relying on an exact MIP optimiser or on complicated MIP relaxation/decomposition methods. To address this challenge, two new graph-based algorithms based on network flow graphs and conjunctive graph theory are developed by taking advantage of problem properties. The performance of the proposed algorithms is validated on the recent large-scale benchmark UPIT and CPIT instances from the MineLib datasets (2013). Compared with the best known results from MineLib, the proposed algorithms outperform the other CPIT solution approaches in the literature. The proposed graph-based algorithms lead to a more competent mine scheduling optimisation expert system, because a third-party MIP optimiser is no longer indispensable and random neighbourhood search is not necessary.
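For readers unfamiliar with the network-flow view of pit optimisation, the sketch below shows the textbook reduction of the UPIT problem (maximum-weight closure) to a minimum s-t cut, solved here with networkx on a three-block toy example. This illustrates the general idea only; it is not the paper's algorithm and the data is not a MineLib instance.

```python
# Toy sketch of the textbook network-flow formulation of the ultimate pit
# limit (UPIT) problem: maximum-weight closure solved as a minimum s-t cut.
# This illustrates the general idea only; it is not the paper's algorithm,
# and the three-block data below is not a MineLib instance.
import networkx as nx

block_value = {"a": 10.0, "b": -3.0, "c": -2.0}     # ore block a, waste blocks b and c
precedence = [("a", "b"), ("a", "c")]               # a may be mined only if b and c are

G = nx.DiGraph()
for blk, val in block_value.items():
    if val > 0:
        G.add_edge("s", blk, capacity=val)          # source -> profitable blocks
    else:
        G.add_edge(blk, "t", capacity=-val)         # waste blocks -> sink
for lower, upper in precedence:
    G.add_edge(lower, upper, capacity=float("inf")) # precedence arcs must not be cut

cut_value, (source_side, _) = nx.minimum_cut(G, "s", "t")
pit = source_side - {"s"}                           # blocks inside the optimal pit
print("optimal pit:", pit, "value:", sum(block_value[b] for b in pit))
```

Cutting a source arc corresponds to forgoing a profitable block and cutting a sink arc to paying for waste removal, so the minimum cut identifies the most valuable pit that respects all precedence constraints.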
Abstract:
Nanohybrids consisting of both carbon and pseudocapacitive metal oxides are promising as high-performance electrodes to meet the key energy and power requirements of supercapacitors. However, the development of high-performance nanohybrids with controllable size, density, composition and morphology remains a formidable challenge. Here, we present a simple and robust approach to integrating manganese oxide (MnOx) nanoparticles onto flexible graphite paper using an ultrathin carbon nanotube/reduced graphene oxide (CNT/RGO) supporting layer. Supercapacitor electrodes employing the MnOx/CNT/RGO nanohybrids without any conductive additives or binders yield a specific capacitance of 1070 F g−1 at 10 mV s−1, which is among the highest values reported for a range of hybrid structures and is close to the theoretical capacity of MnOx. Moreover, atmospheric-pressure plasmas are used to functionalize the CNT/RGO supporting layer to improve the adhesion of MnOx nanoparticles, which results in the improved cycling stability of the nanohybrid electrodes. These results provide information for the utilization of nanohybrids and plasma-related effects to synergistically enhance the performance of supercapacitors, and may create new opportunities in areas such as catalysis, photosynthesis and electrochemical sensors.
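As context for the 1070 F g−1 figure, specific capacitance from a cyclic voltammogram is conventionally computed as C_sp = ∫|I| dV / (m · ν · ΔV) over one sweep. The snippet below shows that calculation on idealised synthetic data; the mass, current and voltage window are assumptions for illustration, not values from the paper.

```python
# Illustrative calculation of specific capacitance from a cyclic voltammogram,
# C_sp = (integral of |I| dV) / (m * scan_rate * voltage_window), over one
# sweep. The mass, current and voltage window below are assumptions for
# illustration, not data from the MnOx/CNT/RGO electrodes.
import numpy as np

scan_rate = 0.010                              # V s^-1 (10 mV s^-1, as in the abstract)
mass = 0.5e-3                                  # g of active material (assumed)
voltage = np.linspace(0.0, 0.8, 200)           # V, one anodic sweep (assumed window)
current = 4e-3 * np.ones_like(voltage)         # A, idealised rectangular CV response

integral = np.trapz(np.abs(current), voltage)  # integral of |I| dV, in A*V
c_specific = integral / (mass * scan_rate * (voltage[-1] - voltage[0]))
print(f"specific capacitance ~ {c_specific:.0f} F/g")
```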
Abstract:
Patents provide monopoly rights to patent holders. There are safeguards in the patent regime to ensure that the exclusive right of the patent holder is not misused. Compulsory licensing is one of the safeguards provided under TRIPS, under which the patent-granting state may allow a third party to exploit the invention without the patent holder's consent, upon terms and conditions decided by the government. This concept has existed since 1623 and was not introduced for the first time by TRIPS, but the mechanism has undergone significant changes, especially in the post-TRIPS era. The history of the evolution of compulsory licensing is one of the least explored areas of intellectual property law. This paper undertakes an analysis of the different phases in the evolution of the compulsory licensing mechanism and sheds light on the reasons behind these developments, especially after TRIPS.
Abstract:
Phenotypic convergence is thought to be driven by parallel substitutions coupled with natural selection at the sequence level. Multiple independent evolutionary transitions of mammals to an aquatic environment offer an opportunity to test this thesis. Here, whole genome alignment of coding sequences identified widespread parallel amino acid substitutions in marine mammals; however, the majority of these changes were not unique to these animals. Conversely, we report that candidate aquatic adaptation genes, identified by signatures of likelihood convergence and/or elevated ratio of nonsynonymous to synonymous nucleotide substitution rate, are characterized by very few parallel substitutions and exhibit distinct sequence changes in each group. Moreover, no significant positive correlation was found between likelihood convergence and positive selection in all three marine lineages. These results suggest that convergence in protein coding genes associated with aquatic lifestyle is mainly characterized by independent substitutions and relaxed negative selection.
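To make the notion of a parallel substitution concrete, the toy sketch below flags alignment sites at which two or more independent marine lineages share the same derived amino acid relative to an assumed ancestral state. The lineage names, sites and residues are invented for illustration; the paper itself works from whole-genome alignments and likelihood-based convergence tests.

```python
# Toy illustration of what counts as a parallel substitution: a site where two
# or more independent marine lineages carry the same derived amino acid
# relative to an assumed ancestral state. Lineages, sites and residues are
# invented; the paper works from whole-genome alignments and likelihood tests.
from collections import Counter

# site -> {lineage: (ancestral_aa, derived_aa)}
sites = {
    101: {"cetacean": ("A", "S"), "pinniped": ("A", "S"), "sirenian": ("A", "A")},
    212: {"cetacean": ("T", "M"), "pinniped": ("T", "I"), "sirenian": ("T", "T")},
}

for site, lineages in sites.items():
    derived = [d for a, d in lineages.values() if d != a]           # lineages that changed
    shared = [aa for aa, n in Counter(derived).items() if n >= 2]   # same derived residue in 2+ lineages
    if shared:
        print(f"site {site}: parallel substitution to {shared}")    # only site 101 qualifies
```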
Abstract:
The Australian Naturalistic Driving Study (ANDS), a ground-breaking study of Australian driver behaviour and performance, was officially launched on April 21st, 2015 at UNSW. The ANDS project will provide a realistic perspective on the causes of vehicle crashes and near-miss crash events, along with the roles speeding, distraction and other factors play in such events. A total of 360 volunteer drivers across NSW and Victoria - 180 in NSW and 180 in Victoria - will have their driving behaviour monitored continuously for four months by a Data Acquisition System (DAS) using a suite of cameras and sensors. Participants’ driving behaviour (e.g. gaze), the behaviour of their vehicle (e.g. speed, lane position) and the behaviour of other road users with whom they interact in normal and safety-critical situations will be recorded. Planning of the ANDS commenced over two years ago, in June 2013, when the Multi-Institutional Agreement for a grant supporting the equipment purchase and assembly phase was signed by the parties involved in this large-scale $4 million study (5 university accident research centres, 3 government regulators, 2 third-party insurers and 2 industry partners). The program’s second development phase commenced a year later, in June 2014, after a second grant was awarded. This paper presents an insider's view of that two-year process leading up to the launch, and outlines issues that arose in the set-up phase of the study and how these were addressed. This information will be useful to other organisations considering setting up an NDS.
Abstract:
Melanopsin-containing intrinsically photosensitive retinal ganglion cells (ipRGCs) mediate the pupil light reflex (PLR) during light onset and at light offset (the post-illumination pupil response, PIPR). Recent evidence shows that the PLR and PIPR can provide non-invasive, objective markers of age-related retinal and optic nerve disease; however, there is no consensus on the effects of healthy ageing or refractive error on ipRGC-mediated pupil function. Here we isolated melanopsin contributions to the pupil control pathway in 59 human participants with no ocular pathology across a range of ages and refractive errors. We show that there is no effect of age or refractive error on ipRGC inputs to the human pupil control pathway. The stability of the ipRGC-mediated pupil response across the human lifespan provides a functional correlate of the robustness of these cells observed during ageing in rodent models.
Abstract:
The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election which will provide detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
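As a concrete illustration of the tweeting-style breakdown described above (originals versus @replies versus retweets per account), the sketch below classifies a handful of synthetic example tweets and reports the per-account mix. The tweets are placeholders, the "RT @" prefix heuristic is a simplification of how retweets appear in collected data, and none of this is drawn from the actual 2012 campaign dataset or the authors' analytics tools.

```python
# Synthetic illustration of the tweeting-style breakdown discussed above:
# classify each tweet as a retweet, an @reply or an original post and report
# the per-account mix. The example tweets are placeholders, the "RT @" prefix
# rule is a simplification, and none of this is the authors' dataset or tools.
from collections import Counter, defaultdict

tweets = [
    {"account": "@BarackObama", "text": "RT @Obama2012: Early voting starts today."},
    {"account": "@BarackObama", "text": "Four more years."},
    {"account": "@MittRomney",  "text": "@PaulRyanVP Great event in Ohio tonight."},
]

def style(text: str) -> str:
    if text.startswith("RT @"):
        return "retweet"
    if text.startswith("@"):
        return "reply"
    return "original"

mix = defaultdict(Counter)
for tweet in tweets:
    mix[tweet["account"]][style(tweet["text"])] += 1

for account, counts in mix.items():
    total = sum(counts.values())
    print(account, {kind: f"{n / total:.0%}" for kind, n in counts.items()})
```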