Abstract:
Australia is in the process of making the most important change to its health care system since the implementation of Medicare.1 We agree with Cameron and Cooke that there are important lessons for Australia from the implementation of the 4-hour rule in the United Kingdom. As in Robert Zemeckis's 1985 movie classic, Back to the Future, the old question of "If I had the opportunity to do something again, what would I have done differently?" applies. We challenge the assumption that Australia is embarking on something that the UK has recently abandoned. The UK has not actually abandoned the 4-hour rule but has expanded it into a suite of eight indicators that include three time-based measures, among them total time in the emergency department (ED).
Abstract:
Make-buy decisions are an important aspect of the overall strategic plans for most firms, and the introduction of a new and potentially radical technology into an industry should therefore be a cue for managers to review their make-buy policies. Should a company make in-house the components and processes underpinning the technology, or should it buy them from an outside supplier? Earlier attempts to answer this question may have failed to agree on a single verdict because they have overlooked two important market forces: supplier relations and industry clockspeed. Based on an intensive three-year study at the University of Cambridge which analyzed supply chain management practices from a broad range of manufacturers around the world, this book helps to resolve this classic technology outsourcing dilemma and gives managers the tools they will need to determine if they should make or buy the components and processes that go into a potentially radical innovation.
Abstract:
Textual documents have become an important and rapidly growing information source on the web. Text classification is one of the crucial technologies for information organisation and management, and it has attracted wide attention from researchers in different fields. In this paper, feature selection methods, implementation algorithms, and applications of text classification are first introduced. However, because the knowledge extracted by current data-mining techniques for text classification contains much noise, considerable uncertainty arises in the classification process, stemming from both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve performance. Further improving the process of knowledge extraction and the effective utilisation of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed that uses Rough Set decision techniques to classify more precisely those textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and a decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, that is effective for performance assessment in similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining, and related fields.
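To make the surveyed pipeline concrete, here is a minimal sketch of feature selection followed by a simple classifier. The toy corpus, labels, and document-frequency threshold are illustrative assumptions, not material from the paper, and plain Naive Bayes stands in for the paper's Rough Set approach:

```python
# Minimal text-classification pipeline: document-frequency feature selection
# followed by multinomial Naive Bayes with Laplace smoothing.
from collections import Counter, defaultdict
import math

docs = [("cheap pills buy now", "spam"),
        ("meeting agenda attached", "ham"),
        ("buy cheap meds", "spam"),
        ("project meeting notes", "ham")]

# Feature selection: keep terms appearing in at least min_df documents.
min_df = 1
df = Counter(t for text, _ in docs for t in set(text.split()))
vocab = {t for t, c in df.items() if c >= min_df}

# Train: per-class priors and smoothed term counts.
class_counts = Counter(label for _, label in docs)
term_counts = defaultdict(Counter)
for text, label in docs:
    for t in text.split():
        if t in vocab:
            term_counts[label][t] += 1

def classify(text):
    words = [t for t in text.split() if t in vocab]
    def log_score(label):
        total = sum(term_counts[label].values())
        prior = math.log(class_counts[label] / len(docs))
        return prior + sum(
            math.log((term_counts[label][t] + 1) / (total + len(vocab)))
            for t in words)
    return max(class_counts, key=log_score)

print(classify("buy cheap pills"))
```

The same skeleton applies whatever classifier replaces the final stage; the paper's point is that a Rough Set decision step can handle the hard-to-separate documents this kind of classic pipeline misclassifies.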
Abstract:
Ever since sodium fluorescein ('fluorescein' [FL]) was first used to investigate the ocular surface over a century ago, the term 'staining' has been taken to mean the presence of ocular surface fluorescence [1]. This term has not necessarily been taken to infer any particular mechanism of causation, and indeed can be attributed to a variety of possible aetiologies [2]. In recent times, there has been considerable interest in a form of ocular surface fluorescence seen in association with the use of certain combinations of soft contact lenses and multipurpose solutions. The first clinical account of this phenomenon was reported by Jones et al. [3], which was followed by a more formal investigation by the same author in 2002 [4]. Jones et al. described this appearance as a 'classic solution-based toxicity reaction'. Subsequently, this appearance has come to be known as 'solution-induced corneal staining', or more recently by the acronym 'SICS' [5]. The term SICS is potentially problematic in that, from a cell biology point of view, there is an inference that 'staining' means the entry of a dye into corneal epithelial cells. Morgan and Maldonado-Codina [2] noted there was no foundation of solid scientific literature underpinning our understanding of the true basic causative mechanisms of this phenomenon; since that time, further work has been published in this field [6,7], but questions still remain about the precise aetiology of this phenomenon...
Abstract:
We present a technique for delegating a short lattice basis that has the advantage of keeping the lattice dimension unchanged upon delegation. Building on this result, we construct two new hierarchical identity-based encryption (HIBE) schemes, with and without random oracles. The resulting systems are very different from earlier lattice-based HIBEs and in some cases result in shorter ciphertexts and private keys. We prove security from classic lattice hardness assumptions.
Abstract:
In this work, we summarise the development of a ranking principle based on quantum probability theory, called the Quantum Probability Ranking Principle (QPRP), and we also provide an overview of the initial experiments performed employing the QPRP. The main difference between the QPRP and the classic Probability Ranking Principle is that the QPRP implicitly captures the dependencies between documents by means of 'quantum interference'. Subsequently, the optimal ranking of documents is based not solely on the documents' probability of relevance but also on their interference with the previously ranked documents. Our research shows that applying quantum theory to problems within information retrieval can lead to consistently better retrieval effectiveness, while still being simple, elegant and tractable.
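As an illustration of the ranking idea, the toy sketch below ranks documents greedily by probability of relevance plus an interference term with the already-ranked documents. The probabilities, the phases, and the 2·sqrt(p_a·p_b)·cos(theta) interference form are illustrative assumptions, not the paper's actual experimental set-up:

```python
# QPRP-style greedy ranking: at each step pick the document whose probability
# of relevance plus accumulated interference with already-ranked documents is
# highest. Toy probabilities and phases; similar documents are given phase pi
# so they interfere destructively.
import math

p = {"d1": 0.8, "d2": 0.7, "d3": 0.6}        # probability of relevance
phase = {("d1", "d2"): math.pi,              # d1 and d2 are near-duplicates
         ("d1", "d3"): 0.0,
         ("d2", "d3"): math.pi / 2}

def interference(a, b):
    th = phase.get((a, b), phase.get((b, a), math.pi / 2))
    return 2 * math.sqrt(p[a] * p[b]) * math.cos(th)

def qprp_rank(docs):
    ranked, remaining = [], set(docs)
    while remaining:
        best = max(remaining,
                   key=lambda d: p[d] + sum(interference(d, r) for r in ranked))
        ranked.append(best)
        remaining.remove(best)
    return ranked

print(qprp_rank(["d1", "d2", "d3"]))
```

Ranking by probability of relevance alone would give d1, d2, d3; here the destructive interference between the similar documents d1 and d2 demotes d2 below d3, which is the diversity-promoting behaviour the abstract describes.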
Abstract:
Unsaturated water flow in soil is commonly modelled using Richards' equation, which requires the hydraulic properties of the soil (e.g., porosity, hydraulic conductivity, etc.) to be characterised. Naturally occurring soils, however, are heterogeneous in nature; that is, they are composed of a number of interwoven homogeneous soils, each with its own set of hydraulic properties. When the length scale of these soil heterogeneities is small, numerical solution of Richards' equation is computationally impractical due to the immense effort and refinement required to mesh the actual heterogeneous geometry. A classic way forward is to use a macroscopic model, where the heterogeneous medium is replaced with a fictitious homogeneous medium that attempts to give the average flow behaviour at the macroscopic scale (i.e., at a scale much larger than the scale of the heterogeneities). Using homogenisation theory, a macroscopic equation can be derived that takes the form of Richards' equation with effective parameters. A disadvantage of the macroscopic approach, however, is that it fails in cases where the assumption of local equilibrium does not hold. This limitation has seen the introduction of two-scale models that include, at each point in the macroscopic domain, an additional flow equation at the scale of the heterogeneities (the microscopic scale). This report outlines a well-known two-scale model and contributes to the literature a number of important advances in its numerical implementation. These include the use of an unstructured control volume finite element method and image-based meshing techniques, which allow irregular micro-scale geometries to be treated, and the use of an exponential time integration scheme that permits both scales to be resolved simultaneously in a completely coupled manner.
Numerical comparisons against a classical macroscopic model confirm that only the two-scale model correctly captures the important features of the flow for a range of parameter values.
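As a toy illustration of exponential time integration, the sketch below advances a linear semi-discrete diffusion system u' = Au exactly over one step using an eigendecomposition-based matrix exponential. The 1D grid, conductivity, and time step are illustrative assumptions that stand in for, rather than reproduce, the report's two-scale Richards discretisation:

```python
# Exponential time-integration step for a semi-discrete diffusion system
# u' = A u (a linearised stand-in for Richards' equation on a 1D grid).
import numpy as np

n, dt, k = 20, 0.05, 1.0                     # grid points, step, conductivity
h = 1.0 / (n + 1)

# Second-order finite-difference Laplacian with zero Dirichlet boundaries.
A = (k / h**2) * (np.diag(-2.0 * np.ones(n))
                  + np.diag(np.ones(n - 1), 1)
                  + np.diag(np.ones(n - 1), -1))

def expm_step(A, u, dt):
    """Advance u' = A u exactly over dt via eigendecomposition (A symmetric)."""
    w, V = np.linalg.eigh(A)
    return V @ (np.exp(w * dt) * (V.T @ u))

u = np.sin(np.pi * h * np.arange(1, n + 1))  # smooth initial moisture profile
u1 = expm_step(A, u, dt)

# The initial profile is an eigenmode of A, so the step should reproduce the
# exact decay factor exp(-lambda_1 * dt).
lam1 = (2 * k / h**2) * (1 - np.cos(np.pi * h))
print(np.allclose(u1, np.exp(-lam1 * dt) * u))
```

The attraction of such schemes, as the report exploits, is that the step is exact for the linear part regardless of stiffness, so the fast micro-scale dynamics impose no stability restriction on dt.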
Abstract:
The sublimely cool and smouldering elegance of Melody Gardot's songs is the inspiration and driving force of DANCE TO GARDOT. This work, in 11 sketches, echoes the sultry moods and yearning sensations of Gardot's tunes - intimations of suffering and resolve. The rich-toned voice of Gardot, a classic jazz chanteuse, along with luscious orchestrations, provides a stylish sonic expression of the place where music meets life.
Abstract:
Traction force microscopy (TFM) is commonly used to estimate cells' traction forces from the deformation that they cause on their substrate. The accuracy of TFM depends strongly on the computational methods used to measure the deformation of the substrate and estimate the forces, and also on the specifics of the experimental set-up. Computer simulations can be used to evaluate the effect of both the computational methods and the experimental set-up without the need to perform numerous experiments. Here, we present one such TFM simulator that addresses several limitations of the existing ones. As a proof of principle, we recreate a TFM experimental set-up and apply a classic 2D TFM algorithm to recover the forces. In summary, our simulator provides a valuable tool to study the performance of TFM methods, to refine experimental set-ups, and to guide the extraction of biological conclusions from TFM experiments.
Abstract:
Restriction fragment length polymorphisms have been used to determine the chromosomal location of the genes encoding the glycine decarboxylase complex (GDC) and serine hydroxymethyltransferase (SHMT) of pea leaf mitochondria. The genes encoding the H subunit of GDC and the genes encoding SHMT both show linkage to the classical group I marker i. In addition, the genes for the P protein of GDC show linkage to the classic group I marker a. The genes for the L and T proteins of GDC are linked to one another and are probably situated on the satellite of chromosome 7. The mRNAs encoding the five polypeptides that make up GDC and SHMT are strongly induced when dark-grown etiolated pea seedlings are placed in the light. Similarly, when mature plants are placed in the dark for 48 h, the levels of both GDC protein and SHMT mRNAs decline dramatically and then are induced strongly when these plants are returned to the light. During both treatments a similar pattern of mRNA induction is observed, with the mRNA encoding the P protein of GDC being the most rapidly induced and the mRNA for the H protein the slowest. Whereas during the greening of etiolated seedlings the polypeptides of GDC and SHMT show patterns of accumulation similar to those of the corresponding mRNAs, very little change in the level of the polypeptides is seen when mature plants are placed in the dark and then re-exposed to the light.
Abstract:
A multiscale approach that bridges the biophysics of actin molecules at the nanoscale and the biomechanics of actin filaments at the microscale is developed and used to evaluate the mechanical performance of actin filament bundles. In order to investigate the contractile properties of skeletal muscle, which are driven by the myosin motor protein, a molecular model based on the classic sliding filament model is proposed to predict the dynamic behaviour of skeletal muscle. Randomly distributed myosin motors are applied to a 2.2 μm long sarcomere, whose principal components are actin and myosin filaments. It is found that the more myosin motors act on the sarcomere, the faster the sarcomere contracts. The results also demonstrate that the sarcomere shortening speed cannot increase indefinitely through the modulation of myosin, thus providing insight into the self-protective properties of skeletal muscles. This molecular filament sliding model provides a theoretical way to evaluate the properties of skeletal muscles and contributes to the understanding of the molecular mechanisms underlying the physiological phenomenon of muscular contraction.
Abstract:
Solving indeterminate algebraic equations in integers is a classic topic in the mathematics curricula across grades. At the undergraduate level, the study of solutions of non-linear equations of this kind can be motivated by the use of technology. This article shows how the unity of geometric contextualization and spreadsheet-based amplification of this topic can provide a discovery experience for prospective secondary teachers and information technology students. Such experience can be extended to include a transition from a computationally driven conjecturing to a formal proof based on a number of simple yet useful techniques.
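In the same spirit of computationally driven conjecturing, a short brute-force search for integer solutions can precede a formal proof. The choice of equation (Pythagorean triples, x² + y² = z²) and all names below are illustrative, not necessarily those used in the article:

```python
# Brute-force search for primitive integer solutions of x^2 + y^2 = z^2,
# the kind of experiment a spreadsheet or short program makes accessible
# to prospective teachers before any formal argument.
from math import gcd, isqrt

def triples(limit):
    """Primitive Pythagorean triples (x, y, z) with x < y and z <= limit."""
    found = []
    for z in range(1, limit + 1):
        for x in range(1, z):
            y2 = z * z - x * x
            y = isqrt(y2)
            if y * y == y2 and x < y and gcd(x, y) == 1:
                found.append((x, y, z))
    return found

print(triples(30))
```

Listing the results for small bounds suggests the classical parametrisation (m² − n², 2mn, m² + n²), which can then be established formally, mirroring the article's transition from computational conjecturing to proof.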
Abstract:
Notwithstanding the problems with identifying audiences (cf. Hartley, 1987) or with sampling them (cf. Turner, 2005), we contend that by using social media it is at least possible to gain an understanding of the habits of those who choose to engage with content through social media. In this chapter, we will broadly outline the ways in which networks such as Twitter and Facebook can stand as proxies for audiences in a number of scenarios, and enable content creators, networks and researchers to understand the ways in which audiences come into existence, change over time, and engage with content. Beginning with the classic audience - television - we will consider the evolution of metrics from baseline volume metrics to the more sophisticated 'telemetrics' that are the focus of our current work. We discuss the evolution of these metrics from principles developed in the field of 'sabermetrics', and highlight their effectiveness as both a predictor and a baseline for producers and networks to measure the success of their social media campaigns. Moving beyond the evaluation of the audience's engagement, we then consider the 'audiences' themselves. Building on Hartley's argument that audiences are "imagined" constructs (1987, p. 125), we demonstrate the continual shift of Australian television audiences, from episode to episode and series to series, demonstrating through our map of the Australian Twittersphere (Bruns, Burgess & Highfield, 2014) both the variation amongst those who directly engage with television content and those who are exposed to it through their social media networks. Further, by exploring overlaps between sporting events (such as the NRL and AFL Grand Finals), reality TV (such as Big Brother, My Kitchen Rules & Biggest Loser), soaps (e.g. Bold & The Beautiful, Home & Away), and current affairs programming (e.g. Morning Television & A Current Affair), we discuss to what extent it is possible to profile and categorize Australian television audiences. Finally, we move beyond television audiences to consider audiences around social media platforms themselves. Building on our map of the Australian Twittersphere (Bruns, Burgess & Highfield, 2014) and a pool of 5000 active Australian accounts, we discuss the interconnectedness of audiences around particular subjects and how specific topics spread throughout the Twitter user base. Also, by using Twitter as a proxy, we consider the careers of a number of popular YouTubers, utilizing a method we refer to as Twitter Accession charts (Bruns & Woodford, 2014) to identify their growth curves and relate them to specific events in the YouTubers' careers, be they 'viral' videos or collaborations, to discuss how audiences form around specific content creators.
Abstract:
An approach is proposed and applied to five industries to show how phenomenology can be valuable in rethinking consumer markets (Popp & Holt, 2013). The purpose of this essay is to highlight the potential implications that 'phenomenological thinking' brings for competitiveness and innovation (Sanders, 1982), hence helping managers be more innovative in their strategic marketing decisions (i.e. market creation, positioning, branding). Phenomenology is in fact a way of thinking - besides and before being a qualitative research procedure - a very practical exercise that strategic managers can master and apply in the same successful way as other scientists have already done in their fields of study (e.g. sociology, psychology, psychiatry, and anthropology). Two fundamental considerations justify this research: a lack of distinctiveness among firms due to high levels of competition, and consumers who no longer know what they want (i.e. no more needs). The authors show how the classical mental framework generally used by practitioners to study markets appears on the one hand to be established and systematic in the life of a company, while on the other it is no longer adequate to meet the needs of innovation required to survive. To the classic principles of objectivity, generality, and psycho-sociology the authors counterpose the imaginary, eidetic-phenomenological reduction, and an existential perspective. From a theoretical point of view, this paper introduces a set of functioning rules applicable to achieving innovation in any market and useful for identifying cultural practices inherent in the act of consumption.