954 results for Lineage Specification
Abstract:
The suitability of Role Based Access Control (RBAC) is being challenged in dynamic environments like healthcare. In an RBAC system, a user's legitimate access may be denied if their need was not anticipated by the security administrator at the time of policy specification. Alternatively, even when the policy is correctly specified, an authorised user may accidentally or intentionally misuse the granted permission. The heart of the challenge is the intrinsic unpredictability of users' operational needs as well as their incentives to misuse permissions. In this paper we propose a novel Budget-aware Role Based Access Control (B-RBAC) model that extends RBAC with the explicit notions of budget and cost, where users are assigned a limited budget through which they pay for the cost of the permissions they need. We propose a model where the values of resources are explicitly defined and the RBAC policy is used as a reference point to discriminate the price of access permissions, as opposed to representing hard and fast rules for making access decisions. This approach has several desirable properties. It enables users to acquire unassigned permissions if they deem them necessary. However, users' misuse capability is always bounded by their allocated budget and is further adjustable through the discrimination of permission prices. Finally, it provides a uniform mechanism for the detection and prevention of misuse.
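As a rough illustration of the budget-and-cost mechanism described above, the following Python sketch prices permissions against an RBAC policy and debits a per-user budget. The class, prices and surcharge rule are invented here for illustration; they are not taken from the paper.

```python
# Illustrative sketch of a budget-aware RBAC check (invented names/prices,
# not the paper's actual model): the RBAC policy discriminates prices
# rather than flatly allowing or denying access.

class BudgetAwareRBAC:
    def __init__(self, role_permissions, base_prices, surcharge=5.0):
        self.role_permissions = role_permissions  # role -> set of permissions
        self.base_prices = base_prices            # permission -> base cost
        self.surcharge = surcharge                # extra cost for unassigned permissions
        self.budgets = {}                         # user -> remaining budget

    def assign_budget(self, user, amount):
        self.budgets[user] = amount

    def price(self, role, permission):
        """Permissions assigned by the RBAC policy are cheap; unassigned
        ones carry a surcharge instead of being denied outright."""
        base = self.base_prices[permission]
        if permission in self.role_permissions.get(role, set()):
            return base
        return base + self.surcharge

    def request(self, user, role, permission):
        """Grant access iff the user can pay; misuse is bounded by budget."""
        cost = self.price(role, permission)
        if self.budgets.get(user, 0.0) >= cost:
            self.budgets[user] -= cost
            return True
        return False

rbac = BudgetAwareRBAC({"nurse": {"read_chart"}},
                       {"read_chart": 1.0, "read_history": 2.0})
rbac.assign_budget("alice", 10.0)
print(rbac.request("alice", "nurse", "read_chart"))    # True: assigned, costs 1.0
print(rbac.request("alice", "nurse", "read_history"))  # True: unassigned, costs 2.0 + 5.0
```

Note how the second request succeeds even though the role does not assign the permission; the surcharge and the finite budget are what bound potential misuse.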
Abstract:
Recent studies demonstrated that the endogenous expression levels of Sox2, Oct-4 and c-Myc are correlated with pluripotency and the successful induction of induced pluripotent stem cells (iPSCs). Periodontal ligament cells (PDLCs) have multi-lineage differentiation capability and the ability to maintain an undifferentiated stage, which makes PDLCs a suitable cell source for tissue repair and regeneration. To elucidate the effect of in vitro culture conditions on the stemness potential of PDLCs, we explored cell growth, proliferation, cell cycle, and the expression of Sox2, Oct-4 and c-Myc in PDLCs from passage 1 to 7 with or without the addition of recombinant human BMP4 (rhBMP4). Our results revealed that rhBMP4 promoted cell growth and proliferation, arrested PDLCs in the S phase of the cell cycle and upregulated the PI value. Without the addition of rhBMP4, the expression of Sox2, Oct-4 and c-Myc in PDLCs maintained nuclear localisation only until passage 3, and was subsequently lost from the nucleus. The mRNA expression in PDLCs further confirmed that the levels of Sox2 and Oct-4 peaked at passage 3 and then decreased, whereas c-Myc remained consistently upregulated across passages. After treatment with rhBMP4, the expression of Sox2, Oct-4 and c-Myc in PDLCs maintained nuclear localisation even at passage 7, and the mRNA expression of Sox2 and Oct-4 was significantly upregulated at passages 5 and 7. These results demonstrate that the addition of rhBMP4 to the culture media could improve the current culture conditions for PDLCs to maintain an undifferentiated stage.
Implementation Guide for Surveillance of Staphylococcus aureus Bacteraemia -- [Consultation Edition]
Abstract:
The Implementation Guide for the Hospital Surveillance of SAB has been produced by the Healthcare Associated Infection (HAI) Technical Working Group of the Australian Commission on Safety and Quality in Health Care (ACSQHC), and endorsed by the HAI Advisory Group. The Technical Working Group is made up of representatives invited from surveillance units and the ACSQHC, who have had input into the preparation of this Guide. The Guide has been developed to ensure consistency in the reporting of SAB across public and private hospitals, to enable accurate national reporting and benchmarking. It is intended to be used by Australian hospitals and organisations to support the implementation of healthcare associated Staphylococcus aureus bacteraemia (SAB) surveillance using the endorsed case definition in the box below and the further detail in the Data Set Specification.
Abstract:
There has been a recent surge of interest in cooking skills in a diverse range of fields, such as health, education and public policy. There appears to be an assumption that cooking skills are in decline and that this is having an adverse impact on individual health and well-being, and on family wholesomeness. The problematisation of cooking skills is not new, and can be seen in a number of historical developments that have specified particular pedagogies about food and eating. The purpose of this paper is to examine pedagogies on cooking skills and the importance accorded to them. The paper draws on Foucault's work on governmentality. Using examples from the USA, UK and Australia, the paper demonstrates the ways that authoritative discourses on the know-how and the know-what about food and cooking – called here 'savoir fare' – are developed and promulgated. These discourses, and the moral panics in which they are embedded, require individuals to make choices about what to cook and how to cook it, and in doing so establish moral pedagogies concerning good and bad cooking. The development of food literacy programmes, which see cooking skills as life skills, further extends the obligation to 'cook properly' to wider populations. The emphasis on cooking knowledge and skills has ushered in new forms of government: first, through a relationship between expertise and politics, readily visible in the authority that underpins the need to develop skills in food provisioning and preparation; secondly, through a new pluralisation of 'social' technologies, which invites a range of private-public interest through, for example, television cooking programmes featuring cooking skills, albeit set in a particular milieu of entertainment; and lastly, through a new specification of the subject, seen in the formation of a choosing subject, one which has to problematise food choice in relation to expert advice and guidance. A governmentality focus shows that as discourses develop about what is the correct level of 'savoir fare', new discursive subject positions are opened up. Armed with an understanding of what is considered expert-endorsed acceptable food knowledge, subjects judge themselves through self-surveillance. The result is a powerful food and family morality that is both disciplined and disciplinary.
Abstract:
Educators are faced with many challenging questions in designing an effective curriculum. What prerequisite knowledge do students have before commencing a new subject? At what level of mastery? What is the spread of capabilities between bare-passing students and the top-performing group? How does the intended learning specification compare to student performance at the end of a subject? In this paper we present a conceptual model that helps in answering some of these questions. It has the following main capabilities: capturing the learning specification in terms of syllabus topics and outcomes; capturing mastery levels to model progression; capturing the minimal vs. aspirational learning design; capturing confidence and reliability metrics for each of these mappings; and finally, comparing and reflecting on the learning specification against actual student performance. We present a web-based implementation of the model, and validate it by mapping the final exams from four programming subjects against the ACM/IEEE CS2013 topics and outcomes, using Bloom's Taxonomy as the mastery scale. We then import the itemised exam grades of 632 students across the four subjects and compare the demonstrated student performance against the expected learning for each. The key contributions of this work are the validated conceptual model for capturing and comparing expected learning vs. demonstrated performance, and a web-based implementation of this model, which is made freely available online as a community resource.
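The kind of mapping the model captures can be sketched roughly as follows. The field names, Bloom indices and comparison rule below are illustrative assumptions, not the paper's actual schema.

```python
# A rough sketch (invented schema) of mapping an exam item to a topic
# with minimal vs. aspirational mastery levels and a confidence metric,
# then comparing demonstrated performance against the learning design.
from dataclasses import dataclass

BLOOM = ["remember", "understand", "apply", "analyse", "evaluate", "create"]

@dataclass
class TopicMapping:
    exam_item: str
    topic: str            # e.g. an ACM/IEEE CS2013 topic
    minimal: int          # Bloom index expected of a bare-passing student
    aspirational: int     # Bloom index expected of top performers
    confidence: float     # mapper's confidence in this mapping, 0..1

def compare(mapping: TopicMapping, demonstrated: int) -> str:
    """Compare a demonstrated mastery level (a Bloom index) to the design."""
    if demonstrated >= mapping.aspirational:
        return "meets aspirational level"
    if demonstrated >= mapping.minimal:
        return "meets minimal level"
    return "below minimal level"

m = TopicMapping("Q3: recursion trace", "PL/Functional Programming", 2, 3, 0.8)
print(compare(m, demonstrated=BLOOM.index("apply")))  # meets minimal level
```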
Abstract:
The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students' conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed separately for correctness and 'coverage', respectively. The results allowed us to directly compare students' abilities to characterise a computational problem, as a unit test suite, and to develop a corresponding solution, as executable code. Since understanding a problem is a prerequisite to solving it, we expected students' unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students' testing abilities lag well behind their coding skills.
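A toy example of a unit test suite acting as a requirements specification, in the spirit of the experiments described above; the problem, function name and tests are invented for illustration and are not from the study.

```python
# The test class below specifies the required behaviour; in test-first
# practice it would be written before the implementation above it.
import unittest

def run_length_encode(s: str) -> str:
    """Candidate implementation that the tests below specify."""
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append(f"{s[i]}{j - i}")
        i = j
    return "".join(out)

class RunLengthSpec(unittest.TestCase):
    def test_empty_string(self):
        self.assertEqual(run_length_encode(""), "")

    def test_mixed_runs(self):
        self.assertEqual(run_length_encode("aaabcc"), "a3b1c2")

    def test_no_repeats(self):
        self.assertEqual(run_length_encode("abc"), "a1b1c1")

if __name__ == "__main__":
    unittest.main()
```

Assessing the implementation for correctness and the test class for coverage, separately, mirrors the study's design.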
Abstract:
This note examines the productive efficiency of 62 starting guards during the 2011/12 National Basketball Association (NBA) season. This period coincides with the phenomenal and largely unanticipated performance of New York Knicks' starting point guard Jeremy Lin and the attendant public and media hype known as Linsanity. We employ a data envelopment analysis (DEA) approach that includes allowance for an undesirable output, here turnovers per game, alongside the desirable outputs of points, rebounds, assists, steals and blocks per game and an input of minutes per game. The results indicate that, depending upon the specification, between 29% and 42% of NBA guards are fully efficient, including Jeremy Lin, with mean inefficiencies of 3.7% and 19.2%, respectively. However, while Jeremy Lin is technically efficient, he seldom serves as a benchmark for inefficient players, at least when compared with established players such as Chris Paul and Dwyane Wade. This suggests the uniqueness of Jeremy Lin's productive solution and may explain why his unique style of play, encompassing individual brilliance, unselfish play and team leadership, is of such broad public appeal.
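For readers unfamiliar with DEA, the following is a minimal Python sketch of an input-oriented CCR model solved as a linear program, with the undesirable output (turnovers) folded in as an input, which is one common workaround; the paper's exact DEA specification may differ, and all player data below are invented.

```python
# Minimal input-oriented CCR DEA sketch. Each DMU (player) gets an
# efficiency score theta in (0, 1]; theta == 1 means fully efficient.
import numpy as np
from scipy.optimize import linprog

inputs = np.array([          # minutes/game, turnovers/game (undesirable output as input)
    [35.2, 3.6],
    [33.1, 2.2],
    [28.4, 1.9],
])
outputs = np.array([         # points, rebounds, assists, steals, blocks per game
    [18.5, 3.7, 6.1, 1.6, 0.3],
    [14.2, 4.1, 7.9, 2.1, 0.1],
    [11.0, 2.5, 4.0, 1.2, 0.2],
])

def efficiency(o: int) -> float:
    """Solve: min theta  s.t.  X lambda <= theta x_o,  Y lambda >= y_o,  lambda >= 0."""
    n = inputs.shape[0]
    c = np.zeros(n + 1)                       # decision vars: [theta, lambda_1..lambda_n]
    c[0] = 1.0                                # minimise theta
    # input constraints:  sum_j lambda_j x_j - theta * x_o <= 0
    A_in = np.hstack([-inputs[[o]].T, inputs.T])
    b_in = np.zeros(inputs.shape[1])
    # output constraints: -sum_j lambda_j y_j <= -y_o
    A_out = np.hstack([np.zeros((outputs.shape[1], 1)), -outputs.T])
    b_out = -outputs[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]), method="highs")
    return res.x[0]

for o in range(inputs.shape[0]):
    print(f"player {o}: efficiency = {efficiency(o):.3f}")
```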
Abstract:
Immigration has played an important role in the historical development of Australia. Thus, it is no surprise that a large body of empirical work has developed which focuses upon how migrants fare in the land of opportunity. Much of the literature is comparatively recent, i.e. from the last ten years or so, encouraged by the advent of publicly available Australian cross-section micro data. Several different aspects of migrant welfare have been addressed, with major emphasis being placed upon earnings and unemployment experience. For recent examples see Haig (1980), Stromback (1984), Chiswick and Miller (1985), Tran-Nam and Nevile (1988) and Beggs and Chapman (1988). The present paper contributes to the literature by providing additional empirical evidence on the native/migrant earnings differential. The data utilised are from the rather neglected Australian Bureau of Statistics (ABS) Special Supplementary Survey No. 4, 1982, otherwise known as the Family Survey. The paper also examines the importance of distinguishing between the wage and salary sector and the self-employment sector when discussing native/migrant differentials. Separate earnings equations for the two labour market groups are estimated and the native/migrant earnings differential is broken down by employment status. This is a novel application in the Australian context and provides some insight into the earnings of the self-employed, a group that despite its size (around 20 per cent of the labour force) is frequently ignored by economic research. Most previous empirical research fails to examine the effect of employment status on earnings. Stromback (1984) includes a dummy variable representing self-employment status in an earnings equation estimated over a pooled sample of paid and self-employed workers. The variable is found to be highly significant, which leads Stromback to question the efficacy of including the self-employed in the estimation sample. The suggestion is that, because part of self-employed earnings represents a return to non-human capital investment (i.e. investments in machinery, buildings, etc.), the structural determinants of earnings differ significantly from those for paid employees. Tran-Nam and Nevile (1988) deal with differences between paid employees and the self-employed by deleting the latter from their sample. However, deleting the self-employed from the estimation sample may lead to bias in the OLS estimation method (see Heckman 1979). The desirable properties of OLS depend upon estimation on a random sample. Thus, the Tran-Nam and Nevile results are likely to suffer from bias unless individuals are randomly allocated between self-employment and paid employment. The current analysis extends Tran-Nam and Nevile (1988) by explicitly treating the choice of paid employment versus self-employment as endogenously determined. This allows an explicit test of the appropriateness of deleting self-employed workers from the sample. Earnings equations that are corrected for sample selection are estimated for both natives and migrants in the paid-employee sector. The Heckman (1979) two-step estimator is employed. The paper is divided into five major sections. The next section presents the econometric model, incorporating the specification of the earnings generating process together with an explicit model determining an individual's employment status. In Section III the data are described. Section IV draws together the main econometric results of the paper.
First, the probit estimates of the labour market status equation are documented. This is followed by presentation and discussion of the Heckman two-step estimates of the earnings specification for both native and migrant Australians. Separate earnings equations are estimated for paid employees and the self-employed. Section V documents estimates of the native/migrant earnings differential for both categories of employees. To aid comparison with earlier work, the Oaxaca decomposition of the earnings differential for paid employees is carried out for both the simple OLS regression results and the parameter estimates corrected for sample selection effects. These differentials are interpreted and compared with previous Australian findings. A short section concludes the paper.
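A minimal sketch of the Heckman (1979) two-step procedure referred to above, using hypothetical column names rather than the Family Survey variables:

```python
# Heckman two-step sketch: step 1 is a probit selection equation on the
# full sample; step 2 is OLS on the selected subsample augmented with
# the inverse Mills ratio. Column names are placeholders, not the
# paper's actual data.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

def heckman_two_step(df, y_col, x_cols, select_col, z_cols):
    # Step 1: probit for selection (e.g. paid employment vs. self-employment).
    Z = sm.add_constant(df[z_cols])
    probit = sm.Probit(df[select_col], Z).fit(disp=0)
    xb = Z.dot(probit.params)                      # latent index z'gamma_hat
    imr = pd.Series(norm.pdf(xb) / norm.cdf(xb), index=df.index)
    # Step 2: earnings equation on the selected subsample, with the
    # inverse Mills ratio as an extra regressor to correct for selection.
    sel = df[select_col] == 1
    X = sm.add_constant(df.loc[sel, x_cols])
    X["imr"] = imr[sel]
    return sm.OLS(df.loc[sel, y_col], X).fit()
```

For identification, the selection equation should contain at least one variable in z_cols that is excluded from x_cols; a significant coefficient on the inverse Mills ratio is evidence that deleting the self-employed would indeed bias the OLS estimates.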
Abstract:
The success of contemporary organizations depends on their ability to make appropriate decisions. Making appropriate decisions is inevitably bound to the availability and provision of relevant information. Information systems should be able to provide such information in an efficient way. Thus, within information systems development, a detailed analysis of information supply and information demands has to prevail. Based on Szyperski's information set and subset model, we give an epistemological foundation of information modeling in general and show why conceptual modeling in particular is capable of specifying effective and efficient information systems. Furthermore, we derive conceptual modeling requirements based on our findings. A short example illustrates the usefulness of a conceptual data modeling technique for the specification of information systems.
Abstract:
Supply chain management and customer relationship management are concepts for optimizing the provision of goods to customers. Information sharing and information estimation are key tools used to implement these two concepts. The reduction of delivery times and stock levels can be seen as the main managerial objectives of an integrative supply chain and customer relationship management. To achieve this objective, business processes need to be integrated along the entire supply chain, including the end consumer. Information systems form the backbone of any business process integration. The relevant information system architectures are generally well understood, but the conceptual specification of information systems for business process integration from a management perspective remains an open methodological problem. To address this problem, we show how customer relationship management and supply chain management information can be integrated at the conceptual level in order to provide supply chain managers with relevant information. We further outline how the conceptual management perspective of business process integration can be supported by deriving specifications for the enabling information systems from business objectives.
Abstract:
Data warehouse projects today are in an ambivalent situation. On the one hand, data warehouses are critical for a company's success, and sophisticated methodological and technological tools are available to implement them. On the other hand, a significant share of data warehouse projects fail for non-technical reasons such as insufficient management support or uncooperative employees. But management support and user participation can be increased dramatically with specification methods that are understandable to these user groups. This paper aims at overcoming possible non-technical failure reasons by introducing a user-adequate specification approach within the field of management information systems.
Abstract:
Teachers of construction economics and estimating have long recognised that there is more to construction pricing than the detailed calculation of costs (to the contractor). We always get to the point where we have to say "of course, experience or familiarity with the market is very important and this needs judgement, intuition, etc.". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm would be done, but if it is really important then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are cost modelling and cost planning - neither of which pays any significant attention to the market effect. There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known. This may be called the pragmatic school. The second school exists solely to criticise the first school. We will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose 1974 paper is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and has been since, that the uncertainty surrounding the contractors' costing and cost estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner. A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974, evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors for relatively very accurate estimates. Only the type and size of the building, it seemed, was really relevant in determining accuracy. More detailed information about the buildings' specification, and even sight of the drawings, did not significantly improve their accuracy. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect.
The problem with this is that our experts do not seem able to verbalise their requirements in this respect - a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit. Hence the need for this study. It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing. They were also very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held very close views to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal is of an anecdotal nature, we feel that our findings enable at least the basic nature of the subject to be understood and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.
Abstract:
The marsupial genus Macropus includes three subgenera: the familiar large grazing kangaroos and wallaroos of M. (Macropus) and M. (Osphranter), and the smaller mixed grazing/browsing wallabies of M. (Notamacropus). A recent study of five concatenated nuclear genes recommended subsuming the predominantly browsing Wallabia bicolor (swamp wallaby) into Macropus. To examine this proposal further we sequenced partial mitochondrial genomes for kangaroos and wallabies. These sequences strongly favour the morphological placement of W. bicolor as sister to Macropus, although they place M. irma (black-gloved wallaby) within M. (Osphranter) rather than, as expected, with M. (Notamacropus). Species tree estimation from separately analysed mitochondrial and nuclear genes favours retaining Macropus and Wallabia as separate genera. A simulation study finds that incomplete lineage sorting among nuclear genes is a plausible explanation for incongruence with the mitochondrial placement of W. bicolor, while mitochondrial introgression from a wallaroo into M. irma is the deepest such event identified in marsupials. Similar coalescent simulations for interpreting gene tree conflicts will increase in both relevance and statistical power as species-level phylogenetics enters the genomic age. Ecological considerations, in turn, hint at a role for selection in accelerating the fixation of introgressed or incompletely sorted loci. More generally, the inclusion of the mitochondrial sequences substantially enhanced phylogenetic resolution. However, we caution that the evolutionary dynamics that enhance mitochondria as speciation indicators in the presence of incomplete lineage sorting may also render them especially susceptible to introgression.
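The expected amount of gene-tree conflict under incomplete lineage sorting can be illustrated with the standard three-taxon coalescent result (a textbook formula, not the paper's simulation pipeline): for an internal species-tree branch of length t in coalescent units, each of the two discordant gene-tree topologies arises with probability (1/3)e^(-t).

```python
# Standard 3-taxon coalescent expectation for gene-tree topologies given
# an internal branch of length t (in coalescent units). Short branches
# yield frequent discordance, i.e. rampant incomplete lineage sorting.
import math

def gene_tree_probs(t: float):
    discordant = math.exp(-t) / 3.0          # each of the two discordant trees
    concordant = 1.0 - 2.0 * discordant      # tree matching the species tree
    return concordant, discordant

for t in (0.1, 0.5, 2.0):
    c, d = gene_tree_probs(t)
    print(f"t={t}: concordant={c:.3f}, each discordant={d:.3f}")
```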
Four new avian mitochondrial genomes help get to basic evolutionary questions in the Late Cretaceous
Abstract:
Good phylogenetic trees are required to test hypotheses about evolutionary processes. We report four new avian mitochondrial genomes, which together with an improved method of phylogenetic analysis for vertebrate mt genomes give results for three questions in avian evolution. The new mt genomes are: magpie goose (Anseranas semipalmata); an owl (morepork, Ninox novaeseelandiae); a basal passerine (rifleman, or New Zealand wren, Acanthisitta chloris); and a parrot (kakapo or owl-parrot, Strigops habroptilus). The magpie goose provides an important new calibration point for avian evolution because the well-studied Presbyornis fossils are on the lineage to ducks and geese, after the separation of the magpie goose. We find, as with other animal mitochondrial genomes, that RY-coding is helpful in adjusting for biases between pyrimidines and between purines. When RY-coding is used at third positions of codons, the root occurs between paleognath and neognath birds (as expected from morphological and nuclear data). In addition, passerines form a relatively old group within Neoaves, and many modern avian lineages diverged during the Cretaceous. Although many aspects of the avian tree are stable, additional taxon sampling is required.
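RY-coding itself is mechanical: purines (A, G) are recoded as R and pyrimidines (C, T) as Y, here at third codon positions. A small illustrative helper (not the authors' code) is:

```python
# Recode third codon positions of an aligned coding sequence: A/G -> R,
# C/T -> Y. This removes within-purine and within-pyrimidine differences,
# reducing compositional bias at the fastest-evolving sites.
PURINES, PYRIMIDINES = set("AG"), set("CT")

def ry_code_third_positions(seq: str) -> str:
    out = []
    for i, base in enumerate(seq.upper()):
        if i % 3 == 2:  # third codon position (0-based index)
            if base in PURINES:
                base = "R"
            elif base in PYRIMIDINES:
                base = "Y"
        out.append(base)
    return "".join(out)

print(ry_code_third_positions("ATGACCCTA"))  # -> "ATRACYCTR"
```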
Abstract:
Periodontitis results from the destructive inflammatory reaction of the host, elicited by a bacterial biofilm adhering to the tooth surface, and if left untreated may lead to the loss of the teeth and the surrounding tissues, including the alveolar bone. Cementum is a specialized calcified tissue covering the tooth root and an essential part of the periodontium, which enables the attachment of the periodontal ligament to the root and the surrounding alveolar bone. Periodontal ligament cells (PDLCs) represent a promising cell source for periodontal tissue engineering. Since cementogenesis is the critical event for the regeneration of periodontal tissues, this study examined whether inorganic stimuli derived from bioactive bredigite (Ca7MgSi4O16) bioceramics could stimulate the proliferation and cementogenic differentiation of PDLCs, and further investigated the involvement of the Wnt/β-catenin signalling pathway during this process by analysing the gene/protein expression of PDLCs exposed to bredigite extracts. Our results showed that the ionic products of bredigite powder extracts led to significantly enhanced proliferation and cementogenic differentiation of PDLCs, including mineralization nodule formation, ALP activity and the expression of a series of bone/cementum-related genes/proteins (ALP, OPN, OCN, BSP, CAP and CEMP1), in a concentration-dependent manner. Furthermore, the addition of cardamonin, a Wnt/β-catenin signalling inhibitor, reduced the pro-cementogenesis effect of the bredigite extracts, indicating the involvement of the Wnt/β-catenin signalling pathway in the cementogenesis of PDLCs induced by bredigite extracts. The present study suggests that an entirely inorganic stimulus with the specific composition of bredigite bioceramics possesses the capacity to trigger activation of the Wnt/β-catenin signalling pathway, leading to stimulated differentiation of PDLCs toward a cementogenic lineage. The results indicate the therapeutic potential of bredigite ceramics in periodontal tissue engineering applications.