557 results for Critical Sequence


Relevance:

20.00%

Publisher:

Abstract:

One of the promises of New Labour was that government policy would be grounded in 'evidence based research'. In recent years, some academics have come to question whether the government has delivered on this promise. Professors Reece Walters and Tim Hope offer two contributions to this debate, arguing that rather than the 'evidence base', it is political considerations that govern the commissioning, production and dissemination of Home Office research. As the first monograph in our 'Evidence based policy series', Critical thinking about the uses of research carries a thought-provoking set of arguments.

Relevance:

20.00%

Publisher:

Abstract:

This work offers a critical introduction to sociology for New Zealand students. Written in an accessible narrative style, it seeks to challenge and debunk students' assumptions about key elements of their social worlds, encouraging them to develop a "critical imagination" as a tool to identify broader social themes in personal issues.

Relevance:

20.00%

Publisher:

Abstract:

Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design.

Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables.

We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code. The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs.

Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
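The abstract does not reproduce the metric definitions themselves. As a purely illustrative sketch of what a design-level, encapsulation-based security measure can look like, consider the following Python fragment; the class model and the classified_attribute_exposure measure are our own assumptions for illustration, not the metrics defined in the work.

    # Hypothetical sketch: a toy design-level security measure in the spirit of
    # encapsulation-based metrics (not the metrics defined in the thesis).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Attribute:
        name: str
        classified: bool   # holds high-security data?
        public: bool       # accessible from outside the class?

    @dataclass
    class ClassDesign:
        name: str
        attributes: List[Attribute] = field(default_factory=list)

    def classified_attribute_exposure(design: ClassDesign) -> float:
        """Fraction of classified attributes that are publicly accessible.

        Lower is better: 0.0 means every classified attribute is encapsulated,
        1.0 means every classified attribute is exposed to other modules.
        """
        classified = [a for a in design.attributes if a.classified]
        if not classified:
            return 0.0
        exposed = [a for a in classified if a.public]
        return len(exposed) / len(classified)

    # Comparing two functionally equivalent designs of the same class.
    open_design = ClassDesign("Account", [
        Attribute("pin", classified=True, public=True),
        Attribute("balance", classified=True, public=True),
        Attribute("branch", classified=False, public=True),
    ])
    encapsulated_design = ClassDesign("Account", [
        Attribute("pin", classified=True, public=False),
        Attribute("balance", classified=True, public=False),
        Attribute("branch", classified=False, public=True),
    ])

    print(classified_attribute_exposure(open_design))          # 1.0
    print(classified_attribute_exposure(encapsulated_design))  # 0.0

A ratio of this kind can be computed from design artifacts alone and compared between two revisions of the same system, which is the style of comparison the abstract describes.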

Relevance:

20.00%

Publisher:

Abstract:

The concept of local accumulation time (LAT) was introduced by Berezhkovskii and coworkers in 2010–2011 to give a finite measure of the time required for the transient solution of a reaction–diffusion equation to approach the steady-state solution (Biophys J. 99, L59 (2010); Phys Rev E. 83, 051906 (2011)). Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb in 1991 (IMA J Appl Math. 47, 193 (1991)). Although McNabb’s initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection–diffusion–reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT, by directly linking the stochastic microscopic processes to a meaningful macroscopic timescale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (PDE). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using the MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing PDE directly.
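For readers unfamiliar with the quantity, the mean action time has a standard integral definition (following McNabb 1991 and the local accumulation time construction of Berezhkovskii and coworkers); the notation c_0 and c_inf for the initial and steady-state solutions below is ours, not taken from the abstract. Writing

    F(x,t) = 1 - \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)},

which rises from 0 at t = 0 towards 1 as the steady state is approached and is treated as a cumulative distribution function in time, the mean action time is its mean,

    T(x) = \int_0^\infty t \, \frac{\partial F}{\partial t} \, dt
         = \int_0^\infty \frac{c(x,t) - c_\infty(x)}{c_0(x) - c_\infty(x)} \, dt,

where the second form follows by integration by parts, assuming the transient decays quickly enough for the boundary term to vanish. The second form also makes clear why T(x) can often be evaluated without solving the full time-dependent problem.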

Relevance:

20.00%

Publisher:

Abstract:

It appears that few of the students holding ‘socially idealistic’ goals upon entering law school actually maintain these upon graduation. The critical legal narrative, which explains and seeks to act upon this shift in the graduate’s ‘legal identity’, posits that these ideals are repressed through power relations that create passive receptacles into which professional ideologies can be deposited, in the interests of those advantaged by the social and legal status quo. Using the work of Michel Foucault, this paper unpacks the assumptions underpinning this narrative, particularly its arguments about ideology, power, and the subject. In doing so, it will argue this narrative provides an untenable basis for political action within legal education. By interrogating this narrative, this paper provides a new way of understanding the construction of the legal identity through legal education, and a new basis for political action within law school.

Relevance:

20.00%

Publisher:

Abstract:

This article explores power within legal education scholarship. It suggests that power relations are not effectively reflected on within this scholarship, and it provokes legal educators to consider power more explicitly and effectively. It then outlines in depth a conceptual and methodological approach based on Michel Foucault’s concept of ‘governmentality’ to assist in such an analysis. By detailing the conceptual moves required in order to research power in legal education more effectively, this article seeks to stimulate new reflection and thought about the practice and scholarship of legal education, and allow political interventions to become more ethically sensitive and potentially more effective.

Relevance:

20.00%

Publisher:

Abstract:

Those working in the critical criminology tradition have been centrally concerned with the social construction, variability and contingency of the criminal label. The concern is no less salient to a consideration of critical criminology itself, and any history of critical criminology (in Australia or elsewhere) should itself aim to be critical in this sense. The point applies with equal force to both of the terms ‘critical’ and ‘criminology’. The want of a stable theoretical object has meant that criminology itself needs to be seen not as a distinct discipline but as a composite intellectual and governmental hybrid, a field of studies that overlaps and intersects many others (sociology, law, psychology, history, anthropology, social work, media studies and youth studies to name only a few). In consequence, much of the most powerful work on subjects of criminological inquiry is undertaken by scholars who do not necessarily define themselves as criminologists first and foremost, or at all. For reasons that should later become obvious, this is even more pronounced in the Australian context. Although we may appear at times to be claiming such work for criminology, our purpose is to recognize its impact on and in critical criminology in Australia.

Relevance:

20.00%

Publisher:

Abstract:

In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
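The terms "autocorrelated" and "uncorrelated" rate change describe how branch rates are drawn, and the distinction is easy to see in a toy simulation. The following Python sketch is ours: the five-branch tree, the root rate and the variance parameter nu are invented for illustration (the study itself used a nine-taxon tree), and the autocorrelated model below omits the mean-correction term used in full implementations.

    # Toy illustration of autocorrelated vs. uncorrelated branch-rate models
    # (invented tree and parameters; not the authors' simulation code).
    import numpy as np

    rng = np.random.default_rng(1)

    # (parent_branch_index, branch_duration); index -1 denotes the root.
    branches = [(-1, 10.0), (0, 5.0), (0, 5.0), (1, 3.0), (1, 3.0)]
    root_rate = 0.01   # substitutions per site per time unit (assumed)

    def autocorrelated_lognormal_rates(nu=0.05):
        """Log-rate drifts along the tree: each branch's log-rate is normal
        around its parent's, with variance nu * branch duration."""
        rates = []
        for parent, duration in branches:
            parent_rate = root_rate if parent == -1 else rates[parent]
            log_rate = rng.normal(np.log(parent_rate), np.sqrt(nu * duration))
            rates.append(np.exp(log_rate))
        return rates

    def uncorrelated_exponential_rates():
        """Each branch draws its rate independently from an exponential
        distribution whose mean is the root rate."""
        return [rng.exponential(root_rate) for _ in branches]

    print(autocorrelated_lognormal_rates())
    print(uncorrelated_exponential_rates())

Under the autocorrelated model, neighbouring branches tend to have similar rates because each rate is centred on its parent's; under the uncorrelated model, every branch rate is an independent draw.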

Relevance:

20.00%

Publisher:

Abstract:

The estimation of phylogenetic divergence times from sequence data is an important component of many molecular evolutionary studies. There is now a general appreciation that the procedure of divergence dating is considerably more complex than that initially described in the 1960s by Zuckerkandl and Pauling (1962, 1965). In particular, there has been much critical attention toward the assumption of a global molecular clock, resulting in the development of increasingly sophisticated techniques for inferring divergence times from sequence data. In response to the documentation of widespread departures from clocklike behavior, a variety of local- and relaxed-clock methods have been proposed and implemented. Local-clock methods permit different molecular clocks in different parts of the phylogenetic tree, thereby retaining the advantages of the classical molecular clock while casting off the restrictive assumption of a single, global rate of substitution (Rambaut and Bromham 1998; Yoder and Yang 2000).

Relevance:

20.00%

Publisher:

Abstract:

Ratites are large, flightless birds and include the ostrich, rheas, kiwi, emu, and cassowaries, along with extinct members, such as moa and elephant birds. Previous phylogenetic analyses of complete mitochondrial genome sequences have reinforced the traditional belief that ratites are monophyletic and tinamous are their sister group. However, in these studies ratite monophyly was enforced in the analyses that modeled rate heterogeneity among variable sites. Relaxing this topological constraint results in strong support for the tinamous (which fly) nesting within ratites. Furthermore, upon reducing base compositional bias and partitioning models of sequence evolution among protein codon positions and RNA structures, the tinamou–moa clade grouped with kiwi, emu, and cassowaries to the exclusion of the successively more divergent rheas and ostrich. These relationships are consistent with recent results from a large nuclear data set, whereas our strongly supported finding of a tinamou–moa grouping further resolves palaeognath phylogeny. We infer flight to have been lost among ratites multiple times in temporally close association with the Cretaceous–Tertiary extinction event. This circumvents requirements for transient microcontinents and island chains to explain discordance between ratite phylogeny and patterns of continental breakup. Ostriches may have dispersed to Africa from Eurasia, putting in question the status of ratites as an iconic Gondwanan relict taxon. [Base composition; flightless; Gondwana; mitochondrial genome; Palaeognathae; phylogeny; ratites.]

Relevance:

20.00%

Publisher:

Abstract:

Aim: The purpose of this study was to examine the relationship between registered nurses’ (RN) job satisfaction and their intention to leave critical care nursing in Saudi Arabia.

Background: Many studies have identified critical care areas as stressful work environments for nurses and have identified factors contributing to job satisfaction and staff retention. However, very little research has examined these relationships in the Saudi context.

Design and Methods: This study utilised an exploratory, cross-sectional survey design to examine the relationship between RN job satisfaction and intention to leave at King Abdul-Aziz University Hospital, Saudi Arabia. Respondents completed a self-administered survey including demographic items and validated measures of job satisfaction and intention to leave. A convenience sample of 182 RNs working in critical care areas during the data collection period was included.

Results: Regression analysis predicting RN intention to leave found that demographic variables including age, parental status and length of ICU experience, and three of the job satisfaction subscales, namely perceived workload, professional support, and pay and prospects for promotion, were significantly associated with the outcome variable.

Conclusion: This study adds to the existing literature on the relationship between job satisfaction and intention to leave critical care areas among RNs working in Saudi Arabia. These findings point to the need for management and policy interventions targeting nurses’ workloads, professional support and pay and promotion in order to improve nurse retention.

Relevance:

20.00%

Publisher:

Abstract:

Webb et al. (2009) described a late Pleistocene coral sample wherein the diagenetic stabilization of original coral aragonite to meteoric calcite was halted more or less mid-way through the process, allowing direct comparison of pre-diagenetic and post-diagenetic microstructure and trace element distributions. Those authors found that the rare earth elements (REEs) were relatively stable during meteoric diagenesis, unlike divalent cations such as Sr, and it was thus concluded that original, in this case marine, REE distributions potentially could be preserved through the meteoric carbonate stabilization process that must have affected many, if not most, ancient limestones. Although this was not the case in the analysed sample, they noted that where such diagenesis took place in laterally transported groundwater, trace elements derived from that groundwater could be incorporated into diagenetic calcite, thus altering the initial REE distribution (Banner et al., 1988). Hence, the paper was concerned with the diagenetic behaviour of REEs in a groundwater-dominated karst system. The comment offered by Johannesson (2011) does not question those research results, but rather seeks to clarify an interpretation made by Webb et al. (2009) of an earlier paper, Johannesson et al. (2006).

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To provide an overview and a critical appraisal of systematic reviews (SRs) of published interventions for the prevention/management of radiation dermatitis.

Methods and Materials: We searched Medline, CINAHL, Embase, and the Cochrane Library. We also manually searched the individual reference lists of potentially eligible articles and a number of key journals in the topic area. Two authors screened all potential articles and included eligible SRs. Two authors critically appraised and extracted key findings from the included reviews using AMSTAR (the measurement tool for “assessment of multiple systematic reviews”).

Results: Of 1837 potential titles, 6 SRs were included. A number of interventions have been reported to be potentially beneficial for managing radiation dermatitis. Interventions evaluated in these reviews included skin care advice, steroidal/nonsteroidal topical agents, systemic therapies, modes of radiation delivery, and dressings. However, all the included SRs reported that there is insufficient evidence to support any single effective intervention. The methodological quality of the included studies varied, and methodological shortfalls in these reviews might bias the overall results or recommendations for clinical practice.

Conclusions: An up-to-date, high-quality SR of the prevention/management of radiation dermatitis is needed to guide practice and direct future research. We recommend that clinicians or guideline developers critically evaluate the information in SRs in their decision making.

Relevance:

20.00%

Publisher:

Abstract:

We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
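The error interrogation operators and error commutators themselves are not defined in this abstract, so the following toy sketch is not the ECF; it only illustrates, for a single spin-1/2 and one radiofrequency pulse, the kind of analytic commutator expression that captures the sensitivity of the density matrix to a pulse-parameter error, and verifies it against the finite-difference calculation such a formalism avoids. The operators Iy and Iz and the flip angle theta are standard spin-1/2 quantities, but the example itself is ours.

    # Toy illustration (not the Error Commutator Formalism itself): for a
    # single spin-1/2 and one y-pulse of flip angle theta, the sensitivity of
    # the density matrix to a flip-angle error is an exact commutator,
    # d(rho)/d(theta) = -i [Iy, rho(theta)], checked here against a
    # central finite difference.
    import numpy as np
    from scipy.linalg import expm

    Iy = 0.5 * np.array([[0, -1j], [1j, 0]])
    Iz = 0.5 * np.array([[1, 0], [0, -1]])

    def rho_after_pulse(theta, rho0=Iz):
        """Density matrix after an ideal y-pulse of flip angle theta (rad)."""
        R = expm(-1j * theta * Iy)
        return R @ rho0 @ R.conj().T

    theta = np.pi / 2                    # nominal 90-degree pulse
    rho = rho_after_pulse(theta)

    analytic = -1j * (Iy @ rho - rho @ Iy)        # -i [Iy, rho(theta)]
    h = 1e-6
    finite_diff = (rho_after_pulse(theta + h) - rho_after_pulse(theta - h)) / (2 * h)

    print(np.allclose(analytic, finite_diff, atol=1e-8))   # True

For longer pulse sequences the analytic derivative is no longer a single textbook commutator, which is where a systematic bookkeeping of error commutators, as described in the abstract, becomes useful.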