61 results for Effects-Based Approach to Operations
Abstract:
This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge, but that knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). 
Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, itself underpinned by an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of 'knowing'.
Abstract:
In this study, we have compared the effector functions and fate of a number of human CTL clones in vitro or ex vivo following contact with variant peptides presented either on the cell surface or in a soluble multimeric format. In the presence of CD8 coreceptor binding, there is a good correlation between TCR signaling, killing of the targets, and FasL-mediated CTL apoptosis. Blocking CD8 binding using alpha3 domain mutants of MHC class I results in much reduced signaling and reduced killing of the targets. Surprisingly, however, FasL expression is induced to a similar degree on these CTLs, and apoptosis of CTL is unaffected. The ability to divorce these events may allow the deletion of antigen-specific and pathological CTL populations without the deleterious effects induced by full CTL activation.
Abstract:
Clinical trials showing the benefits of reducing the effects of TNF-alpha in rheumatoid arthritis have highlighted the key role of this cytokine in the inflammatory condition. A new approach to reducing the effects of TNF-alpha is to decrease its synthesis by inhibiting TNF-alpha converting enzyme (TACE) with GW3333. In rat models of arthritis, GW3333 has some beneficial effects. Further longer-term studies of GW3333 in animal models are required to determine whether its benefit is maintained. TACE inhibition may represent a new approach to treating inflammation.
Abstract:
In microarray studies, clustering techniques are often applied to derive meaningful insights from the data. In the past, hierarchical methods have been the primary clustering tool employed for this task. The hierarchical algorithms have been applied mainly heuristically to these cluster analysis problems. Further, a major limitation of these methods is their inability to determine the number of clusters. Thus there is a need for a model-based approach to these clustering problems. To this end, McLachlan et al. [7] developed a mixture model-based algorithm (EMMIX-GENE) for the clustering of tissue samples. To further investigate the EMMIX-GENE procedure as a model-based approach, we present a case study involving the application of EMMIX-GENE to the breast cancer data as studied recently in van 't Veer et al. [10]. Our analysis considers the problem of clustering the tissue samples on the basis of the genes, which is a non-standard problem because the number of genes greatly exceeds the number of tissue samples. We demonstrate how EMMIX-GENE can be useful in reducing the initial set of genes down to a more computationally manageable size. The results from this analysis also emphasise the difficulty associated with the task of separating two tissue groups on the basis of a particular subset of genes. These results also shed light on why supervised methods have such a high misallocation error rate for the breast cancer data.
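The gene-reduction workflow the abstract describes can be sketched in a few lines. This is a deliberately simplified stand-in, not EMMIX-GENE itself: genes are screened here by a crude variance ranking rather than the per-gene mixture fits of McLachlan et al., and the tissue samples are then clustered with a minimal 2-means pass; all names and thresholds are illustrative.

```python
import numpy as np

def screen_genes(X, keep=50):
    """Keep the `keep` highest-variance genes (columns of the
    samples x genes matrix X). A crude stand-in for EMMIX-GENE's
    mixture-based gene screening step."""
    idx = np.argsort(X.var(axis=0))[::-1][:keep]
    return X[:, idx], idx

def two_means(X, iters=100):
    """Minimal 2-means clustering of the rows of X, initialised with
    the first sample and the sample farthest from it."""
    d0 = ((X - X[0])**2).sum(axis=1)
    centres = np.stack([X[0], X[d0.argmax()]])
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = ((X[:, None, :] - centres[None, :, :])**2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        new = np.array([X[labels == k].mean(axis=0) for k in (0, 1)])
        if np.allclose(new, centres):
            break
        centres = new
    return labels
```

On synthetic data with two tissue groups that differ on a small subset of genes, screening first makes the clustering both cheaper and easier, mirroring the reduction step reported for the breast cancer data.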
Abstract:
Subcycling, or the use of different timesteps at different nodes, can be an effective way of improving the computational efficiency of explicit transient dynamic structural solutions. The method that has been most widely adopted uses a nodal partition extending the central difference method, in which small-timestep updates are performed by interpolating on the displacement at neighbouring large-timestep nodes. This approach leads to narrow bands of unstable timesteps, or statistical stability. It can also be in error due to lack of momentum conservation on the timestep interface. The author has previously proposed energy-conserving algorithms that avoid the first problem of statistical stability. However, these sacrifice accuracy to achieve stability. An approach to conserving momentum on an element interface by adding partial velocities is considered here. Applied to extend the central difference method, this approach is simple and has accuracy advantages. The method can be programmed by summing impulses of internal forces, evaluated using local element timesteps, in order to predict a velocity change at a node. However, it is still only statistically stable, so an adaptive timestep size is needed so that accuracy can be monitored and the timestep adjusted if necessary. By replacing the central difference method with the explicit generalized alpha method, it is possible to gain stability by dissipating the high-frequency response that leads to stability problems. However, coding the algorithm is less elegant, as the response depends on previous partial accelerations. Extension to implicit integration is shown to be impractical due to the neglect of remote effects of internal forces acting across a timestep interface. (C) 2002 Elsevier Science B.V. All rights reserved.
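A toy version of displacement-interpolating nodal subcycling can be sketched for a two-mass spring chain: the stiff node takes several small central-difference (symplectic-Euler) substeps per major step, linearly interpolating the neighbouring node's displacement across the interface. The system, timestep ratio and interpolation scheme below are illustrative choices in the spirit of the nodal-partition methods discussed, not the author's algorithm.

```python
import numpy as np

def fine_step(x, v, h, m, k):
    """Reference update: both nodes advanced together at the small
    step h (symplectic-Euler form of the central difference method)."""
    kg, kc = k  # ground spring (stiff), coupling spring
    f = np.array([-kg * x[0] - kc * (x[0] - x[1]), -kc * (x[1] - x[0])])
    v = v + h * f / m
    return x + h * v, v

def subcycled_step(x, v, dt, m, k, sub=4):
    """One major step dt: node 1 (soft) takes a single step, node 0
    (stiff) takes `sub` substeps of dt/sub while its neighbour's
    displacement is linearly interpolated across the major step."""
    x0, x1 = x
    v0, v1 = v
    m0, m1 = m
    kg, kc = k
    # soft node: one large step using the coupling force at step start
    v1n = v1 + dt * (-kc * (x1 - x0)) / m1
    x1n = x1 + dt * v1n
    # stiff node: subcycle, interpolating the neighbour displacement
    h = dt / sub
    for s in range(sub):
        x1i = x1 + (s + 0.5) / sub * (x1n - x1)  # midpoint interpolation
        f0 = -kg * x0 - kc * (x0 - x1i)
        v0 = v0 + h * f0 / m0
        x0 = x0 + h * v0
    return np.array([x0, x1n]), np.array([v0, v1n])
```

A useful check is to integrate the same chain with a uniformly fine-stepped reference and confirm the subcycled trajectory stays close while doing far fewer updates at the soft node.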
Abstract:
In this paper we investigate the concepts of 'face' and 'politeness'. We introduce a metalanguage that we believe provides a framework for simplifying the analysis of 'face' and 'politeness'. This metalanguage is based on the observation that both 'face' and 'politeness' involve external evaluations of people. This common element is represented in the metalanguage as what A shows A thinks of B, and what B thinks A thinks of B. The implications of the metalanguage for the analysis of Chinese mian and lian ('face') and English face are then discussed. This is followed by an analysis of examples of politeness in English and teineisa ('politeness') in Japanese. We conclude that the metalanguage may be further developed for use in comparisons of 'face' and 'politeness' across cultures. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
In this paper, a stress and coping perspective is used to outline the processes that determine employee adaptation to organisational change. A theoretical framework that simultaneously considers the effects of event characteristics, situational appraisals, coping strategies, and coping resources is reviewed. Three empirical investigations of organisational change that have tested various components of the model are then presented. In the first study, there was evidence linking event characteristics, situational appraisals, coping strategies and coping resources to levels of employee adjustment in a sample of pilots employed in a newly merged airline company. In a more focused test of the model, with a sample of employees experiencing a restructuring process in their organisation, it was found that the provision of change-related information enhanced levels of efficacy to deal with the change process which, in turn, predicted psychological wellbeing, client engagement, and job satisfaction. In a study of managers affected by a new remuneration scheme, there was evidence to suggest that managers who received change-specific information and opportunities to participate in the change process reported higher levels of change readiness. Managers who reported higher levels of readiness for change also reported higher levels of psychological wellbeing and job satisfaction. These studies highlight ways in which managers and change agents can help employees to cope during times of organisational change.
Abstract:
Commonly recommended plant sources of provitamin A, such as dark green leafy vegetables, are not acceptable in many population groups. The objective of this study was to identify other indigenous foods that may be effectively promoted to alleviate vitamin A deficiency (VAD), and to gather information relevant to the identification, production, acquisition, and consumption of foods relevant to a food-based VAD prevention strategy in the Federated States of Micronesia. An ethnographic study on edible pandanus cultivars, involving key informant interviews and observation, was carried out. Analyses revealed a great range in carotenoid content. Several orange-coloured pandanus cultivars, all highly acceptable, contained high levels of carotenoid, almost meeting daily requirements in usual consumption patterns, whereas light yellow-coloured cultivars contained low levels. Availability has decreased substantially in recent years due to increased consumption of imported foods and general neglect of indigenous foods. High-carotenoid pandanus should be promoted for general enjoyment and health benefits.
Abstract:
We consider the problem of assessing the number of clusters in a limited number of tissue samples containing gene expressions for possibly several thousands of genes. It is proposed to use a normal mixture model-based approach to the clustering of the tissue samples. One advantage of this approach is that the question of the number of clusters in the data can be formulated in terms of a test on the smallest number of components in the mixture model compatible with the data. This test can be carried out on the basis of the likelihood ratio test statistic, using resampling to assess its null distribution. The effectiveness of this approach is demonstrated on simulated data and on some microarray datasets, as considered previously in the bioinformatics literature. (C) 2004 Elsevier Inc. All rights reserved.
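The resampling test described above can be illustrated end to end for the simplest case, one univariate normal component versus two: fit both models, form the -2 log likelihood-ratio statistic, and simulate its null distribution by refitting on parametric-bootstrap samples drawn from the fitted single-component model. This is a minimal univariate sketch of the procedure, not the authors' multivariate implementation.

```python
import numpy as np

def loglik_one(x):
    """ML fit and log-likelihood of a single normal component."""
    mu, sigma = x.mean(), max(float(x.std()), 1e-6)
    ll = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                - (x - mu)**2 / (2 * sigma**2))
    return mu, sigma, ll

def loglik_two(x, iters=50):
    """EM fit of a two-component univariate normal mixture;
    returns the maximised log-likelihood."""
    mu = np.percentile(x, [25, 75]).astype(float)
    sigma = np.full(2, float(x.std()) + 1e-6)
    w = np.full(2, 0.5)
    for _ in range(iters):
        dens = (w[:, None] * np.exp(-0.5 * ((x - mu[:, None]) / sigma[:, None])**2)
                / sigma[:, None])
        r = dens / dens.sum(axis=0)               # E-step: responsibilities
        n = r.sum(axis=1) + 1e-12
        w = n / n.sum()                           # M-step updates
        mu = (r * x).sum(axis=1) / n
        sigma = np.sqrt((r * (x - mu[:, None])**2).sum(axis=1) / n) + 1e-6
    dens = (w[:, None] * np.exp(-0.5 * ((x - mu[:, None]) / sigma[:, None])**2)
            / (np.sqrt(2 * np.pi) * sigma[:, None]))
    return np.sum(np.log(dens.sum(axis=0)))

def bootstrap_lrt(x, n_boot=50, seed=0):
    """-2 log likelihood-ratio statistic for 1 vs 2 components, with its
    null distribution assessed by resampling under the fitted H0 model."""
    rng = np.random.default_rng(seed)
    mu0, s0, ll0 = loglik_one(x)
    lam = -2.0 * (ll0 - loglik_two(x))
    null = []
    for _ in range(n_boot):
        xb = rng.normal(mu0, s0, size=x.size)     # simulate under H0
        null.append(-2.0 * (loglik_one(xb)[2] - loglik_two(xb)))
    pval = float(np.mean(np.array(null) >= lam))
    return lam, pval
```

On clearly bimodal data the observed statistic dwarfs every bootstrap replicate, so the resampled p-value rejects the one-component model, which is the decision rule the abstract describes.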
Abstract:
Commercial explosives behave non-ideally in rock blasting. A direct and convenient measure of non-ideality is the detonation velocity. In this study, an alternative model fitted to experimental unconfined detonation velocity data is proposed and the effect of confinement on the detonation velocity is modelled. Unconfined data for several explosives showing various levels of non-ideality were successfully modelled. The effect of confinement on detonation velocity was modelled empirically based on field detonation velocity measurements. Confined detonation velocity is a function of the ideal detonation velocity, the unconfined detonation velocity at a given blasthole diameter, and rock stiffness. For a given explosive and charge diameter, as confinement increases, detonation velocity increases. The confinement model is implemented in a simple engineering-based non-ideal detonation model. A number of simulations are carried out and analysed to predict the explosive performance parameters for the adopted blasting conditions.
Abstract:
In this paper we propose a composite depth of penetration (DOP) approach to excluding bottom reflectance when mapping water quality parameters from Landsat thematic mapper (TM) data in the shallow coastal zone of Moreton Bay, Queensland, Australia. Three DOPs were calculated from TM1, TM2 and TM3, in conjunction with bathymetric data, at an accuracy ranging from +/-5% to +/-23%. These depths were used to segment the image into four DOP zones. Sixteen in situ water samples were collected concurrently with the recording of the satellite image. These samples were used to establish regression models for total suspended sediment (TSS) concentration and Secchi depth with respect to a particular DOP zone. The models contain identical bands and band transformations for both parameters; they are linear for TSS concentration and logarithmic for Secchi depth. Based on these models, TSS concentration and Secchi depth were mapped from the satellite image in the respective DOP zones. Their mapped patterns are consistent with the in situ observed ones. Spatially, overestimation and underestimation of the parameters are restricted to localised areas but related to the absolute value of the parameters. The mapping was accomplished more accurately using multiple DOP zones than using a single zone in shallower areas. The composite DOP approach enables the mapping to be extended to areas as shallow as <3 m. (C) 2004 Elsevier Inc. All rights reserved.
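The per-zone calibration step can be sketched as follows, assuming a single representative band value per sample; the particular bands and transformations are illustrative stand-ins, reproducing only the stated functional forms (linear for TSS concentration, logarithmic for Secchi depth).

```python
import numpy as np

def fit_zone_models(band, tss, secchi):
    """Fit two calibration models for one DOP zone, in the spirit of the
    stated forms: TSS linear in the band value, Secchi depth linear in
    log(band). Returns two prediction functions for the zone."""
    bt = np.polyfit(band, tss, 1)               # TSS = b1*band + b0
    bs = np.polyfit(np.log(band), secchi, 1)    # Secchi = b1*ln(band) + b0

    def predict_tss(r):
        return np.polyval(bt, r)

    def predict_secchi(r):
        return np.polyval(bs, np.log(r))

    return predict_tss, predict_secchi
```

In use, one pair of models would be fitted per DOP zone from the in situ samples falling in that zone, and each pixel would be predicted with the models of its own zone.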
Abstract:
Over the past 30 years, research in the area of applied behaviour analysis has led to a rich knowledge and understanding of the variables that influence human behaviour. This understanding and knowledge has given rise to a range of assessment and intervention techniques that have been applied to individuals with challenging behaviour. Interventions have produced changes in the severity and frequency of behaviours such as self-injury, aggression, and property destruction, and have also led to the acquisition of desired behaviours. While behaviour change has been achieved, families have expressed a desire for positive behaviour support approaches that adopt a family focus. Research and development of support frameworks that emphasise the interrelatedness of family members, and the child with a disability as part of his or her family, have gained prominence in the family systems literature. The present paper reviews some of the behaviourally based research in this area. Through the use of a case illustration, the authors discuss the links between behavioural support and family-centred support systems for children with developmental disabilities. Theoretical and practical implications are considered and areas for future research are highlighted.
Abstract:
Although low-density lipoprotein (LDL)-cholesterol lowering with the statins reduces the mortality and morbidity associated with coronary artery disease, considerable mortality and morbidity remains. Berberine upregulates the LDL receptor (LDLR) by a mechanism distinct from that of the statins, which involves stabilising the LDLR mRNA. In hamsters fed a high-fat and high-cholesterol diet for 2 weeks, the oral administration of berberine 100 mg/kg for 10 days reduced total serum cholesterol from ~4.8 to 2.7 mmol/l, and LDL-cholesterol from ~2.5 to 1.4 mmol/l. In subjects with hypercholesterolaemia, berberine hydrochloride (0.5 g b.i.d. for 3 months) reduced LDL-cholesterol (from 3.2 to 2.4 mmol/l) without any effect on high-density lipoprotein-cholesterol. Berberine also caused a reduction in triglyceride levels from 2.3 to 1.5 mmol/l. As berberine and statins both upregulate LDLR, their lipid-lowering profiles are similar. Thus, this mechanism is unlikely to make berberine an attractive alternative to statins for lipid lowering in most circumstances. However, the other effects of berberine (antihypertensive, inotropic and class III antiarrhythmic properties) may make it a useful agent in the treatment of cardiovascular disease.
Abstract:
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when regularity restrictions are imposed. (c) 2004 Elsevier B.V. All rights reserved.
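The constraint-handling idea, simulating from a posterior truncated to the theoretically admissible region, can be illustrated with a one-parameter toy model: a regression slope restricted to be non-negative, sampled by Metropolis-Hastings proposals that are rejected whenever they violate the constraint. This sketches the general mechanism only; the translog distance function, Gibbs steps and data augmentation of the paper are not reproduced.

```python
import numpy as np

def constrained_mh(x, y, n_draw=2000, step=0.1, seed=0):
    """Metropolis sampler for the slope b of y = b*x + eps, eps ~ N(0, 1),
    under the regularity constraint b >= 0 (flat prior on [0, inf)).
    Proposals violating the constraint get log-posterior -inf and are
    always rejected, so the chain samples the truncated posterior."""
    rng = np.random.default_rng(seed)

    def logpost(b):
        if b < 0.0:
            return -np.inf
        return -0.5 * np.sum((y - b * x)**2)

    b = max(0.0, float(np.sum(x * y) / np.sum(x * x)))  # start at clipped OLS
    lp = logpost(b)
    draws = np.empty(n_draw)
    for i in range(n_draw):
        prop = b + step * rng.standard_normal()         # random-walk proposal
        lpp = logpost(prop)
        if np.log(rng.random()) < lpp - lp:             # MH accept/reject
            b, lp = prop, lpp
        draws[i] = b
    return draws
```

When the unconstrained estimate would be negative (a "wrongly signed" parameter), the truncated posterior piles up near the boundary instead, which is the qualitative effect of imposing regularity restrictions.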
Abstract:
One of the obstacles to improved security of the Internet is ad hoc development of technologies with different design goals and different security goals. This paper proposes reconceptualizing the Internet as a secure distributed system, focusing specifically on the application layer. The notion is to redesign specific functionality, based on principles discovered in research on distributed systems in the decades since the initial development of the Internet. Because of the problems in retrofitting new technology across millions of clients and servers, any options with prospects of success must support backward compatibility. This paper outlines a possible new architecture for internet-based mail which would replace existing protocols by a more secure framework. To maintain backward compatibility, initial implementation could offer a web browser-based front end but the longer-term approach would be to implement the system using appropriate models of replication. (C) 2005 Elsevier Ltd. All rights reserved.