Abstract:
The technique of growing human leukaemic cells in diffusion chambers was developed to enable chemicals to be assessed for their ability to induce terminal differentiation. Growth of HL-60 promyelocytic leukaemia cells, in a Lucite chamber fitted with a Millipore filter, was optimised by use of a lateral incision site. Chambers were constructed with 0.45 µm filters and contained 150 µl of serum-free HL-60 cells at a density of 1 × 10⁶ cells/ml. The chambers were implanted into CBA/Ca mice, and spontaneous terminal differentiation of the cells to granulocytes was prevented by the use of serum-free medium. Under these conditions there was an initial growth lag of 72 hours and a logarithmic phase of growth for 96 hours; the cell number reached a plateau after 168 hours of culture in vivo. The amount of drug in the plasma of the animal, and in chambers that had been implanted for 5 days, was determined after a single i.p. injection of equitoxic doses of N-methylformamide, N-ethylformamide, tetramethylurea, N-dibutylformamide, N-tetramethylbutylformamide and hexamethylenebisacetamide. The concentrations of both TMU and HMBA attained in the plasma and in the chamber were pharmacologically effective for the induction of differentiation of HL-60 cells in vitro, namely 12 mM TMU and 5 mM HMBA. A 4-day regime of treatment of animals implanted with chambers demonstrated that TMU and HMBA induced terminal differentiation of 50% and 35%, respectively, of the implanted HL-60 cells to granulocyte-like cells, as assessed by measurement of functional and biochemical markers of maturity. None of the other agents attained concentrations in the plasma that were pharmacologically effective for the induction of differentiation of the cells in vitro, and none was able to induce terminal differentiation of the cells in vivo.
Abstract:
The soil-plant-moisture subsystem is an important component of the hydrological cycle. Over the last 20 or so years a number of computer models of varying complexity have represented this subsystem with differing degrees of success. The aim of the present work has been to improve and extend an existing model. The new model is less site-specific, allowing the simulation of a wide range of soil types and profiles. Several processes not included in the original model are simulated by the inclusion of new algorithms, including macropore flow, hysteresis and plant growth. Changes have also been made to the infiltration, water-uptake and water-flow algorithms. Using field data from various sources, regression equations have been derived which relate parameters in the suction-conductivity-moisture-content relationships to easily measured soil properties such as particle-size distribution data. Independent tests have been performed on laboratory data produced by Hedges (1989). The parameters found by regression for the suction relationships were then used in the equations describing the infiltration and macropore processes. An extensive literature review produced a new model for calculating plant growth from actual transpiration, which was itself partly determined by the root densities and leaf area indices derived by the plant growth model. The new infiltration model uses intensity/duration curves to disaggregate daily rainfall inputs into hourly amounts. The final model has been calibrated and tested against field data, and its performance compared to that of the original model. Simulations have also been carried out to investigate the effects of various parameters on infiltration, macropore flow, actual transpiration and plant growth. Qualitative comparisons have been made between these results and data given in the literature.
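The disaggregation step described above can be illustrated with a minimal sketch: a daily rainfall total is split into hourly amounts in proportion to a 24-value intensity profile. The profile and function name below are hypothetical stand-ins for the intensity/duration curves actually used in the model.

```python
def disaggregate_daily_rain(daily_mm, profile):
    """Split a daily rainfall total (mm) into hourly amounts in
    proportion to a 24-value relative-intensity profile.
    (Illustrative stand-in for intensity/duration-curve disaggregation.)"""
    if len(profile) != 24:
        raise ValueError("profile must have 24 hourly values")
    total = sum(profile)
    if total == 0:
        return [0.0] * 24
    return [daily_mm * p / total for p in profile]

# Example: a 12 mm day concentrated in a mid-morning storm burst.
hourly = disaggregate_daily_rain(12.0, [0] * 6 + [1, 2, 3, 3, 2, 1] + [0] * 12)
```

The hourly amounts always sum back to the daily input, whatever profile is assumed.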
Abstract:
Protein oxidation is thought to contribute to a number of inflammatory diseases, hence the development of sensitive and specific analytical techniques to detect oxidative PTMs (oxPTMs) in biological samples is highly desirable. Precursor ion scanning for fragment ions of oxidized amino acid residues was investigated as a label-free MS approach to mapping specific oxPTMs in a complex mixture of proteins. Using HOCl-oxidized lysozyme as a model system, it was found that the immonium ions of oxidized tyrosine and tryptophan formed in MS² analysis could not be used as diagnostic ions, owing to the occurrence of isobaric fragment ions from unmodified peptides. Using a double quadrupole linear ion trap mass spectrometer, precursor ion scanning was combined with detection of MS³ fragment ions from the immonium ions and collisionally-activated decomposition peptide sequencing to achieve selectivity for the oxPTMs. For chlorotyrosine, the immonium ion at 170.1 m/z fragmented to yield diagnostic ions at 153.1, 134.1, and 125.1 m/z, and the hydroxytyrosine immonium ion at 152.1 m/z gave diagnostic ions at 135.1 and 107.1 m/z. Selective MS³ fragment ions were also identified for 2-hydroxytryptophan and 5-hydroxytryptophan. The method was used successfully to map these oxPTMs in a mixture of nine proteins that had been treated with HOCl, thereby demonstrating its potential for application to complex biological samples.
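The diagnostic-ion logic above amounts to a lookup: an observed immonium m/z plus at least one matching MS³ fragment identifies the oxPTM. The table below holds only the m/z values quoted in the abstract; the matching function and its tolerance are illustrative assumptions, not part of the published method.

```python
# m/z values from the abstract: immonium ion and its MS3 diagnostic fragments.
DIAGNOSTIC_IONS = {
    "chlorotyrosine":  {"immonium": 170.1, "ms3": [153.1, 134.1, 125.1]},
    "hydroxytyrosine": {"immonium": 152.1, "ms3": [135.1, 107.1]},
}

def match_oxptm(immonium_mz, ms3_mzs, tol=0.1):
    """Hypothetical screening helper: return the oxPTM whose immonium ion
    matches the observed m/z and for which at least one observed MS3
    fragment matches a diagnostic ion, all within the given tolerance."""
    for name, ions in DIAGNOSTIC_IONS.items():
        if abs(immonium_mz - ions["immonium"]) > tol:
            continue
        if any(abs(obs - ref) <= tol for obs in ms3_mzs for ref in ions["ms3"]):
            return name
    return None
```

Requiring both levels of matching is what gives the MS³ approach its selectivity over immonium ions alone.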
Abstract:
This thesis presents a study of the sources of new product ideas and the development of new product proposals in an organisation in the UK computer industry. The thesis extends the work of von Hippel by showing how the phenomenon which he describes as "the Customer Active Paradigm for new product idea generation" can be observed to operate in this industry. Furthermore, this thesis contrasts his Customer Active Paradigm with the more usually encountered Manufacturer Active Paradigm. In a second area, the thesis draws a number of conclusions relating to methods of market research, confirming existing observations and demonstrating the suitability of flexible interview strategies in certain circumstances. The thesis goes on to demonstrate the importance of free information flow within the organisation, which makes it more likely that sought and unsought opportunities can be exploited. It is shown that formal information flows and documents are a necessary but not sufficient means of influencing the formation of the organisation's dominant ideas on new product areas. The findings also link the work of Tushman and Katz on the role of "gatekeepers" with the work of von Hippel by showing that the role of gatekeeper is particularly appropriate and useful to an organisation changing from Customer Active to Manufacturer Active methods of idea generation. Finally, the thesis provides conclusions relating to the exploitation of specific new product opportunities facing the sponsoring organisation.
Abstract:
The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to inadequate consideration of user interface design during development. From a human factors perspective the problem has stemmed from an overall lack of user-centred design principles. Consequently, the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development, and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement in a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead, concentration was given to the construction of the knowledge base and to prototype evaluation with the expert(s).
In response to this identified problem, a set of methods was developed that aimed to encourage developers to consider user interface requirements early in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and, within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the "classic" metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not using these tools already. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully-developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
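As a toy illustration of what "re-defining the metric counts" for a declarative language like Prolog might involve, the sketch below counts non-comment lines and clauses (terms terminated by a full stop) rather than raw source lines. This is a hypothetical simplification for illustration, not the metric definitions used in the thesis.

```python
import re

def prolog_size_metrics(source: str):
    """Toy size metrics for Prolog source: non-blank, non-comment lines
    ('loc') and a clause count (terms terminated by '.' at end of line)."""
    # Strip /* ... */ block comments, then % line comments.
    no_block = re.sub(r"/\*.*?\*/", "", source, flags=re.DOTALL)
    lines = [re.sub(r"%.*", "", ln) for ln in no_block.splitlines()]
    code_lines = [ln for ln in lines if ln.strip()]
    body = "\n".join(code_lines) + "\n"
    clauses = len(re.findall(r"\.\s*(?:\n|$)", body))
    return {"loc": len(code_lines), "clauses": clauses}

# Example: a two-clause predicate with a comment line.
src = """% list length
len([], 0).
len([_|T], N) :- len(T, M), N is M + 1.
"""
metrics = prolog_size_metrics(src)
```

Counting clauses rather than lines is one way a "size" metric can be made meaningful for a language whose programs are sets of facts and rules rather than sequences of statements.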
Abstract:
Random number generation is a central component of modern information technology, with crucial applications in ensuring communications and information security. The development of new physical mechanisms suitable for directly generating random bit sequences is thus a subject of intense current research, with particular interest in all-optical techniques suitable for generating data sequences at high bit rates. One promising technique that has received much recent attention is the chaotic semiconductor laser system, which produces high-quality random output as a result of the intrinsic nonlinear dynamics of its architecture [1]. Here we propose a novel complementary all-optical technique that might dramatically increase the generation rate of random bits by simultaneously using multiple spectral channels with uncorrelated signals, somewhat similar to the use of wavelength-division multiplexing in communications. We propose to exploit the intrinsic nonlinear dynamics of extreme spectral broadening and supercontinuum (SC) generation in optical fibre, a process known to be often associated with non-deterministic fluctuations [2]. In this paper, we report proof-of-concept results indicating that the fluctuations in highly nonlinear fibre SC generation can potentially be used for random number generation.
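A minimal sketch of how non-deterministic intensity fluctuations can be turned into random bits, assuming a record of pulse-energy samples is available: threshold each sample at the median, then apply von Neumann debiasing to reduce residual bias. The Gaussian samples below are a synthetic stand-in for measured SC pulse energies; the real extraction pipeline would differ.

```python
import random

def bits_from_fluctuations(samples):
    """Threshold fluctuating samples at their median to obtain raw bits,
    then apply von Neumann debiasing (pair 01 -> 0, pair 10 -> 1,
    discard 00 and 11) to reduce bias."""
    med = sorted(samples)[len(samples) // 2]
    raw = [1 if s > med else 0 for s in samples]
    out = []
    for a, b in zip(raw[::2], raw[1::2]):
        if a != b:
            out.append(a)
    return out

random.seed(0)
samples = [random.gauss(1.0, 0.3) for _ in range(1000)]  # stand-in for SC pulse energies
bits = bits_from_fluctuations(samples)
```

Debiasing discards roughly half of the pairs, which is one reason parallel spectral channels are attractive for raising the net bit rate.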
Abstract:
The current rate of global biodiversity loss led many governments to sign the international agreement ‘Halting Biodiversity Loss by 2010 and beyond’ in 2001. The UK government was one of these and has a number of methods to tackle this, such as commissioning specific technical guidance and supporting the UK Biodiversity Action Plan (BAP) targets. However, by far the most effective influence the government has upon current biodiversity levels is through the town planning system, owing to the control it has over all phases of a new development scheme’s lifecycle. There is an increasing myriad of regulations, policies and legislation dealing with biodiversity protection and enhancement across the hierarchical spectrum: from the global and European level down to regional and local levels. With these drivers in place, coupled with the promotion of benefits and incentives, increasing biodiversity value ought to be an achievable goal on most, if not all, development sites. In the professional world, however, this is not the case, owing to a number of obstructions. Many of these tend to be ‘process’ barriers, which are particularly prevalent in ‘urban’ and ‘major’ development schemes, and this is where the focus of this research paper lies. The paper summarises and discusses the results of a questionnaire survey, completed by Local Government Ecologists in England, regarding obstacles to maximising biodiversity enhancements on major urban development schemes. The paper additionally draws on insights from previous action research, specialist interviews and case studies to reveal the key process obstacles. Solutions to these obstacles are then outlined, and recommendations are made within the discussion.
Abstract:
A great number of strategy tools are taught in strategic management modules. These tools are available to managers for use in facilitating strategic decision making and enhancing the strategy development process in their organisations. A number of studies have been published examining which are the most popular tools; however, there is little empirical evidence on how their utilisation influences the strategy process. This paper is based on a large-scale international survey on the strategy development process, and seeks to examine the impact of a particular strategy tool, the Balanced Scorecard (BSC), upon that process. Recently, it has been suggested that, as a strategy tool, the BSC can influence all elements of the strategy process. The results of this study indicate that although there are significant differences in some elements of the strategy process between organisations that have implemented the BSC and those that have not, the impact is not comprehensive.
Abstract:
Mood-stabilising drugs such as lithium (LiCl) and valproic acid (VPA) are the first-line agents for treating conditions such as bipolar disorder and epilepsy. However, these drugs have potential developmental effects that are not fully understood. This study explores the use of a simple human neurosphere-based in vitro model to characterise the pharmacological and toxicological effects of LiCl and VPA, using gene expression changes linked to phenotypic alterations in cells. Treatment with VPA and LiCl resulted in the differential expression of 331 and 164 genes, respectively. In the subset of VPA-targeted genes, 114 were downregulated whilst 217 were upregulated. In the subset of LiCl-targeted genes, 73 were downregulated and 91 were upregulated. Gene Ontology (GO) term enrichment analysis was used to highlight the most relevant GO terms associated with a given gene list following toxin exposure. In addition, in order to phenotypically anchor the gene expression data, changes in the heterogeneity of cell subtype populations and cell cycle phase were monitored using flow cytometry. Whilst LiCl exposure did not significantly alter the proportion of cells expressing markers for stem cells/undifferentiated cells (Oct4, SSEA4), neurons (Neurofilament M), astrocytes (GFAP) or cell cycle phase, the drug caused a 1.4-fold increase in total cell number. In contrast, exposure to VPA resulted in significant upregulation of Oct4, SSEA4, Neurofilament M and GFAP, with significant decreases in both G2/M-phase cells and cell number. This neurosphere model might provide the basis of a human-based cellular approach for the regulatory exploration of the developmental impact of potentially toxic chemicals.
Abstract:
The contribution that inward foreign direct investment (FDI) makes to development has been examined in a number of contexts, including the relationship between inward FDI and new firm formation, growth, innovation, exports and competitiveness. However, no debate has proved so contentious, or so long-lasting, as that concerning the extent to which inward FDI stimulates productivity growth in the host country.
Abstract:
In this paper we use a comparative perspective to explore the ways in which institutions and networks have influenced entrepreneurial development in Russia. We utilize Global Entrepreneurship Monitor (GEM) data to study the effects of the weak institutional environment in Russia on entrepreneurship, comparing it first with all available GEM country samples and second, in more detail, with Brazil and Poland. Our results suggest that Russia's institutional environment is important in explaining its relatively low levels of entrepreneurship development, where the latter is measured in terms both of the number of start-ups and of existing business owners. In addition, Russia's business environment, and its consequences for the role of business networks, contributes to the relative advantage of entrepreneurial insiders (those already in business) over entrepreneurial outsiders (newcomers) in terms of new business start-ups.
Abstract:
Purpose – The purpose of this paper is to examine developments in the field of organizational change (OC) with reference to the context of India. It highlights the need to analyze this topic in the present Indian economic environment and discusses the main developments reported in the Indian literature on the subject. Design/methodology/approach – Empirical evidence is presented, based on a qualitative case study of a public-private partnership transformation at North Delhi Power Limited (NDPL) in India. Findings – The findings focus on trust building and belongingness for the employees, establishing a high-performance orientation, quality improvements, and the resultant transformations at NDPL. The analysis indicates a number of ways by which NDPL sought to improve its efficiency in order to better adapt to the rapidly changing Indian business environment. Practical implications – Based on the findings, the paper identifies key messages for policy makers and change agents regarding how to transform companies in the rapidly changing business contexts of emerging markets such as India. Originality/value – The paper offers an in-depth analysis of OC practices in a large organization in India.
Abstract:
Detection and interpretation of adverse signals during preclinical and clinical stages of drug development inform the benefit-risk assessment that determines suitability for use in real-world situations. This review considers some recent signals associated with diabetes therapies, illustrating the difficulties in ascribing causality and evaluating absolute risk, predictability, prevention, and containment. Individual clinical trials are necessarily restricted for patient selection, number, and duration; they can introduce allocation and ascertainment bias and they often rely on biomarkers to estimate long-term clinical outcomes. In diabetes, the risk perspective is inevitably confounded by emergent comorbid conditions and potential interactions that limit therapeutic choice, hence the need for new therapies and better use of existing therapies to address the consequences of protracted glucotoxicity. However, for some therapies, the adverse effects may take several years to emerge, and it is evident that faint initial signals under trial conditions cannot be expected to foretell all eventualities. Thus, as information and experience accumulate with time, it should be accepted that benefit-risk deliberations will be refined, and adjustments to prescribing indications may become appropriate. © 2013 by the American Diabetes Association.
Abstract:
Particle breakage due to fluid flow through various geometries can have a major influence on the performance of particle/fluid processes and on the product quality characteristics of particle/fluid products. In this study, whey protein precipitate dispersions were used as a case study to investigate the effect of flow intensity and exposure time on the breakage of these precipitate particles. Computational fluid dynamic (CFD) simulations were performed to evaluate the turbulent eddy dissipation rate (TED) and associated exposure time along various flow geometries. The focus of this work is on the predictive modelling of particle breakage in particle/fluid systems. A number of breakage models were developed to relate TED and exposure time to particle breakage. The suitability of these breakage models was evaluated for their ability to predict the experimentally determined breakage of the whey protein precipitate particles. A "power-law threshold" breakage model was found to provide a satisfactory capability for predicting the breakage of the whey protein precipitate particles. The whey protein precipitate dispersions were propelled through a number of different geometries such as bends, tees and elbows, and the model accurately predicted the mean particle size attained after flow through these geometries. © 2005 Elsevier Ltd. All rights reserved.
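One plausible reading of the "power-law threshold" breakage model described above is a breakage rate that is zero below a critical turbulent eddy dissipation rate and follows a power law above it. The functional form and parameter names below are illustrative assumptions for the sketch, not the fitted model from the study.

```python
def breakage_rate(eps, eps_crit, k, n):
    """Illustrative 'power-law threshold' form: no breakage below a
    critical turbulent eddy dissipation rate eps_crit, and a power-law
    dependence on the excess dissipation above it.
    k and n are fitted constants (hypothetical here)."""
    if eps <= eps_crit:
        return 0.0
    return k * (eps - eps_crit) ** n

def size_after_exposure(d0, eps, eps_crit, k, n, t):
    """Mean particle size after exposure time t, assuming the breakage
    rate acts as a simple first-order reduction (illustrative only)."""
    import math
    return d0 * math.exp(-breakage_rate(eps, eps_crit, k, n) * t)
```

A threshold term of this kind captures the observation that particles survive gentle flow regions unchanged, while breakage grows rapidly with TED and exposure time in bends, tees and elbows.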