14 results for Ligand-based methodologies

in Aston University Research Archive


Relevance: 80.00%

Abstract:

This review will discuss the use of manual grading scales, digital photography, and automated image analysis in the quantification of fundus changes caused by age-related macular disease. Digital imaging permits processing of images for enhancement, comparison, and feature quantification, and these techniques have been investigated for automated drusen analysis. The accuracy of automated analysis systems has been enhanced by the incorporation of interactive elements, such that the user is able to adjust the sensitivity of the system, or manually add and remove pixels. These methods capitalize on both computer and human image feature recognition and the advantage of computer-based methodologies for quantification. The histogram-based adaptive local thresholding system is able to extract useful information from the image without being affected by the presence of other structures. More recent developments involve compensation for fundus background reflectance, which has most recently been combined with the Otsu method of global thresholding. This method is reported to provide results comparable with manual stereo viewing. Developments in this area are likely to encourage wider use of automated techniques. This will make the grading of photographs easier and cheaper for clinicians and researchers. © 2007 Elsevier Inc. All rights reserved.
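
As background for readers unfamiliar with the technique, the sketch below shows Otsu's global thresholding applied after a crude background-reflectance correction. It is an illustrative reconstruction, not the published pipeline: the median-filter background estimate, the filter size, and the function names are assumptions.

```python
# Hypothetical sketch: Otsu global thresholding after a crude background
# compensation step. Illustrative only, not the published method.
import numpy as np
from scipy.ndimage import median_filter

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the threshold that maximises between-class variance (Otsu)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, 0.0
    w_bg, sum_bg = 0, 0.0
    for t in range(256):
        w_bg += hist[t]                 # background pixel count so far
        if w_bg == 0:
            continue
        w_fg = total - w_bg             # remaining foreground pixels
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def segment_drusen(fundus_gray: np.ndarray) -> np.ndarray:
    # Approximate background reflectance with a large median filter,
    # subtract it, then threshold globally (an assumption, for illustration).
    background = median_filter(fundus_gray.astype(float), size=51)
    flattened = np.clip(fundus_gray - background + 128, 0, 255).astype(np.uint8)
    return flattened > otsu_threshold(flattened)
```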

Relevance: 30.00%

Abstract:

Few works address methodological issues of how to conduct strategy-as-practice research, and even fewer focus on how to analyse the resulting data in ways that illuminate strategy as an everyday, social practice. We address this gap by proposing a quantitative method for analysing observational data, which can complement more traditional qualitative methodologies. We propose that rigorous but context-sensitive coding of transcripts can render everyday practice analysable statistically. Such statistical analysis provides a means for analytically representing patterns and shifts within the mundane, repetitive elements through which practice is accomplished. We call this approach the Event Database (EDB); it consists of five basic coding categories that help us capture the stream of practice. Indexing codes categorise the data, giving context and offering basic information about the event under discussion; they are descriptive codes that allow us to catalogue and classify events according to their assigned characteristics. Content codes concern the qualitative nature of the event, its essence; they provide a description that helps to inform judgements about the phenomenon. Nature codes distinguish between discursive and tangible events; we include this code to acknowledge that some events differ qualitatively from others. Type codes are abstracted from the data in order to classify events based on their description or nature; this involves significantly more judgement than the indexing codes but is consequently also more meaningful. Dynamics codes capture some of the movement or fluidity of events; this category has been included to let us capture the flow of activity over time.
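
To make the coding scheme concrete, here is a minimal sketch of what an EDB-style event record and a simple statistical summary might look like; the class name, field values, and example events are invented for illustration and are not taken from the authors' database.

```python
# Illustrative sketch of an EDB-style event record; the coding values
# below are hypothetical, not the authors' actual scheme.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    index: str     # indexing code: catalogues the event (e.g. meeting id)
    content: str   # content code: the qualitative essence of the event
    nature: str    # nature code: "discursive" or "tangible"
    type: str      # type code: classification abstracted from the data
    dynamics: str  # dynamics code: movement/flow of the event over time

stream = [
    Event("mtg-01", "budget debate", "discursive", "decision", "escalating"),
    Event("mtg-01", "slide handout", "tangible", "artefact-use", "stable"),
]

# A simple statistical summary of the coded stream (type frequencies).
print(Counter(e.type for e in stream))
```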

Relevance: 30.00%

Abstract:

This comparative study considers the main causative factors for change in recent years in the teaching of modern languages in England and France, and seeks to contribute, in a general sense, to the understanding of change in comparable institutions. In England by 1975 the teaching of modern languages in the comprehensive schools was seen to be inappropriate to the needs of children across the whole ability range. A combination of the external factor of the Council of Europe initiative in devising a needs-based learning approach for adult learners, and the internal factor of teacher-based initiatives in developing a graded-objectives learning approach for the less able, has reversed this situation to some extent. The study examines and evaluates this reversal and, in addition, assesses teachers' attitudes towards, and understanding of, the changes involved. In France the imposition of 'la réforme Haby' in 1977 and the creation of 'le collège unique' were the main external factors for change. The subsequent failure of the reform and the socialist government's support of decentralisation policies returning the initiative for renewal to schools are examined and evaluated, as are the internal factors for change in language teaching: 'groupes de niveau' and the creation of 'équipes pédagogiques'. In both countries changes in the function of examinations at 15/16 plus are examined. The final chapter compares the changes in the two education systems.

Relevance: 30.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
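
The idea of content data driving component-based runtime behaviour can be illustrated with a small sketch; the XML schema, component names, and registry mechanism below are invented for illustration and are not the Fluid project's actual API.

```python
# Hypothetical sketch of a data-driven component registry in the spirit
# described above; the schema and names are invented.
import xml.etree.ElementTree as ET

REGISTRY = {}

def component(name):
    """Register a component class under the name used by content data."""
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@component("particle-emitter")
class ParticleEmitter:
    def __init__(self, rate="10"):
        self.rate = float(rate)   # behaviour driven by data, not code

CONTENT = """
<scene>
  <component type="particle-emitter" rate="25"/>
</scene>
"""

def load_scene(xml_text):
    # Content data selects and configures components at runtime.
    root = ET.fromstring(xml_text)
    return [REGISTRY[node.get("type")](**{k: v for k, v in node.attrib.items()
                                          if k != "type"})
            for node in root.findall("component")]

scene = load_scene(CONTENT)
print(scene[0].rate)  # 25.0: the runtime behaviour came from the data
```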

Relevance: 30.00%

Abstract:

The primary objective of this research was to understand what kinds of knowledge and skills people use in 'extracting' relevant information from text and to assess the extent to which expert systems techniques could be applied to automate the process of abstracting. The approach adopted in this thesis is based on research in cognitive science, information science, psycholinguistics and textlinguistics. The study addressed the significance of domain knowledge and heuristic rules by developing an information extraction system, called INFORMEX. This system, which was implemented partly in SPITBOL, and partly in PROLOG, used a set of heuristic rules to analyse five scientific papers of expository type, to interpret the content in relation to the key abstract elements and to extract a set of sentences recognised as relevant for abstracting purposes. The analysis of these extracts revealed that an adequate abstract could be generated. Furthermore, INFORMEX showed that a rule based system was a suitable computational model to represent experts' knowledge and strategies. This computational technique provided the basis for a new approach to the modelling of cognition. It showed how experts tackle the task of abstracting by integrating formal knowledge as well as experiential learning. This thesis demonstrated that empirical and theoretical knowledge can be effectively combined in expert systems technology to provide a valuable starting approach to automatic abstracting.
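
A minimal sketch of heuristic, rule-based sentence extraction in the spirit of INFORMEX appears below; the cue phrases, weights, and function names are invented examples, and the original system's SPITBOL/PROLOG rules were considerably richer.

```python
# Toy sketch of rule-based extraction for abstracting; rules and weights
# are invented, not those of INFORMEX.
import re

CUE_PHRASES = ("this paper", "we propose", "results show", "in conclusion")

def score_sentence(sentence: str, position: int, total: int) -> float:
    s = sentence.lower()
    score = 0.0
    if any(cue in s for cue in CUE_PHRASES):
        score += 2.0                      # rule: cue phrases signal relevance
    if position == 0 or position == total - 1:
        score += 1.0                      # rule: opening/closing sentences matter
    if len(sentence.split()) < 8:
        score -= 1.0                      # rule: very short sentences rarely help
    return score

def extract(text: str, k: int = 3) -> list[str]:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    ranked = sorted(enumerate(sentences),
                    key=lambda p: score_sentence(p[1], p[0], len(sentences)),
                    reverse=True)
    keep = sorted(i for i, _ in ranked[:k])   # restore document order
    return [sentences[i] for i in keep]
```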

Relevance: 30.00%

Abstract:

Four novel mononuclear coordination compounds, namely [Fe(Hthpy)2](SO4)1/2·3.5H2O 1, [Fe(Hthpy)2]NO3·3H2O 2, [Fe(H2mthpy)2](CH3C6H4SO3)3·CH3CH2OH 3 and [Fe(Hethpy)(ethpy)]·8H2O 4 (H2thpy = pyridoxalthiosemicarbazone, H2mthpy = pyridoxal-4-methylthiosemicarbazone, H2ethpy = pyridoxal-4-ethylthiosemicarbazone), were synthesized in the absence or presence of an organic base, Et3N or NH3. Compounds 1 and 2 are monocationic and were prepared using the singly deprotonated form of pyridoxalthiosemicarbazone. Both compounds crystallise in the monoclinic system, in the C2/c and P21/c space groups for 1 and 2, respectively. Complex 3 is tricationic; it is formed from the neutral bis(ligand) complex and possesses an interesting 3D channel architecture, with a triclinic unit cell in the P1 space group. For complex 4 the pH value plays an important role during synthesis; 4 is neutral and crystallises with two inequivalent forms of the ligand, the singly and the doubly deprotonated chelates of H2ethpy, in a monoclinic unit cell with the C2/c space group. Notably, in 1 and 4 there is an attractive infinite three-dimensional hydrogen-bonding network in the crystal lattice. Magnetic measurements of 1 and 4 revealed that a rather steep spin transition from the low-spin to the high-spin Fe(III) state occurs above 300 K on the first heating step. This transition is accompanied by the elimination of solvate molecules, which stabilizes the high-spin form through the breaking of the hydrogen-bonding networks; in contrast, 2 and 3 keep their low-spin state up to 400 K.

Relevance: 30.00%

Abstract:

This study highlights the potential of analytical methods based on Knowledge Discovery in Databases (KDD) methodologies to aid both the resolution of unstructured marketing/business problems and the process of scholarly knowledge discovery. The authors present and discuss the application of KDD in these situations before presenting an analytical method, based on fuzzy logic and evolutionary algorithms, developed to analyze marketing databases and uncover relationships among variables. A detailed implementation on a pre-existing data set illustrates the method. © 2012 Published by Elsevier Inc.
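
As a toy illustration of combining fuzzy logic with an evolutionary search, the sketch below evolves the break-points of a single fuzzy rule against an invented four-record data set; the encoding, fitness function, and data are all assumptions, and the published method is considerably more elaborate.

```python
# Toy sketch: evolutionary search over fuzzy-rule parameters on invented
# data. Not the authors' algorithm; illustrative only.
import random

DATA = [(0.2, 0), (0.4, 0), (0.6, 1), (0.9, 1)]   # (spend_score, responded)

def mu_high(x, a, b):
    """Membership of x in the fuzzy set 'high', rising from a to b."""
    if x <= a: return 0.0
    if x >= b: return 1.0
    return (x - a) / (b - a)

def fitness(genome):
    a, b = sorted(genome)
    # Rule: IF spend is high THEN customer responds; score the agreement.
    return sum(1 - abs(mu_high(x, a, b) - y) for x, y in DATA)

pop = [[random.random(), random.random()] for _ in range(20)]
for _ in range(50):                       # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                    # keep the fittest half
    pop = parents + [[p + random.gauss(0, 0.1) for p in random.choice(parents)]
                     for _ in range(10)]  # mutated copies of the fittest
best = max(pop, key=fitness)
print("evolved membership break-points:", sorted(best))
```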

Relevance: 30.00%

Abstract:

Introduction - In recent years much progress has been made in the development of tools for systems biology to study the levels of mRNA and protein, and their interactions within cells. However, few multiplexed methodologies are available to study cell signalling directly at the transcription factor level. Methods - Here we describe a sensitive, plasmid-based RNA reporter methodology to study transcription factor activation in mammalian cells, and apply this technology to profiling 60 transcription factors in parallel. The methodology uses two robust and easily accessible detection platforms: quantitative real-time PCR for quantitative analysis, and DNA microarrays for parallel, higher-throughput analysis. Findings - We test the specificity of the detection platforms with ten inducers and independently validate the transcription factor activation. Conclusions - We report a methodology for the multiplexed study of transcription factor activation in mammalian cells that is direct and not theoretically limited by the number of available reporters.
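
For readers unfamiliar with qPCR analysis, the standard 2^-ΔΔCt relative-quantification calculation is sketched below as a generic illustration; it is a textbook method and not necessarily the exact analysis used in this study, and the function name and example Ct values are invented.

```python
# Standard 2^-ddCt relative quantification, commonly used with qPCR
# reporter data; a generic illustration, not necessarily this study's
# analysis pipeline.
def fold_activation(ct_reporter_treated: float, ct_ref_treated: float,
                    ct_reporter_control: float, ct_ref_control: float) -> float:
    d_ct_treated = ct_reporter_treated - ct_ref_treated   # normalise to reference gene
    d_ct_control = ct_reporter_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** -dd_ct

# Example: the reporter Ct drops by 2 cycles after induction,
# giving a 4-fold activation.
print(fold_activation(22.0, 18.0, 24.0, 18.0))  # 4.0
```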

Relevance: 30.00%

Abstract:

Relationship-based approaches to leadership (e.g., Leader–Member Exchange theory) currently represent one of the most popular approaches to understanding workplace leadership. Although the concept of “relationship” is central to these approaches, generally this has not been well articulated and is often conceptualized simply in terms of relationship quality between the leader and the follower. In contrast, research in the wider relationship science domain provides a more detailed exposition of relationships and how they form and develop. We propose that research and methodology developed in relationship science (i.e., close relationships) can enhance understanding of the leader–follower relationship and therefore advance theory in this area. To address this issue, we organize our review in two areas. First, we examine how a social cognitive approach to close relationships can benefit an understanding of the leader–follower relationship (in terms of structure, content, and processes). Second, we show how the research designs and methodologies that have been developed in relationship science can be applied to understand better the leader–follower relationship. The cross-fertilization of research from the close relationships literature to understanding the leader–follower relationship provides new insights into leadership processes and potential avenues for further research. Copyright © 2013 John Wiley & Sons, Ltd.

Relevance: 30.00%

Abstract:

Guest editorial

Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His research interests include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and has been Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He is on the editorial board of several international journals and is co-founder of Performance Improvement Management Software.

William Ho is a Senior Lecturer at Aston Business School, Aston University. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. He is currently a Scholar of the Advanced Institute of Management Research.

Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, with the aim of bridging the gap between industry and academia. After a rigorous refereeing process, seven papers were included. The volume opens with five data envelopment analysis (DEA)-based papers.

Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the eastern provinces were relatively and technically more efficient, whereas the western provinces had the highest growth rate in the period studied.

Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, bootstrapping is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author reveals from the sample that the non-lignite-fired stations are on average more efficient than the lignite-fired stations.

Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures that estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while accounting for undesirable outputs. Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best, and that about 60 percent of the refineries in the sample could improve their efficiency further.

H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA) to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model.

Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model analysing cost-only efficiency. The second model incorporates (inverse) quality by modelling total customer minutes lost as an input. The third model uses total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as outputs in the models.

After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to evaluate energy and climate policy. The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritization. Stakeholders and experts in the energy policy field are incorporated into the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic impact, and competitiveness and technology.

Finally, Borge Hess applies stochastic frontier analysis to examine the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on firm efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant changes in firm efficiency following an acquisition, and only weak evidence of efficiency improvements brought by the new shareholder. Moreover, parent companies appear not to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline company.

To conclude, we are grateful to all the authors for their contributions and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
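
Since five of the papers build on DEA, a minimal sketch of the input-oriented CCR envelopment model may be a useful reference; the three-unit data set below is invented, and the formulation is the textbook linear programme rather than any specific model from the special issue.

```python
# Sketch of an input-oriented CCR DEA model solved as a linear programme
# with SciPy; the tiny data set is invented for illustration.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])  # units x inputs
Y = np.array([[1.0], [1.0], [1.0]])                  # units x outputs

def ccr_efficiency(k: int) -> float:
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: theta, lambda_1..lambda_n
    c = np.zeros(n + 1)
    c[0] = 1.0                                      # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):                              # sum(l_j * x_ij) <= theta * x_ik
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                              # sum(l_j * y_rj) >= y_rk
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[k, r])
    bounds = [(None, None)] + [(0, None)] * n       # lambdas non-negative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun                                  # efficiency score in (0, 1]

for k in range(3):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```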

Relevance: 30.00%

Abstract:

The potential benefits of implementing Component-Based Development (CBD) methodologies in a globally distributed environment are many. Lessons from the aeronautics, automotive, electronics and computer hardware industries, in which Component-Based (CB) architectures have been successfully employed for setting up globally distributed design and production activities, have consistently shown that firms have managed to increase the rate of reused components and sub-assemblies, and to speed up the design and production process of new products.

Relevance: 30.00%

Abstract:

A significant body of scholarly and practitioner-based research has developed in recent years that seeks both to theorize upon and to empirically measure the competitiveness of regions. However, the disparate and fragmented nature of this work has left the various analyses and measurement methodologies without a substantive theoretical foundation. The aim of this paper is to place the regional competitiveness discourse within the context of theories of economic growth, and more particularly those concerning regional economic growth. It is argued that regional competitiveness models are usually implicitly constructed in the lineage of endogenous growth frameworks, whereby deliberate investments in factors such as human capital and knowledge are considered to be key drivers of growth differentials. This leads to the suggestion that regional competitiveness can be usefully defined as the capacity and capability of regions to achieve economic growth relative to other regions at a similar overall stage of economic development, usually within their own nation or continental bloc. The paper then assesses future avenues for theoretical and methodological exploration, highlighting the role of institutions, resilience, and well-being in understanding how the competitiveness of regions influences their long-term evolution.
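
The endogenous growth lineage referred to above can be made concrete with a textbook human-capital-augmented production function, offered purely as illustration rather than as a model taken from the paper:

```latex
% Illustrative human-capital-augmented production function (a standard
% endogenous-growth form, not a model from the paper itself): regional
% output Y_r depends on physical capital K_r, human capital H_r, labour
% L_r, and a knowledge/technology term A_r that is itself raised by
% deliberate investment.
Y_r = K_r^{\alpha} \, H_r^{\beta} \, (A_r L_r)^{1-\alpha-\beta},
\qquad \alpha + \beta < 1
```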

Relevance: 30.00%

Abstract:

There has been increasing interest in the use of agent-based simulation, and some discussion of the relative merits of this approach compared to discrete-event simulation. Views differ on whether agent-based simulation offers capabilities that discrete-event simulation cannot provide, or whether all agent-based applications can, at least in theory, be undertaken using a discrete-event approach. This paper presents a simple agent-based NetLogo model and corresponding discrete-event versions implemented in the widely used ARENA software. The two discrete-event versions use, respectively, the traditional process-flow approach normally adopted in discrete-event simulation software and an agent-based approach to the model build. In addition, a real-time spatial visual display facility is provided using a spreadsheet platform controlled by VBA code embedded within the ARENA model. Initial findings from this investigation are that discrete-event simulation can indeed be used to implement agent-based models and, with suitable integration elements such as VBA, can provide the spatial displays associated with agent-based software.
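
For readers who have not seen an agent-based model expressed in code, the sketch below gives a minimal Python stand-in for the kind of model discussed; the grid, movement rule, and interaction counting are invented and do not reproduce the paper's NetLogo or ARENA implementations.

```python
# Minimal agent-based simulation loop (invented dynamics: agents wander
# a toroidal grid and interact when they occupy the same cell).
import random

class Agent:
    def __init__(self, x, y):
        self.x, self.y, self.meetings = x, y, 0

    def step(self, size):
        # Each tick the agent moves one cell in a random direction.
        self.x = (self.x + random.choice((-1, 0, 1))) % size
        self.y = (self.y + random.choice((-1, 0, 1))) % size

def run(n_agents=20, size=10, ticks=100):
    agents = [Agent(random.randrange(size), random.randrange(size))
              for _ in range(n_agents)]
    for _ in range(ticks):
        for a in agents:
            a.step(size)
        cells = {}
        for a in agents:                  # group agents by grid cell
            cells.setdefault((a.x, a.y), []).append(a)
        for group in cells.values():
            if len(group) > 1:            # co-located agents interact
                for a in group:
                    a.meetings += 1
    return sum(a.meetings for a in agents)

print("total meetings:", run())
```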