29 results for Multiple correspondence analysis
Abstract:
The research compares the usefulness of four remote sensing information sources: LANDSAT photographic prints, LANDSAT computer compatible tapes, Metric Camera photographic prints and SIR-A photographic prints. These sources provide evaluations of the catchment characteristics of the Belize and Sibun river basins in Central America. Evaluations from maps at 1:250,000 scale are compared with those from the remotely sensed information sources at the same scale. The values of catchment characteristics from both maps and LANDSAT prints are used in multiple regression analysis to derive flood flow formulae, after investigations are made to provide a suitable dependent-variable discharge series from short-term records. The use of all the remotely sensed information sources for evaluating catchment characteristics is discussed. LANDSAT prints and computer compatible tapes of a post-flood scene are used to estimate flood distributions and volumes. These are compared with values obtained from unit hydrograph analysis, using the dependent discharge series, to evaluate the probable losses from the Belize river to the floodplain and thereby assess the accuracy of the LANDSAT estimates. Information relating to flood behaviour is discussed in terms of basic image presentation as well as image processing. A cost analysis of the purchase and use of all materials is provided. The research concludes that LANDSAT print material may provide information suitable for regression analysis at levels of accuracy as great as those of topographic maps, that the differing information sources are each uniquely applicable, and that accurate estimates of flood volumes may be determined even from post-flood imagery.
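A minimal sketch of the multiple regression step described above: fitting a flood-flow variable to catchment characteristics by ordinary least squares. The predictor names and figures below are invented for illustration, not taken from the thesis.

```python
# Hypothetical example: regress peak discharge on catchment
# characteristics with ordinary least squares.
import numpy as np

# Each row: [area_km2, main_stream_slope, forest_fraction] per catchment
X = np.array([
    [120.0, 0.012, 0.65],
    [340.0, 0.008, 0.40],
    [ 85.0, 0.020, 0.72],
    [510.0, 0.005, 0.30],
])
q_peak = np.array([95.0, 210.0, 70.0, 290.0])  # peak discharge, m^3/s

A = np.column_stack([np.ones(len(X)), X])      # add an intercept column
coef, *_ = np.linalg.lstsq(A, q_peak, rcond=None)
print("intercept and coefficients:", coef)
```

In this spirit, formulae fitted from map-derived and LANDSAT-derived characteristic values can be compared directly.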
Abstract:
This exploratory study is concerned with the integrated appraisal of multi-storey dwelling blocks that incorporate large concrete panel systems (LPS). The first step was to examine the U.K. multi-storey dwelling stock in general, and the stock under the management of Birmingham City Council in particular. The information was taken from the databases of three departments in the City of Birmingham and rearranged, for clarity and analysis, in a new database using a suite of PC software called 'PROXIMA'. One hundred of the blocks in the Council's stock were built using large concrete panel systems. Thirteen LPS blocks were chosen as case studies, mainly on the basis of block height and age. A new integrated appraisal technique was created for LPS dwelling blocks, which takes into account the main physical and social factors affecting the condition and acceptability of these blocks. The technique is built up in hierarchical form, moving from the general approach to particular elements (a tree model), and comprises two main approaches: physical and social. In the physical approach, the building is viewed as a series of manageable elements and sub-elements covering every physical or environmental factor of the block, through which the condition of the block is analysed. A quality score system was developed which depends mainly on the qualitative and quantitative condition of each category in the appraisal tree model, and leads to a physical ranking order of the study blocks. In the social approach, residents' satisfaction with, and attitude toward, their multi-storey dwelling block were analysed in relation to: a. biographical and housing-related characteristics; and b. social, physical and environmental factors associated with this sort of dwelling, block and estate in general. The random sample consisted of 268 residents living in the 13 case-study blocks. The data collected were analysed using frequency counts, percentages, means, standard deviations, Kendall's tau, r correlation coefficients, t-tests, analysis of variance (ANOVA) and multiple regression analysis. The analysis showed a marginally positive satisfaction and attitude towards living in the block. The five most significant factors associated with residents' satisfaction and attitude, in descending order, were: the estate in general; the service categories in the block, including the heating system and lift services; vandalism; the neighbours; and the security system of the block. An important attribute of this method is that it is relatively inexpensive to implement, especially when compared with the alternatives adopted by some local authorities and the BRE. It is designed to save time, money and effort, to aid decision making, and to provide a ranked priority order for the multi-storey dwelling stock, among many other advantages. A series of solution options to the problems of the blocks was sought for selection and testing before implementation. Traditional solutions have usually resulted in either demolition or costly physical maintenance and social improvement of the blocks. However, a new solution has emerged which is particularly suited to structurally sound units: 're-cycling', which might incorporate the reuse of an entire block or part of it by removing panels, slabs and so forth from the upper floors in order to reconstruct them as low-rise accommodation.
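The hierarchical (tree) appraisal model lends itself to a simple weighted roll-up. The sketch below assumes a made-up two-branch tree (physical and social) with hypothetical weights and scores; the actual elements and weighting scheme of the appraisal technique are not reproduced here.

```python
# Minimal sketch of a hierarchical quality-score roll-up over an
# appraisal tree. Element names, weights and scores are hypothetical.
def roll_up(node):
    """Weighted score of a node: leaves carry surveyed scores,
    internal nodes aggregate their weighted children."""
    if "score" in node:
        return node["score"]
    return sum(w * roll_up(child) for w, child in node["children"])

block_a = {
    "children": [
        (0.6, {"children": [            # physical approach
            (0.5, {"score": 3.0}),      # e.g. structural condition (0-5)
            (0.5, {"score": 4.0}),      # e.g. services condition
        ]}),
        (0.4, {"score": 2.5}),          # social approach (survey-derived)
    ]
}
print("block quality score:", roll_up(block_a))  # 3.1 with these weights
```

Scoring every block this way yields the kind of ranking order the study uses to prioritise the stock.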
Abstract:
Guest editorial

Ali Emrouznejad is a Senior Lecturer at the Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Management Sector. He is on the editorial board of several international journals and is co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at the Aston University Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.

Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied research to bridge the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimations and confidence intervals for the point estimates. The author finds from the sample that the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of the oil refineries in the sample could improve their efficiencies further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality by modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and the number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria analysis weighting approach to evaluate energy and climate policy. The proposed approach is akin to the analytic hierarchy process, which consists of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition, and only weak evidence of efficiency improvements caused by the new shareholder. The author also discovers that parent companies appear not to influence a subsidiary's efficiency positively. In addition, the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
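For readers unfamiliar with DEA, the sketch below computes an input-oriented CCR efficiency score as a small linear program, the basic building block behind the models surveyed above. The data are invented and the formulation is the textbook envelopment form, not any particular paper's model.

```python
# Input-oriented CCR DEA (envelopment form), solved per DMU with
# scipy.optimize.linprog. Made-up data: rows are DMUs.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0]])  # inputs
Y = np.array([[1.0],      [2.0],      [1.5]])       # outputs

def ccr_efficiency(o, X, Y):
    n, m = X.shape                       # DMUs, inputs
    s = Y.shape[1]                       # outputs
    c = np.zeros(n + 1); c[0] = 1.0      # variables: [theta, lambdas]
    A_ub, b_ub = [], []
    for i in range(m):                   # sum_j lam_j x_ij <= theta x_io
        A_ub.append([-X[o, i]] + list(X[:, i])); b_ub.append(0.0)
    for r in range(s):                   # sum_j lam_j y_rj >= y_ro
        A_ub.append([0.0] + list(-Y[:, r])); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun                       # theta* = 1 means efficient

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```

Extensions such as the Malmquist index, bootstrapped estimates, and undesirable outputs used in the papers above all build on this core program.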
Abstract:
The ageing process is strongly influenced by nutrient balance, such that modest calorie restriction (CR) extends lifespan in mammals. Irisin, a newly described hormone released from skeletal muscles after exercise, may induce CR-like effects by increasing adipose tissue energy expenditure. Using telomere length as a marker of ageing, this study investigates associations between body composition, plasma irisin levels and peripheral blood mononuclear cell telomere length in healthy, non-obese individuals. Segmental body composition (by bioimpedance), telomere length and plasma irisin levels were assessed in 81 healthy individuals (age 43 ± 15.8 years, BMI 24.3 ± 2.9 kg/m²). Data showed significant correlations between log-transformed relative telomere length and age (p < 0.001), height (p = 0.045), total body fat percentage (p = 0.031), abdominal fat percentage (p = 0.038), visceral fat level (p < 0.001), plasma leptin (p = 0.029) and plasma irisin (p = 0.011). Multiple regression analysis using backward elimination revealed that relative telomere length can be predicted by age (b = -0.00735, p = 0.001) and plasma irisin levels (b = 0.04527, p = 0.021). These data support the view that irisin may have a role in the modulation of both energy balance and the ageing process. © 2014 The Author(s).
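The backward-elimination step is easy to reproduce. Here is a hedged sketch on synthetic data standing in for the study's predictors (age, irisin, body fat); the coefficients and the p = 0.05 retention threshold are illustrative assumptions, not the paper's exact procedure.

```python
# Multiple regression with backward elimination on synthetic data,
# mimicking the analysis described above (statsmodels for p-values).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 81
age    = rng.uniform(20, 70, n)
irisin = rng.normal(400, 80, n)
fatpct = rng.uniform(15, 35, n)
log_tl = 2.0 - 0.007 * age + 0.0005 * irisin + rng.normal(0, 0.1, n)

X = sm.add_constant(np.column_stack([age, irisin, fatpct]))
names = ["const", "age", "irisin", "fatpct"]

while True:
    model = sm.OLS(log_tl, X).fit()
    pvals = model.pvalues[1:]            # never consider the intercept
    if pvals.size == 0 or pvals.max() <= 0.05:
        break                            # all remaining predictors kept
    worst = int(np.argmax(pvals)) + 1    # +1 offsets the intercept column
    X = np.delete(X, worst, axis=1)
    names.pop(worst)

print(dict(zip(names, model.params)))    # e.g. fatpct is eliminated
```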
Abstract:
OBJECTIVE: To investigate laboratory evidence of abnormal angiogenesis, hemorheologic factors, and endothelial damage/dysfunction in age-related macular degeneration (ARMD). DESIGN: Comparative cross-sectional study. PARTICIPANTS: We studied 78 subjects (26 men and 52 women; mean age, 74 years; standard deviation [SD], 9.0) with ARMD attending a specialist referral clinic. Subjects were compared with 25 healthy controls (mean age, 71 years; SD, 11). INTERVENTION AND OUTCOME MEASURES: Levels of vascular endothelial growth factor (VEGF, an index of angiogenesis), hemorheologic factors (plasma viscosity, hematocrit, white cell count, hemoglobin, platelets), fibrinogen (an index of rheology and hemostasis), and von Willebrand factor (a marker of endothelial dysfunction) were measured. RESULTS: Median plasma VEGF (225 vs. 195 pg/ml, P = 0.019) and mean von Willebrand factor (124 vs. 99 IU/dl, P = 0.0004) were greater in ARMD subjects than in controls. Mean plasma fibrinogen and plasma viscosity levels were also higher in the subjects (both P < 0.0001). There were no significant differences in other indices between cases and controls. When "dry" (drusen, atrophy; n = 28) and "exudative" (n = 50) ARMD subjects were compared, there were no significant differences in VEGF, fibrinogen, viscosity, or von Willebrand factor levels. There were no significant correlations between the measured parameters. Stepwise multiple regression analysis did not demonstrate any significant clinical predictors (age, gender, smoking, body mass index, history of vascular disease, or hypertension) of plasma VEGF or fibrinogen levels, although smoking status was a predictor of plasma von Willebrand factor levels (P < 0.05). CONCLUSIONS: This study suggests an association between markers of angiogenesis (VEGF), hemorheologic factors, hemostasis, endothelial dysfunction, and ARMD. The interaction between abnormal angiogenesis and the components of Virchow's triad for thrombogenesis may in part contribute to the pathogenesis of ARMD.
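As an illustration of the case-control comparisons reported above (a median compared with a rank test, a mean with a t-test), here is a sketch on simulated data. The abstract does not state which tests produced each P value, so the choice of tests here is an assumption, and the numbers are invented.

```python
# Simulated case-control comparison of a skewed marker (VEGF) and a
# roughly normal one (von Willebrand factor).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
vegf_armd    = rng.lognormal(mean=5.4, sigma=0.4, size=78)
vegf_control = rng.lognormal(mean=5.2, sigma=0.4, size=25)
vwf_armd     = rng.normal(124, 30, size=78)
vwf_control  = rng.normal(99, 25, size=25)

u_stat, p_vegf = stats.mannwhitneyu(vegf_armd, vegf_control)
t_stat, p_vwf  = stats.ttest_ind(vwf_armd, vwf_control)
print(f"VEGF (Mann-Whitney): p = {p_vegf:.4f}")
print(f"vWF  (t-test):       p = {p_vwf:.4f}")
```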
Abstract:
Aims: Obesity and Type 2 diabetes are associated with accelerated ageing. The underlying mechanisms, however, are poorly understood. In this study, we investigated the association between circulating irisin, a novel myokine involved in energy regulation, and telomere length (TL), a marker of ageing, in healthy individuals and individuals with Type 2 diabetes. Methods: Eighty-two healthy people and 67 subjects with Type 2 diabetes were recruited to this cross-sectional study. Anthropometric measurements, including body composition measured by bioimpedance, were recorded. Plasma irisin was measured by ELISA on a fasted blood sample. Relative TL was determined using real-time PCR. Associations between anthropometric measures and irisin and TL were explored using Pearson's bivariate correlations. Multiple regression with backward elimination was used to explore all the significant predictors of TL. Results: In healthy individuals, chronological age was a strong negative predictor of TL (r = -0.552, p < 0.001). Multiple regression analysis using backward elimination (excluding age) revealed that greater relative TL could be predicted by greater total muscle mass (b = 0.046, p = 0.001), less visceral fat (b = -0.183, p < 0.001) and higher plasma irisin levels (b = 0.01, p = 0.027). There were no significant associations between chronological age, plasma irisin, anthropometric measures and TL in patients with Type 2 diabetes (p > 0.1). Conclusion: These data support the view that body composition and plasma irisin may have a role in the modulation of energy balance and the ageing process in healthy individuals. This relationship is altered in individuals with Type 2 diabetes.
Abstract:
This thesis addressed the problem of risk analysis in mental healthcare, with respect to the GRiST project at Aston University. That project provides a risk-screening tool based on the knowledge of 46 experts, captured as mind maps that describe relationships between risks and patterns of behavioural cues. Mind mapping, though, fails to impose control over content, and is not considered to formally represent knowledge. In contrast, this thesis treated GRiST's mind maps as a rich knowledge base in need of refinement; that process drew on existing techniques for designing databases and knowledge bases. Identifying well-defined mind map concepts, though, was hindered by spelling mistakes, and by ambiguity and lack of coverage in the tools used for researching words. A novel use of the Edit Distance overcame those problems, by assessing similarities between mind map texts, and between spelling mistakes and suggested corrections. That algorithm further identified stems, the shortest text strings found in related word-forms. As opposed to existing approaches' reliance on built-in linguistic knowledge, this thesis devised a novel, more flexible text-based technique. An additional tool, Correspondence Analysis, found patterns in word usage that allowed machines to determine likely intended meanings for ambiguous words. Correspondence Analysis further produced clusters of related concepts, which in turn drove the automatic generation of novel mind maps. Such maps underpinned adjuncts to the mind mapping software used by GRiST; one such new facility generated novel mind maps to reflect the collected expert knowledge on any specified concept. Mind maps from GRiST are stored as XML, which suggested storing them in an XML database. In fact, the entire approach here is "XML-centric", in that all stages rely on XML as far as possible. An XML-based query language allows users to retrieve information from the mind map knowledge base. The approach, it was concluded, will prove valuable to mind mapping in general, and to detecting patterns in any type of digital information.
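The Edit Distance at the heart of the spelling-correction work is the standard Levenshtein dynamic programme. A minimal version follows; the example word pair is invented.

```python
# Classic Levenshtein edit distance between two strings.
def edit_distance(a: str, b: str) -> int:
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                      # delete everything from a
    for j in range(n + 1):
        d[0][j] = j                      # insert everything from b
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

print(edit_distance("anxeity", "anxiety"))  # a plausible slip -> 2
```

Ranking candidate corrections by this distance is one way such a tool can propose the most similar well-formed word.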
Abstract:
The thesis examines Kuhn's (1962, 1970) concept of paradigm, assesses how it is employed for mapping intellectual terrain in the social sciences, and evaluates its use in research based on multiple theory positions. In so doing it rejects both the thesis of total paradigm 'incommensurability' (Kuhn, 1962) and that of liberal 'translation' (Popper, 1970), in favour of a middle ground through the 'language-game of everyday life' (Wittgenstein, 1953). The thesis ultimately argues for the possibility of being 'trained-into' new paradigms, given the premise that 'unorganised experience cannot order perception' (Phillips, 1977). In conducting multiple paradigm research, the analysis uses the Burrell and Morgan (1979) model to examine the work organisation of a large provincial fire service. This analysis comprises, firstly, a 'functionalist' assessment of work design, demonstrating inter alia the decrease in reported motivation with length of service; secondly, an 'interpretive' portrayal of the daily accomplishment of task routines, highlighting the discretionary and negotiated nature of the day's events; thirdly, a 'radical humanist' analysis of workplace ideology, demonstrating the hegemonic role of officer training practices; and finally, a 'radical structuralist' description of the labour process, focusing on the establishment of a 'normal working day'. Although the argument is made for the possibility of conducting multiple paradigm research, the conclusion stresses the many institutional pressures serving to offset such development.
Abstract:
This study proposes an integrated analytical framework for effective management of project risks, combining a multiple-criteria decision-making technique with decision tree analysis. First, a conceptual risk management model was developed through a thorough literature review. The model was then applied through action research on a petroleum oil refinery construction project in the central part of India in order to demonstrate its effectiveness. Oil refinery construction projects are risky because of technical complexity, resource unavailability, the involvement of many stakeholders and strict environmental requirements. Although project risk management has been researched extensively, a practical and easily adoptable framework is missing. In the proposed framework, risks are identified using a cause-and-effect diagram, analysed using the analytic hierarchy process, and responses are developed using the risk map. Additionally, decision tree analysis allows various options for risk response development to be modelled and optimises the selection of the risk-mitigating strategy. The proposed risk management framework could be easily adopted and applied in any project and integrated with other project management knowledge areas.
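The analytic hierarchy process step can be shown compactly: criterion weights are the principal eigenvector of a pairwise-comparison matrix, with a consistency check on the judgements. The comparison matrix below is invented, not taken from the refinery case study.

```python
# AHP priority weights and consistency ratio for three risk criteria.
import numpy as np

A = np.array([              # A[i][j]: how much more important i is than j
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalised priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
cr = ci / 0.58                           # Saaty's random index for n = 3
print("weights:", w.round(3), "| consistency ratio:", round(cr, 3))
```

A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgements are acceptably coherent.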
Abstract:
With its low-power operation and flexible networking capabilities, IEEE 802.15.4 has been widely regarded as a strong candidate communication technology for wireless sensor networks (WSNs). It is expected that, with an increasing number of deployments of 802.15.4-based WSNs, multiple WSNs could coexist with full or partial overlap in residential or enterprise areas. As WSNs are usually deployed without coordination, communication could suffer significant degradation under the 802.15.4 channel access scheme, with a large impact on system performance. This thesis investigates the effectiveness of 802.15.4 networks supporting WSN applications in various environments, especially when hidden terminals are present due to the uncoordinated coexistence problem. Both analytical models and system-level simulators are developed to analyse the performance of the random access scheme specified by the IEEE 802.15.4 medium access control (MAC) standard for several network scenarios. The first part of the thesis investigates the effectiveness of a single 802.15.4 network supporting WSN applications. A Markov chain based analytic model is applied to model the MAC behaviour of the IEEE 802.15.4 standard, and a discrete event simulator is also developed to analyse the performance and verify the proposed analytical model. It is observed that an 802.15.4 network can sufficiently support most WSN applications through its various functionalities. After the investigation of the single network, the uncoordinated coexistence problem of multiple 802.15.4 networks deployed with fully or partially overlapping communication ranges is investigated in the next part of the thesis. Both non-sleep and sleep modes are investigated under different channel conditions, by analytic and simulation methods, to obtain a comprehensive performance evaluation. It is found that the uncoordinated coexistence problem can significantly degrade the performance of 802.15.4 networks, which is then unlikely to satisfy the QoS requirements of many WSN applications. The proposed analytic model is validated by simulations and could be used to obtain optimal parameter settings before WSN deployment to eliminate interference risks.
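To give a feel for the contention behaviour the thesis models, here is a toy Monte Carlo sketch of slotted random backoff. It is a drastic simplification of the 802.15.4 CSMA/CA MAC (a single backoff stage, perfect carrier sensing, no hidden terminals) and is not the thesis's analytical model.

```python
# Toy contention model: each node draws a backoff slot; the earliest
# slot wins the channel, and a tie among earliest slots is a collision.
import random

def collision_probability(n_nodes, window=8, rounds=100_000):
    collisions = 0
    for _ in range(rounds):
        slots = [random.randrange(window) for _ in range(n_nodes)]
        first = min(slots)
        if slots.count(first) > 1:       # two or more transmit together
            collisions += 1
    return collisions / rounds

for n in (2, 5, 10, 20):
    print(f"{n:2d} nodes -> collision probability "
          f"{collision_probability(n):.3f}")
```

Even this crude model shows how quickly contention grows as coexisting, uncoordinated networks add transmitters, the effect the thesis quantifies properly with its Markov chain analysis.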
Abstract:
The accurate in silico identification of T-cell epitopes is a critical step in the development of peptide-based vaccines, reagents, and diagnostics, and has a direct impact on the success of subsequent experimental work. Epitopes arise as a consequence of complex proteolytic processing within the cell. Prior to being recognized by T cells, an epitope is presented on the cell surface as a complex with a major histocompatibility complex (MHC) protein. A prerequisite for T-cell recognition, therefore, is that an epitope is also a good MHC binder, so T-cell epitope prediction overlaps strongly with the prediction of MHC binding. In the present study, we compare discriminant analysis and multiple linear regression as algorithmic engines for the definition of quantitative matrices for binding affinity prediction. We apply these methods to peptides which bind the well-studied human MHC allele HLA-A*0201. A matrix combining the results of the two methods proved powerfully predictive under cross-validation. The new matrix was also tested on an external set of 160 binders to HLA-A*0201; it was able to recognize 135 (84%) of them.
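A quantitative matrix of this kind is an additive model: one-hot encode each peptide position, fit residue contributions to measured affinities, and score new peptides by summing the relevant entries. The sketch below uses invented 9-mer peptides and affinities, and plain least squares in place of the paper's discriminant-analysis/regression combination.

```python
# Position-specific scoring matrix fitted by least squares to
# (invented) 9-mer binding data for illustration.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(peptide):
    v = np.zeros(len(peptide) * 20)
    for pos, aa in enumerate(peptide):
        v[pos * 20 + AA.index(aa)] = 1.0
    return v

peptides = ["LLDVTAAVV", "GLLGNVSTV", "KLNEPVLLL", "ALAKAAAAA"]
affinity = np.array([7.2, 6.8, 6.1, 5.0])    # e.g. -log10(IC50), invented

X = np.vstack([one_hot(p) for p in peptides])
w, *_ = np.linalg.lstsq(X, affinity, rcond=None)
matrix = w.reshape(9, 20)                    # positions x residues

print("predicted:", round(float(one_hot("GLLGNVSTV") @ w), 2))
```

Scoring a peptide is then a single lookup-and-sum over the matrix, which keeps such models fast for large-scale screening.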
Abstract:
We present an experimental and numerical study of transversely loaded uniform fibre-Bragg gratings. A novel loading configuration is described, producing pressure-induced spectral holes in an initially strong uniform grating. The birefringence properties of these gratings are analysed. It is shown that the frequency splitting of the two spectral holes, corresponding to two orthogonal polarisation states, can be adjusted precisely using this loading configuration. We finally demonstrate a new and simple scheme to induce multiple spectral holes in the stop-band. © 2003 Elsevier Science B.V. All rights reserved.
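For context, the splitting described above follows from the standard Bragg relations; these are textbook results, not equations quoted from the paper:

```latex
% Bragg resonance of a uniform grating with period \Lambda and
% effective index n_{eff}:
\[ \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda \]
% A load-induced birefringence \Delta n between the two orthogonal
% polarisation states splits the resonance by approximately
\[ \Delta\lambda \approx 2\, \Delta n\, \Lambda \]
```

Adjusting the transverse load therefore tunes the birefringence, and with it the frequency separation of the two polarisation-resolved spectral holes.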
Abstract:
In this concluding chapter, we bring together the threads and reflections on the chapters contained in this text and show how they relate to multi-level issues. The book has focused on the world of Human Resource Management (HRM) and the systems and practices it must put in place to foster innovation. Many of the contributions argue that in order to bring innovation about, organisations have to think carefully about the way in which they will integrate what is, in practice, organisationally relevant — but socially distributed — knowledge. They need to build a series of knowledge-intensive activities and networks, both within their own boundaries and across other important external inter-relationships. In so doing, they help to co-ordinate important information structures. They have, in effect, to find ways of enabling people to collaborate with each other at lower cost, by reducing both the costs of their co-ordination and the levels of unproductive search activity. They have to engineer these behaviours by reducing the risks for people that might be associated with incorrect ideas and help individuals, teams and business units to advance incomplete ideas that are so often difficult to codify. In short, a range of intangible assets must flow more rapidly throughout the organisation and an appropriate balance must be found between the rewards and incentives associated with creativity, novelty and innovation, versus the risks that innovation may also bring.