31 results for multiple discriminant analysis


Relevance: 80.00%

Abstract:

Purpose: To investigate the correlation between tests of visual function and perceived visual ability recorded with a 'quality-of-life' questionnaire for patients with central field loss. Method: 12 females and 7 males (mean age = 53.1 years; range = 23 - 80 years) with subfoveal neovascular membranes underwent a comprehensive assessment of visual function. Tests included unaided distance vision, high and low contrast distance logMAR visual acuity (VA), Pelli-Robson contrast sensitivity (at 1 m), near logMAR word VA and text reading speed. All tests were done both monocularly and binocularly. The patients also completed a 28-point questionnaire separated into a 'core' section consisting of general questions about perceived visual function and a 'module' section with specific questions on reading function. Results: Step-wise multiple regression analysis was used to determine which visual function tests were correlated with the patients' perceived visual function and to rank them in order of importance. The visual function test that explains most of the variance in both the 'core' score (66%) and the 'module' score (68%) of the questionnaire is low contrast VA in the better eye (P<0.001 in both cases). Further, a significant proportion of the variance (P<0.01) in the module score is also accounted for by distance logMAR VA in both the better and worse eye, and by near logMAR VA in the better eye and binocularly. Conclusions: The best predictor of both perceived reading ability and of general perceived visual ability in this study is low contrast logMAR VA. The results highlight that distance VA is not the only relevant measure of visual function in relation to a patient's perceived visual performance and should not be considered the sole determinant of surgical or management success.
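The step-wise selection used in such analyses can be sketched in a few lines: at each step, the candidate predictor explaining the most variance in the outcome enters the model first. The data, variable names and coefficients below are invented for illustration; this is not the study's dataset.

```python
import numpy as np

# Hypothetical sketch of one step of step-wise multiple regression:
# pick the predictor with the highest R^2 against the outcome.
rng = np.random.default_rng(0)
n = 19                                         # sample size as in the study
low_contrast_va = rng.normal(0.6, 0.2, n)      # invented test scores
distance_va = rng.normal(0.3, 0.15, n)
core_score = 2.0 * low_contrast_va + rng.normal(0, 0.1, n)  # synthetic outcome

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return 1 - (y - Xd @ beta).var() / y.var()

predictors = {"low_contrast_va": low_contrast_va, "distance_va": distance_va}
best = max(predictors, key=lambda k: r_squared(predictors[k][:, None], core_score))
print(best)  # the predictor entering the model first
```

Subsequent steps would repeat the search over the remaining predictors, adding variables only while they significantly increase R².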

Relevance: 80.00%

Abstract:

Purpose: Optometrists are becoming more integrally involved in the diagnosis and care of glaucoma patients in the UK. Correlating apparent change in non-contact tonometry (NCT) IOP measurements with change in other ocular parameters, such as refractive error, corneal curvature, corneal thickness and treatment zone size (data available to optometrists after LASIK), would facilitate care of these patients. Setting: A UK laser eye clinic. Methods: This is a retrospective study of 200 sequential eyes with myopia, with or without astigmatism, which underwent LASIK using a Hansatome and an Alcon LADARvision 4000 excimer laser. Refraction, keratometry, pachymetry and NCT IOP measurements were taken before treatment and again 3 months after treatment. The relationships between these variables and the treatment zones were studied using stepwise multiple regression analysis. Results: There was a mean difference of 5.54 mmHg between pre- and postoperative NCT IOP. IOP change correlates with refractive error change (P < 0.001), preoperative corneal thickness (P < 0.001) and treatment zone size (P = 0.047). Preoperative corneal thickness correlates with preoperative IOP (P < 0.001) and postoperative IOP (P < 0.001). Using these correlations, the measured difference in NCT IOP can be predicted preoperatively or postoperatively using derived equations. Conclusion: There is a significant reduction in measured NCT IOP after LASIK. The amount of reduction can be calculated using data acquired by optometrists. This is helpful for ophthalmologists and optometrists who co-manage glaucoma patients who have had LASIK, or glaucoma patients who are considering having LASIK.

Relevance: 80.00%

Abstract:

This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated and a powerful visualisation system using it was implemented. Although allowing databases to be explored and conclusions drawn, it had several drawbacks, the majority of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, firstly data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries which would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. Using MADEN allowed both universities and disciplines to be graphically compared, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
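The 'density plot' matrix at the heart of a system like MADEN can be illustrated with a small NumPy computation: every pair of fields is binned into a 2-D grid of counts, and the grids form the matrix of panels. The data and field count below are invented; MADEN itself renders each grid graphically.

```python
import numpy as np

# Compute the grid of 2-D histograms ("density plots") for all field pairs
# of a hypothetical 3-field, 1000-record table.
rng = np.random.default_rng(3)
records = rng.normal(size=(1000, 3))
records[:, 1] += records[:, 0]        # make fields 0 and 1 dependent

def density_grid(x, y, bins=8):
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    return counts

grids = {(i, j): density_grid(records[:, i], records[:, j])
         for i in range(3) for j in range(i + 1, 3)}
print(sorted(grids))  # one panel per field pair
```

A dependency such as the one between fields 0 and 1 shows up as counts concentrated along a diagonal band in that panel.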

Relevance: 80.00%

Abstract:

The thesis began as a study of new firm formation. Preliminary research suggested that infant death rate was considered to be a closely related problem, and the search was for a theory of new firm formation which would explain both. The thesis finds theories of exit and entry inadequate in this respect and focuses instead on theories of entrepreneurship, particularly those which concentrate on entrepreneurship as an agent of change. The role of information is found to be fundamental to economic change, and an understanding of information generation and dissemination, and of the nature and direction of information flows, is postulated to lead coterminously to an understanding of entrepreneurship and economic change. The economics of information is applied to theories of entrepreneurship and some testable hypotheses are derived. The testing relies on establishing and measuring the information bases of the founders of new firms and then testing for certain hypothesised differences between the information bases of survivors and non-survivors. No theory of entrepreneurship is likely to be straightforwardly testable and many postulates have to be established to bring the theory to a testable stage. A questionnaire is used to gather information from a sample of firms taken from a new micro-data set established as part of the work of the thesis. Discriminant analysis establishes the variables which best distinguish between survivors and non-survivors. The variables which emerge as important discriminators are consistent with the theory which the analysis is testing. While there are alternative interpretations of the important variables, collective consistency with the theory under test is established. The thesis concludes with an examination of the implications of the theory for policy towards stimulating new firm formation.
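The discriminant step can be sketched with Fisher's two-group linear discriminant, which finds the direction that best separates two labelled groups. The samples and group means below are invented "information base" scores, not the thesis's survey data.

```python
import numpy as np

# Fisher's two-group linear discriminant: a sketch of separating survivors
# from non-survivors on two hypothetical "information base" scores.
rng = np.random.default_rng(1)
survivors = rng.normal([2.0, 1.5], 0.5, (50, 2))   # invented group samples
failures = rng.normal([0.5, 0.5], 0.5, (50, 2))

# Discriminant direction w = Sw^{-1} (m1 - m2), pooled within-group covariance.
m1, m2 = survivors.mean(0), failures.mean(0)
Sw = np.cov(survivors.T) + np.cov(failures.T)
w = np.linalg.solve(Sw, m1 - m2)
threshold = w @ (m1 + m2) / 2                      # midpoint decision rule

scores = np.concatenate([survivors, failures]) @ w
predicted = scores > threshold
truth = np.r_[np.ones(50, bool), np.zeros(50, bool)]
accuracy = (predicted == truth).mean()
print(f"classification accuracy: {accuracy:.2f}")
```

In practice the discriminant weights, like the thesis's important discriminators, would then be inspected for consistency with the theory under test.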

Relevance: 80.00%

Abstract:

Since the Second World War a range of policies has been implemented by central and local government agencies, with a view to improving accessibility to facilities, housing and employment opportunities within rural areas. It has been suggested that a lack of reasonable access to a range of such facilities and opportunities constitutes a key aspect of deprivation or disadvantage for rural residents. Despite considerable interest, very few attempts have been made to assess the nature and incidence of this disadvantage, or the reaction of different sections of the population of rural areas to it. Moreover, almost all previous assessments have relied on so-called 'objective' measures of accessibility and disadvantage and failed to consider the relationship between such measures and 'subjective' measures such as individual perceptions. It is this gap in knowledge that the research described in this thesis has addressed. Following a critical review of relevant literature, the thesis describes the way in which data on 'objective' and 'subjective' indicators of accessibility, and on behavioural responses to accessibility problems, were collected in six case study areas in Shropshire. Analysis of these data indicates that planning and other government policies have failed to significantly improve rural residents' accessibility to their basic requirements, and may in some cases have exacerbated the problem, and that as a result certain sections of the rural population are relatively disadvantaged. Moreover, analysis shows that certain aspects of individuals' 'subjective' assessments of such accessibility disadvantage are significantly associated with more easily obtained 'objective' measures. By using discriminant analysis the research demonstrates that it is possible to predict the likely levels of satisfaction with access to facilities from a range of 'objective' measures.
The research concludes by highlighting the potential practical applications of such indicators in policy formulation, policy appraisal and policy evaluation.

Relevance: 80.00%

Abstract:

The research compares the usefulness of four remote sensing information sources: LANDSAT photographic prints, LANDSAT computer compatible tapes, Metric Camera and SIR-A photographic prints. These sources provide evaluations of the catchment characteristics of the Belize and Sibun river basins in Central America. Map evaluations at 1:250,000 scale are compared to the results from the remotely sensed information sources at the same scale. The values of catchment characteristics from both maps and LANDSAT prints are used in multiple regression analysis to provide flood flow formulae, after investigations are made to derive a suitable dependent-variable discharge series from the short-term records. The use of all the remotely sensed information sources in providing evaluations of catchment characteristics is discussed. LANDSAT prints and computer compatible tapes of a post-flood scene are used to estimate flood distributions and volumes. These are compared to values obtained from unit hydrograph analysis, using the dependent discharge series, and are used to evaluate the probable losses from the Belize river to the floodplain, thereby assessing the accuracy of the LANDSAT estimates. Information relating to flood behaviour is discussed in terms of basic image presentation as well as image processing. A cost analysis of the purchase and use of all materials is provided. The conclusions of the research indicate that LANDSAT print material may provide information suitable for regression analysis at levels of accuracy as great as those of topographic maps, that the differing information sources are uniquely applicable, and that accurate estimates of flood volumes may be determined even from post-flood imagery.

Relevance: 80.00%

Abstract:

This exploratory study is concerned with the integrated appraisal of multi-storey dwelling blocks which incorporate large concrete panel systems (LPS). The first step was to look at UK multi-storey dwelling stock in general, and the stock under the management of Birmingham City Council in particular. The information was taken from the databases of three departments in the City of Birmingham, and rearranged in a new database using a suite of PC software called `PROXIMA' for clarity and analysis. One hundred of the blocks in this stock were built using large concrete panel systems. Thirteen LPS blocks were chosen as case studies for this research, selected mainly on the height and age of the block. A new integrated appraisal technique has been created for the LPS dwelling blocks, which takes into account the main physical and social factors affecting the condition and acceptability of these blocks. This appraisal technique is built up in a hierarchical form moving from the general approach to particular elements (a tree model). It comprises two main approaches: physical and social. In the physical approach, the building is viewed as a series of manageable elements and sub-elements covering every physical or environmental factor of the block, from which the condition of the block is analysed. A quality score system has been developed which depends mainly on the qualitative and quantitative condition of each category in the appraisal tree model, and leads to a physical ranking order of the study blocks. In the social appraisal approach, the residents' satisfaction and attitude towards their multi-storey dwelling block was analysed in relation to: a. biographical and housing-related characteristics; and b. social, physical and environmental factors associated with this sort of dwelling, block and estate in general. The random sample consisted of 268 residents living in the 13 case-study blocks. The data collected were analysed using frequency counts, percentages, means, standard deviations, Kendall's tau, r-correlation coefficients, t-tests, analysis of variance (ANOVA) and multiple regression analysis. The analysis showed a marginally positive satisfaction and attitude towards living in the block. The five most significant factors associated with the residents' satisfaction and attitude, in descending order, were: the estate in general; the service categories in the block, including the heating system and lift services; vandalism; the neighbours; and the security system of the block. An important attribute of this method is that it is relatively inexpensive to implement, especially when compared to alternatives adopted by some local authorities and the BRE. It is designed to save time, money and effort, to aid decision making, and to provide a ranked priority order for the multi-storey dwelling stock, in addition to many other advantages. A series of solution options to the problems of the blocks was sought for selection and testing before implementation. Traditional solutions have usually resulted in either demolition or costly physical maintenance and social improvement of the blocks. However, a new solution has now emerged which is particularly suited to structurally sound units. This `re-cycling' solution might incorporate the reuse of an entire block or part of it, by removing panels, slabs and so forth from the upper floors in order to reconstruct them as low-rise accommodation.
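Kendall's tau, one of the statistics listed, can be computed directly from its definition: the normalised difference between concordant and discordant pairs of observations. A small illustration with invented ratings:

```python
from itertools import combinations

# Kendall's tau by direct pair counting; the ratings are hypothetical.
def kendall_tau(x, y):
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1       # pair ordered the same way in x and y
        elif s < 0:
            discordant += 1       # pair ordered oppositely
    return (concordant - discordant) / (len(x) * (len(x) - 1) / 2)

satisfaction = [1, 2, 3, 4, 5]
service_rating = [1, 3, 2, 4, 5]   # one swapped pair
print(kendall_tau(satisfaction, service_rating))  # → 0.8
```

Identically ordered series give tau = 1, reversed series give tau = -1, so the statistic reads as a rank correlation.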

Relevance: 80.00%

Abstract:

Decomposition of domestic wastes in an anaerobic environment results in the production of landfill gas. Public concern about landfill disposal, and particularly the production of landfill gas, has been heightened over the past decade. This has been due in large part to the increased quantities of gas being generated as a result of modern disposal techniques, and also to their increasing effect on modern urban developments. In order to avert disasters, effective means of preventing gas migration are required. This, in turn, requires accurate detection and monitoring of gas in the subsurface. Point sampling techniques have many drawbacks, and accurate measurement of gas is difficult. Some of the disadvantages of these techniques could be overcome by assessing the impact of gas on biological systems. This research explores the effects of landfill gas on plants, and hence on the spectral response of vegetation canopies. Examination of the landfill gas/vegetation relationship is covered, both by review of the literature and by statistical analysis of field data. The work showed that, although vegetation health was related to landfill gas, it was not possible to define a simple correlation. In the landfill environment, contributions from other variables, such as soil characteristics, frequently confused the relationship. Two sites are investigated in detail, contrasting in terms of the data available, site conditions, and the degree of damage to vegetation. Gas migration at the Panshanger site was dominantly upwards, affecting crops being grown on the landfill cap. The injury was expressed as an overall decline in plant health. Discriminant analysis was used to account for the variations in plant health, and hence the differences in spectral response of the crop canopy, using a combination of soil and gas variables. Damage to both woodland and crops at the Ware site was severe, and could be easily related to the presence of gas.
Air photographs, aerial video, and airborne thematic mapper data were used to identify damage to vegetation, and relate this to soil type. The utility of different sensors for this type of application is assessed, and possible improvements that could lead to more widespread use are identified. The situations in which remote sensing data could be combined with ground survey are identified. In addition, a possible methodology for integrating the two approaches is suggested.

Relevance: 80.00%

Abstract:

Guest editorial

Ali Emrouznejad is a Senior Lecturer at the Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, as well as data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Management Sector. He is on the editorial board of several international journals and a co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at the Aston University Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in various international journals, including Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied research to bridge the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation of 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates, and to provide bias-corrected estimations and confidence intervals for the point estimates. The author finds that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of the oil refineries in the sample could improve their efficiencies further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA) methods, to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality, modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as the outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple criteria analysis weighting approach to evaluate energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, which consists of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated into the evaluation process and provided with an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic impact, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds that there were no significant changes in a firm's efficiency following an acquisition, and only weak evidence of efficiency improvements brought by the new shareholder. The author also discovers that parent companies appear not to influence a subsidiary's efficiency positively. In addition, the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
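Each DEA efficiency score discussed in these papers is the solution of a small linear programme. Below is a minimal, hypothetical sketch of the input-oriented CCR envelopment model for toy data (three units, one input, one output), solved with SciPy's linprog; it is illustrative only, not any paper's actual model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one input (e.g. fuel) and one output (e.g. power) per unit.
X = np.array([[2.0], [4.0], [3.0]])   # inputs, one row per unit
Y = np.array([[1.0], [2.0], [1.0]])   # outputs, one row per unit

def ccr_efficiency(k):
    """Input-oriented CCR score of unit k: min theta s.t. the composite
    unit (lambda-weighted peers) uses <= theta * x_k and produces >= y_k."""
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                       # minimise theta
    A_in = np.c_[-X[k][:, None], X.T]                 # X^T lam - theta x_k <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]    # -Y^T lam <= -y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

print([round(ccr_efficiency(k), 2) for k in range(3)])  # → [1.0, 1.0, 0.67]
```

Units 0 and 1 lie on the frontier (score 1), while unit 2 could radially contract its input to two-thirds and remain feasible.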

Relevance: 80.00%

Abstract:

The ageing process is strongly influenced by nutrient balance, such that modest calorie restriction (CR) extends lifespan in mammals. Irisin, a newly described hormone released from skeletal muscles after exercise, may induce CR-like effects by increasing adipose tissue energy expenditure. Using telomere length as a marker of ageing, this study investigates associations between body composition, plasma irisin levels and peripheral blood mononuclear cell telomere length in healthy, non-obese individuals. Segmental body composition (by bioimpedance), telomere length and plasma irisin levels were assessed in 81 healthy individuals (age 43 ± 15.8 years, BMI 24.3 ± 2.9 kg/m2). Data showed significant correlations between log-transformed relative telomere length and the following: age (p < 0.001), height (p = 0.045), total body fat percentage (p = 0.031), abdominal fat percentage (p = 0.038), visceral fat level (p < 0.001), plasma leptin (p = 0.029) and plasma irisin (p = 0.011), respectively. Multiple regression analysis using backward elimination revealed that relative telomere length can be predicted by age (b = -0.00735, p = 0.001) and plasma irisin levels (b = 0.04527, p = 0.021). These data support the view that irisin may have a role in the modulation of both energy balance and the ageing process. © 2014 The Author(s).
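Backward elimination, the variable-selection strategy used here, repeatedly refits the regression and drops the predictor with the largest p-value until all remaining p-values fall below 0.05. The sketch below uses synthetic data whose variable names merely echo the study.

```python
import numpy as np
from scipy import stats

# Synthetic data: telomere length depends on age and irisin; height does not.
rng = np.random.default_rng(2)
n = 81
age = rng.uniform(20, 70, n)
irisin = rng.normal(400, 100, n)
height = rng.normal(170, 10, n)                # irrelevant by construction
telomere = -0.007 * age + 0.005 * irisin + rng.normal(0, 0.2, n)

def ols_pvalues(X, y):
    """Two-sided t-test p-values for each slope in an OLS fit."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    dof = len(y) - Xd.shape[1]
    sigma2 = resid @ resid / dof
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xd.T @ Xd)))
    return 2 * stats.t.sf(np.abs(beta / se), dof)[1:]   # skip the intercept

names = ["age", "irisin", "height"]
cols = [age, irisin, height]
while True:
    p = ols_pvalues(np.column_stack(cols), telomere)
    if p.max() < 0.05:
        break
    worst = int(np.argmax(p))                 # drop the least significant term
    names.pop(worst); cols.pop(worst)
print(names)
```

With this construction the genuine predictors survive elimination, mirroring how age and irisin were retained in the study.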

Relevance: 80.00%

Abstract:

With business incubators deemed as a potent infrastructural element for entrepreneurship development, business incubation management practice and performance have received widespread attention. However, despite this surge of interest, scholars have questioned the extent to which business incubation delivers added value. Thus, there is a growing awareness among researchers, practitioners and policy makers of the need for more rigorous evaluation of the business incubation output performance. Aligned to this is an increasing demand for benchmarking business incubation input/process performance and highlighting best practice. This paper offers a business incubation assessment framework, which considers input/process and output performance domains with relevant indicators. This tool adds value on different levels. It has been developed in collaboration with practitioners and industry experts and therefore it would be relevant and useful to business incubation managers. Once a large enough database of completed questionnaires has been populated on an online platform managed by a coordinating mechanism, such as a business incubation membership association, business incubator managers can reflect on their practices by using this assessment framework to learn their relative position vis-à-vis their peers against each domain. This will enable them to align with best practice in this field. Beyond implications for business incubation management practice, this performance assessment framework would also be useful to researchers and policy makers concerned with business incubation management practice and impact. Future large-scale research could test for construct validity and reliability. Also, discriminant analysis could help link input and process indicators with output measures.

Relevance: 80.00%

Abstract:

OBJECTIVE: To investigate laboratory evidence of abnormal angiogenesis, hemorheologic factors, endothelial damage/dysfunction, and age-related macular degeneration (ARMD). DESIGN: Comparative cross-sectional study. PARTICIPANTS: We studied 78 subjects (26 men and 52 women; mean age, 74 years; standard deviation [SD], 9.0) with ARMD attending a specialist referral clinic. Subjects were compared with 25 healthy controls (mean age, 71 years; SD, 11). INTERVENTION AND OUTCOME MEASURES: Levels of vascular endothelial growth factor (VEGF, an index of angiogenesis), hemorheologic factors (plasma viscosity, hematocrit, white cell count, hemoglobin, platelets), fibrinogen (an index of rheology and hemostasis), and von Willebrand factor (a marker of endothelial dysfunction) were measured. RESULTS: Median plasma VEGF (225 vs. 195 pg/ml, P = 0.019) and mean von Willebrand factor (124 vs. 99 IU/dl, P = 0.0004) were greater in ARMD subjects than in the controls. Mean plasma fibrinogen and plasma viscosity levels were also higher in the subjects (both P < 0.0001). There were no significant differences in other indices between cases and controls. When "dry" (drusen, atrophy, n = 28) and "exudative" (n = 50) ARMD subjects were compared, there were no significant differences in VEGF, fibrinogen, viscosity, or von Willebrand factor levels. There were no significant correlations between the measured parameters. Stepwise multiple regression analysis did not demonstrate any significant clinical predictors (age, gender, smoking, body mass index, history of vascular disease, or hypertension) for plasma VEGF or fibrinogen levels, although smoking status was a predictor of plasma von Willebrand factor levels (P < 0.05). CONCLUSIONS: This study suggests an association between markers of angiogenesis (VEGF), hemorheologic factors, hemostasis, endothelial dysfunction, and ARMD.
The interaction between abnormal angiogenesis and the components of Virchow's triad for thrombogenesis may in part contribute to the pathogenesis of ARMD.

Relevance: 80.00%

Abstract:

Background: Allergy is a form of hypersensitivity to normally innocuous substances, such as dust, pollen, foods or drugs. Allergens are small antigens that commonly provoke an IgE antibody response. There are two types of bioinformatics-based allergen prediction. The first approach follows FAO/WHO Codex Alimentarius guidelines and searches for sequence similarity. The second approach is based on identifying conserved allergenicity-related linear motifs. Both approaches assume that allergenicity is a linearly coded property. In the present study, we applied ACC pre-processing to sets of known allergens, developing alignment-independent models for allergen recognition based on the main chemical properties of amino acid sequences. Results: A set of 684 food, 1,156 inhalant and 555 toxin allergens was collected from several databases. A set of non-allergens from the same species was selected to mirror the allergen set. The amino acids in the protein sequences were described by three z-descriptors (z1, z2 and z3) and converted into uniform vectors by auto- and cross-covariance (ACC) transformation. Each protein was presented as a vector of 45 variables. Five machine learning methods for classification were applied in the study to derive models for allergen prediction: discriminant analysis by partial least squares (DA-PLS), logistic regression (LR), decision tree (DT), naïve Bayes (NB) and k nearest neighbours (kNN). The best performing model was derived by kNN at k = 3. It was optimized, cross-validated and implemented in a server named AllerTOP, freely accessible at http://www.pharmfac.net/allertop. AllerTOP also predicts the most probable route of exposure, and outperforms other servers for allergen prediction with 94% sensitivity. Conclusions: AllerTOP is the first alignment-free server for in silico prediction of allergens based on the main physicochemical properties of proteins. Significantly, as well as allergenicity, AllerTOP is able to predict the route of allergen exposure: food, inhalant or toxin. © 2013 Dimitrov et al.; licensee BioMed Central Ltd.
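The ACC transformation can be sketched directly: map each residue to its three z-descriptors, then average the products of descriptor j at position i with descriptor k at position i+lag, for lags 1 to 5, yielding 3 × 3 × 5 = 45 variables for any sequence length. The z-values below are invented for a toy four-letter alphabet; the published z1-z3 amino acid scales would be used in practice. The kNN classifier then operates on these fixed-length vectors.

```python
import numpy as np

# Invented z-descriptor table for a toy 4-letter alphabet (NOT the real
# z-scales); real ACC pre-processing uses published z1-z3 values per residue.
Z = {"A": (0.1, -0.5, 0.3), "C": (-0.8, 0.2, 0.9),
     "D": (1.2, 0.4, -0.7), "E": (0.5, -1.1, 0.2)}

def acc_vector(seq, max_lag=5):
    """Auto-/cross-covariance features: 3 x 3 descriptor pairs x max_lag lags."""
    z = np.array([Z[a] for a in seq])          # shape (len(seq), 3)
    feats = []
    for lag in range(1, max_lag + 1):
        # covariance of descriptor j at position i with descriptor k at i+lag
        prod = z[:-lag].T @ z[lag:] / (len(seq) - lag)
        feats.extend(prod.ravel())
    return np.array(feats)

v = acc_vector("ACDEACDEACDE")
print(v.shape)  # → (45,)
```

Because every protein maps to the same 45-dimensional space, sequences of different lengths become directly comparable without alignment.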

Relevance: 80.00%

Abstract:

Mainstream gentrification research predominantly examines experiences and motivations of the middle-class gentrifier groups, while overlooking experiences of non-gentrifying groups including the impact of in situ local processes on gentrification itself. In this paper, I discuss gentrification, neighbourhood belonging and spatial distribution of class in Istanbul by examining patterns of belonging both of gentrifiers and non-gentrifying groups in historic neighbourhoods of the Golden Horn/Halic. I use multiple correspondence analysis (MCA), a methodology rarely used in gentrification research, to explore social and symbolic borders between these two groups. I show how gentrification leads to spatial clustering by creating exclusionary practices and eroding social cohesion, and illuminate divisions that are inscribed into the physical space of the neighbourhood.
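At its core, MCA is correspondence analysis applied to the one-hot (indicator) coding of categorical survey answers: standardise the residuals of the indicator matrix and take its SVD. The responses below are invented, not the Golden Horn/Halic survey data.

```python
import numpy as np

# Bare-bones multiple correspondence analysis on invented survey answers.
answers = np.array([          # rows: respondents; cols: two questions
    ["owner",  "long"],
    ["owner",  "long"],
    ["renter", "short"],
    ["renter", "short"],
    ["owner",  "short"],
])

# Build the indicator (disjunctive) matrix: one column per category.
cats = [sorted(set(answers[:, j])) for j in range(answers.shape[1])]
Z = np.column_stack([(answers[:, j:j+1] == np.array(c)).astype(float)
                     for j, c in enumerate(cats)])

P = Z / Z.sum()
r, c = P.sum(1), P.sum(0)
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardised residuals
U, s, Vt = np.linalg.svd(S, full_matrices=False)
row_coords = (U * s) / np.sqrt(r)[:, None]           # principal coordinates
print(np.round(row_coords[:, 0], 2))                 # first-axis positions
```

The first axis places the identical "owner, long" respondents opposite the "renter, short" respondents, with the mixed respondent between them: exactly the kind of social/symbolic border the MCA in the paper makes visible.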

Relevance: 80.00%

Abstract:

Aims: Obesity and Type 2 diabetes are associated with accelerated ageing. The underlying mechanisms behind this, however, are poorly understood. In this study, we investigated the association between circulating irisin, a novel myokine involved in energy regulation, and telomere length (TL), a marker of ageing, in healthy individuals and individuals with Type 2 diabetes. Methods: Eighty-two healthy people and 67 subjects with Type 2 diabetes were recruited to this cross-sectional study. Anthropometric measurements, including body composition measured by bioimpedance, were recorded. Plasma irisin was measured by ELISA on a fasted blood sample. Relative TL was determined using real-time PCR. Associations between anthropometric measures, irisin and TL were explored using Pearson's bivariate correlations. Multiple regression with backward elimination was used to explore all the significant predictors of TL. Results: In healthy individuals chronological age was a strong negative predictor of TL (b = -0.552, p < 0.001). Multiple regression analysis using backward elimination (excluding age) revealed that greater relative TL could be predicted by greater total muscle mass (b = 0.046, p = 0.001), less visceral fat (b = -0.183, p < 0.001) and higher plasma irisin levels (b = 0.01, p = 0.027). There were no significant associations between chronological age, plasma irisin, anthropometric measures and TL in patients with Type 2 diabetes (p > 0.1). Conclusion: These data support the view that body composition and plasma irisin may have a role in the modulation of energy balance and the ageing process in healthy individuals. This relationship is altered in individuals with Type 2 diabetes.