963 results for Link variable method


Relevance:

30.00%

Publisher:

Abstract:

River runoff is an essential climate variable as it is directly linked to the terrestrial water balance and controls a wide range of climatological and ecological processes. Despite its scientific and societal importance, there are to date no pan-European observation-based runoff estimates available. Here we employ a recently developed methodology to estimate monthly runoff rates on a regular spatial grid in Europe. For this we first assemble an unprecedented collection of river flow observations, combining information from three distinct databases. Observed monthly runoff rates are first tested for homogeneity and then related to gridded atmospheric variables (E-OBS version 11) using machine learning. The resulting statistical model is then used to estimate monthly runoff rates (December 1950-December 2014) on a 0.5° × 0.5° grid. The performance of the newly derived runoff estimates is assessed by cross-validation. The paper closes with example applications, illustrating the potential of the new runoff estimates for climatological assessments and drought monitoring.
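As a rough illustration of the kind of statistical modelling this abstract describes, the sketch below relates synthetic gridded atmospheric predictors to monthly runoff with a regression model and scores it by cross-validation; the predictor choices, model type and data are assumptions made for this example, not the authors' actual setup.

```python
# Illustrative sketch only: a grid-cell regression of monthly runoff on
# atmospheric predictors, evaluated by cross-validation. Predictor names
# and the synthetic data are assumptions, not the paper's configuration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_months = 600                        # ~50 years of monthly values for one grid cell
X = np.column_stack([
    rng.normal(5.0, 8.0, n_months),   # temperature (assumed predictor)
    rng.gamma(2.0, 30.0, n_months),   # monthly precipitation (assumed predictor)
])
runoff = 0.4 * X[:, 1] - 1.5 * X[:, 0] + rng.normal(0, 5, n_months)  # toy target

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, runoff, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```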

Relevance:

30.00%

Publisher:

Abstract:

Body size is a key determinant of metabolic rate, but logistical constraints have led to a paucity of energetics measurements from large water-breathing animals. As a result, estimating energy requirements of large fish generally relies on extrapolation of metabolic rate from individuals of lower body mass using allometric relationships that are notoriously variable. Swim-tunnel respirometry is the ‘gold standard’ for measuring active metabolic rates in water-breathing animals, yet previous data are entirely derived from body masses <10 kg – at least one order of magnitude lower than the body masses of many top-order marine predators. Here, we describe the design and testing of a new method for measuring metabolic rates of large water-breathing animals: a c. 26 000 L seagoing ‘mega-flume’ swim-tunnel respirometer. We measured the swimming metabolic rate of a 2·1-m, 36-kg zebra shark Stegostoma fasciatum within this new mega-flume and compared the results to data we collected from other S. fasciatum (3·8–47·7 kg body mass) swimming in static respirometers and to previously published active metabolic rate measurements from other shark species. The mega-flume performed well during initial tests, with intra- and interspecific comparisons suggesting accurate metabolic rate measurements can be obtained with this new tool. Inclusion of our data showed that the scaling exponent of active metabolic rate with mass for sharks ranging from 0·13 to 47·7 kg was 0·79, a value similar to previous estimates for resting metabolic rates in smaller fishes. We describe the operation and usefulness of this new method in the context of our current uncertainties surrounding energy requirements of large water-breathing animals. We also highlight the sensitivity of mass-extrapolated energetic estimates in large aquatic animals and discuss the consequences for predicting ecosystem impacts such as trophic cascades.
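A brief numerical illustration of why mass-extrapolated energy estimates are so sensitive to the allometric exponent; only the exponent 0.79 is taken from the abstract, the alternative exponent and the reference rate are invented for contrast.

```python
# Toy illustration of allometric extrapolation MR(M) = MR_ref * (M / M_ref)**b.
# The reference rate below is invented; 0.79 is the exponent reported in the
# abstract and 0.86 is an arbitrary alternative used only for comparison.
def extrapolate_metabolic_rate(mr_ref, m_ref, m_target, b):
    """Scale a measured metabolic rate from mass m_ref to mass m_target."""
    return mr_ref * (m_target / m_ref) ** b

mr_small = 100.0   # hypothetical metabolic rate of a 1 kg fish (arbitrary units)
for b in (0.79, 0.86):
    mr_big = extrapolate_metabolic_rate(mr_small, 1.0, 500.0, b)
    print(f"b = {b}: predicted rate at 500 kg = {mr_big:.0f}")
# A difference of ~0.07 in the exponent changes the 500 kg prediction by ~50%.
```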

Relevance:

30.00%

Publisher:

Abstract:

A search query, being a very concise grounding of user intent, can have many possible interpretations. Search engines hedge their bets by diversifying top results to cover multiple such possibilities so that the user is likely to be satisfied, whatever her intended interpretation may be. Diversified Query Expansion is the problem of diversifying query expansion suggestions so that the user can specialize the query to better suit her intent, even before perusing search results. We propose a method, Select-Link-Rank (SLR), that exploits semantic information from Wikipedia to generate diversified query expansions. SLR performs collective processing of terms and Wikipedia entities in an integrated framework, simultaneously diversifying query expansions and entity recommendations. SLR starts by selecting informative terms from search results of the initial query, links them to Wikipedia entities, performs a diversity-conscious entity scoring and transfers that scoring to the term space to arrive at query expansion suggestions. Through an extensive empirical analysis and user study, we show that our method outperforms state-of-the-art diversified query expansion and diversified entity recommendation techniques.
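A self-contained toy in the spirit of a select-link-rank style pipeline: select terms from result snippets, "link" them via a toy term-to-entity map, score entities greedily for relevance minus redundancy, and transfer scores back to terms. Every data structure, helper and scoring rule here is an assumption made for illustration; it is not the paper's actual algorithm.

```python
# Hypothetical select -> link -> rank -> transfer outline for diversified
# query expansion. The toy entity map and similarity table stand in for the
# Wikipedia linking and entity relatedness used in the real method.
from collections import Counter

TOY_ENTITY_OF = {"jaguar": "Jaguar_(animal)", "car": "Jaguar_Cars",
                 "speed": "Jaguar_Cars", "habitat": "Jaguar_(animal)"}
TOY_ENTITY_SIM = {("Jaguar_(animal)", "Jaguar_Cars"): 0.1}

def entity_similarity(a, b):
    if a == b:
        return 1.0
    return TOY_ENTITY_SIM.get((a, b), TOY_ENTITY_SIM.get((b, a), 0.0))

def diversified_expansions(query, snippets, k=4):
    # 1. Select: frequent snippet terms not already in the query.
    counts = Counter(w for s in snippets for w in s.lower().split()
                     if w not in query.lower().split())
    terms = [t for t, _ in counts.most_common(10)]
    # 2. Link: map terms to (toy) Wikipedia entities.
    term_entity = {t: TOY_ENTITY_OF[t] for t in terms if t in TOY_ENTITY_OF}
    # 3. Rank entities greedily: term-frequency relevance minus redundancy.
    relevance = Counter(term_entity.values())
    chosen = []
    while len(chosen) < k and len(chosen) < len(set(term_entity.values())):
        best = max((e for e in set(term_entity.values()) if e not in chosen),
                   key=lambda e: relevance[e]
                   - max((entity_similarity(e, c) for c in chosen), default=0.0))
        chosen.append(best)
    # 4. Transfer: order terms by the rank of their linked entity.
    return sorted((t for t in term_entity if term_entity[t] in chosen),
                  key=lambda t: chosen.index(term_entity[t]))

snippets = ["the jaguar is a big cat", "jaguar habitat and range",
            "the new jaguar car has top speed of 200"]
print(diversified_expansions("jaguar", snippets))
```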

Relevance:

30.00%

Publisher:

Abstract:

Surface flow types (SFT) are advocated as ecologically relevant hydraulic units, often mapped visually from the bankside to rapidly characterise the physical habitat of rivers. SFT mapping is simple, non-invasive and cost-efficient. However, it is also qualitative, subjective and plagued by difficulties in accurately recording the spatial extent of SFT units. Quantitative validation of the underlying physical habitat parameters is often lacking, and such parameters do not consistently differentiate between SFTs. Here, we explicitly investigate the accuracy, reliability and statistical separability of traditionally mapped SFTs as indicators of physical habitat, using independent hydraulic and topographic data collected during three surveys of a c. 50 m reach of the River Arrow, Warwickshire, England. We also explore the potential of a novel remote sensing approach, comprising a small unmanned aerial system (sUAS) and Structure-from-Motion photogrammetry (SfM), as an alternative method of physical habitat characterisation. Our key findings indicate that SFT mapping accuracy is highly variable, with overall mapping accuracy not exceeding 74%. Analysis of similarity (ANOSIM) tests found that strong differences did not exist between all SFT pairs. This leads us to question the suitability of SFTs for characterising physical habitat for river science and management applications. In contrast, the sUAS-SfM approach provided high resolution, spatially continuous, spatially explicit, quantitative measurements of water depth and point cloud roughness at the microscale (spatial scales ≤1 m). Such data are acquired rapidly and inexpensively, and provide new opportunities for examining the heterogeneity of physical habitat over a range of spatial and temporal scales. Whilst continued refinement of the sUAS-SfM approach is required, we propose that this method offers an opportunity to move away from broad, mesoscale classifications of physical habitat (spatial scales 10-100 m), and towards continuous, quantitative measurements of the continuum of hydraulic and geomorphic conditions that actually exists at the microscale.

Relevance:

30.00%

Publisher:

Abstract:

My thesis examines health policies designed to encourage the supply of health care services. Access to health care is a major problem undermining the health systems of most industrialized countries. In Quebec, the median waiting time between a referral by a general practitioner and an appointment with a specialist was 7.3 weeks in 2012, compared with 2.9 weeks in 1993, despite an increase in the number of physicians over the same period. For policy makers facing rising waiting times for care, it is important to understand the structure of physicians' labour supply and how it affects the supply of health care services. In this context, I consider two main policies. First, I estimate how physicians respond to monetary incentives and use the estimated parameters to examine how compensation policies can be used to shape the short-run supply of health care services. Second, I examine how physicians' productivity is affected by their experience, through learning-by-doing, and use the estimated parameters to determine how many inexperienced physicians must be recruited to replace a retiring experienced physician in order to keep the supply of health care services constant. The thesis develops and applies economic and statistical methods to measure physicians' responses to monetary incentives and to estimate their productivity profile (how productivity varies over the course of a career), using panel data on Quebec physicians drawn from surveys and administrative records. The data contain information on each physician's labour supply, the different types of services provided, and their prices, and cover a period during which the Quebec government changed the relative prices of health care services. I develop and estimate a structural labour-supply model in which physicians are multitasking: they choose their total hours of work and the allocation of those hours across the different services, while service prices are set by the government. The model yields an earnings equation that depends on hours worked and on a price index representing the marginal return to hours when they are allocated optimally across services. The price index depends on service prices and on the parameters of the service production technology, which determine how physicians respond to changes in relative prices. I apply the model to panel data on Quebec physicians' earnings merged with data on their time use, and use it to examine two dimensions of the supply of health care services. First, I analyse the use of monetary incentives to induce physicians to change their production of the various services. Although earlier studies have often compared physician behaviour across compensation systems, relatively little is known about how physicians respond to changes in the prices of health care services.
Current debates in Canadian health policy circles have focused on the importance of income effects in determining physicians' responses to increases in the prices of health services. My work contributes to this debate by identifying and estimating the substitution and income effects resulting from changes in the relative prices of health services. Second, I analyse how experience affects physicians' productivity. This has important implications for physician recruitment to meet the growing demand from an ageing population, particularly as the most experienced (and most productive) physicians retire. In the first essay, I estimate the earnings function conditional on hours worked, using the instrumental variables method to control for possible endogeneity of hours. As instruments I use indicator variables for physicians' ages, the marginal tax rate, the stock market return, and the square and cube of that return. I show that this yields a lower bound on the own-price elasticity, which makes it possible to test whether physicians respond to monetary incentives. The results show that the lower bounds on the price elasticities of service supply are significantly positive, suggesting that physicians do respond to incentives: a change in relative prices leads them to allocate more hours to the service whose price has increased. In the second essay, I estimate the full model, unconditional on hours worked, analysing variation in physicians' hours, in the volume of services provided and in physicians' incomes, using the simulated method of moments estimator. The results show that the own-price substitution elasticities are large and significantly positive, reflecting physicians' tendency to increase the volume of the service whose price has risen the most, while the cross-price substitution elasticities are also large but negative. There is, moreover, an income effect associated with fee increases. I use the estimated parameters of the structural model to simulate a general 32% increase in service prices. The results indicate that physicians would reduce their total hours worked (mean elasticity of -0.02), their clinical hours (mean elasticity of -0.07) and the volume of services provided (mean elasticity of -0.05). Third, I exploit the natural link between the income of a fee-for-service physician and his or her productivity to establish physicians' productivity profiles. To do so, I modify the model specification to account for the relationship between a physician's productivity and experience, and estimate the earnings equation on unbalanced panel data, correcting for the non-random pattern of missing observations with a selection model. The results suggest that the productivity profile is an increasing and concave function of experience. This profile is robust to using effective experience (the quantity of services produced) as a control variable and to relaxing the parametric assumptions.
Moreover, one additional year of experience increases a physician's production of services by CAD 1,003. I use the estimated parameters of the model to compute the replacement ratio, i.e. the number of inexperienced physicians needed to replace one experienced physician; this ratio is 1.2.
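A minimal numerical sketch of the instrumental variables idea used in the first essay, implemented as textbook two-stage least squares on simulated data; the data-generating process, variable names and instrument below are invented for illustration and are not the thesis's data or specification.

```python
# Toy two-stage least squares (2SLS): hours worked are endogenous in an
# earnings equation, and an excluded instrument (here a simulated marginal
# tax rate) shifts hours without entering earnings directly. All numbers
# are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
ability = rng.normal(size=n)                 # unobserved, causes endogeneity
tax_rate = rng.uniform(0.2, 0.5, size=n)     # instrument: shifts hours only
hours = 40 - 30 * tax_rate + 2 * ability + rng.normal(size=n)
earnings = 5 * hours + 8 * ability + rng.normal(size=n)   # true effect = 5

X = np.column_stack([np.ones(n), hours])     # structural regressors
Z = np.column_stack([np.ones(n), tax_rate])  # instruments

# Stage 1: project the endogenous regressor onto the instruments.
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
# Stage 2: regress earnings on the fitted regressors.
beta_2sls = np.linalg.lstsq(X_hat, earnings, rcond=None)[0]
beta_ols = np.linalg.lstsq(X, earnings, rcond=None)[0]
print("OLS estimate of hours effect:", round(beta_ols[1], 2))   # biased upward
print("2SLS estimate of hours effect:", round(beta_2sls[1], 2)) # close to 5
```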

Relevance:

30.00%

Publisher:

Abstract:

Background: For decades film has proved to be a powerful form of communication. Whether produced as entertainment, art or documentary, films have the capacity to inform and move us. Film is a highly attractive teaching instrument and an appropriate teaching method in health education; it is a valuable tool for studying situations most transcendental to human beings, such as pain, disease and death. Objectives: The objectives were to determine how film helps students engage with their role as health care professionals; how they view the personal experience of illness, disease, disability or death; and how this may impact upon their provision of patient care. Design, Setting and Participants: The project was underpinned by a film selection determined through careful review, intensive scrutiny, contemplation and discussion by the research team. Seven films were selected, ranging across animation, foreign-language, documentary, biopic and Hollywood drama. Each film was shown discretely, in an acoustic lecture theatre projected onto a large screen, to pre-registration student nurses (adult, child and mental health) across each year of study from different cohorts (n = 49). Method: A mixed qualitative methods approach consisted of audio-recorded 5-minute reactions post film screening, coded questionnaires and a focus group. Findings were drawn from the impact of the films through thematic analysis of the data sets and subjective text condensation, categorised as: new insights looking through patient eyes; evoking emotion in student nurses; spiritual care; going to the movies to learn about the patient experience; self-discovery through films; and using films to link theory to practice. Results: Deeper learning through film as a powerful medium was identified in meeting the objectives of the study. Integration of film into pre-registration curricula, pedagogy, teaching and learning is recommended. Conclusion: The teaching potential of film stems from the visual process linked to human emotion and experience. Its impact has the power not only to help students learn the values that underpin nursing, but also to foster respect for the patient experience of disease, disability, death and its reality.

Relevance:

30.00%

Publisher:

Abstract:

Two complementary de facto standards for the publication of electronic documents are HTML on the World Wide Web and Adobe's PDF (Portable Document Format) language for use with Acrobat viewers. Both formats provide support for hypertext features to be embedded within documents. We present a method that allows links and other hypertext material to be kept in an abstract form in separate link databases. The links can then be interpreted or compiled at any stage and applied, in the correct format, to some specific representation such as HTML or PDF. This approach is of great value in keeping hyperlinks relevant, up to date and in a form that is independent of the finally delivered electronic document format. Four models are discussed for allowing publishers to insert links into documents at a late stage. The techniques discussed have been implemented using a combination of Acrobat plug-ins, Web servers and Web browsers.
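A small, hypothetical illustration of keeping links in an abstract link database and compiling them into a concrete format at a late stage; the linkbase layout, URLs and function name are invented for this sketch and are not the system described in the paper.

```python
# Hypothetical sketch: hyperlinks stored abstractly (source phrase -> target),
# then "compiled" into a delivery format such as HTML late in the pipeline.
import html

LINKBASE = [
    {"phrase": "Portable Document Format", "target": "https://example.org/pdf-spec"},
    {"phrase": "link databases", "target": "https://example.org/linkbases"},
]

def compile_links_to_html(text, linkbase):
    """Apply abstract links to plain text, producing HTML anchors."""
    out = html.escape(text)
    for link in linkbase:
        phrase = html.escape(link["phrase"])
        anchor = f'<a href="{html.escape(link["target"], quote=True)}">{phrase}</a>'
        out = out.replace(phrase, anchor)
    return out

print(compile_links_to_html(
    "Links are kept in separate link databases and applied late.", LINKBASE))
```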

Relevance:

30.00%

Publisher:

Abstract:

This research is based on a numerical model for forecasting the three-dimensional behaviour of sea-water motion under a variable wind velocity; the results are then analysed and compared with observations. The model solves the equations governing the currents and the temperature distribution by the finite-difference method, with Δx and Δy held constant and Δz variable. It is built on the momentum, continuity and thermodynamic energy equations, together with the stresses at the surface and middle layers and the bottom stress. The horizontal and vertical eddy viscosity and thermal diffusivity coefficients follow those used by Bennett for Lake Ontario (1977). Given the dimensions of the Caspian Sea, the Coriolis parameter is used with the β-effect and the Boussinesq approximation is applied. To verify the program, some simple experiments with boundary conditions similar to those of the Caspian Sea were carried out. For the Caspian Sea itself, the horizontal grid size is 10 × 10 km and the water column is divided into 10 layers of varying thickness from surface to bed (5, 10, 20, 3, 50, 100, 150, 200, 25, 500 and greater). Wind velocity and direction and water-temperature data for 15 September 1995 at 06:00, 12:00 and 18:00 were obtained from synoptic stations on the Caspian Sea shore and from the research vessel Haji Alief. The measured onshore winds were converted to offshore winds using the SPM (Shore Protection Manual) method, and the distribution of wind velocity over the Caspian surface was obtained by inverse-distance-squared interpolation. The model was evaluated against reports and observations. The position of the currents in the different layers and the velocities in cross-sections through the northern, southern and middle areas are discussed. The results reveal circulation cells in the three areas mentioned above, and the circulation weakens with depth. The results of the numerical solution of the temperature equation were compared with observations; the temperature variation across layers in cross-section shows reasonable agreement between model and data.
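As a very reduced illustration of the explicit finite-difference time stepping used in wind-driven models of this kind, the toy single-point slab layer below includes only Coriolis, wind stress and linear friction; it is my own simplification for illustration and not the Caspian Sea model described above.

```python
# Toy explicit finite-difference integration of a wind-driven slab layer:
#   du/dt =  f*v + tau_x/(rho*H) - r*u
#   dv/dt = -f*u + tau_y/(rho*H) - r*v
# Drastically simplified: one point, constant wind, linear friction.
f = 9.3e-5              # Coriolis parameter at ~40 N (1/s)
rho, H = 1025.0, 50.0   # water density (kg/m^3), layer thickness (m)
tau_x, tau_y = 0.1, 0.0 # wind stress (N/m^2), assumed constant here
r = 1e-5                # linear friction coefficient (1/s)
dt = 600.0              # time step (s)

u = v = 0.0
for step in range(6 * 24 * 2):          # two days of 10-minute steps
    du = (f * v + tau_x / (rho * H) - r * u) * dt
    dv = (-f * u + tau_y / (rho * H) - r * v) * dt
    u, v = u + du, v + dv
print(f"currents after 2 days of spin-up: u={u:.3f} m/s, v={v:.3f} m/s")
```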

Relevance:

30.00%

Publisher:

Abstract:

We present ideas about creating a next generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge with computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats, in conjunction with ever larger IT systems, urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): the Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System for our computers? Presumably, such systems would then have the same beneficial properties as the HIS, such as error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals, and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals, and the theory is a topic of hot debate. It is the aim of this research to investigate this correlation and to translate the DT into the realms of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area. Rather, we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
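To make the "correlation of danger signals" idea concrete, here is a deliberately naive sketch; the signal names, weights and threshold are invented and carry no empirical meaning, and this is not the project's algorithm.

```python
# Naive illustration of fusing "danger signals" into an alert decision, in the
# spirit of danger-theory-inspired detection. All values are invented.
def danger_score(signals, weights):
    """Weighted combination of normalized signals; higher means more suspicious."""
    return sum(weights[name] * value for name, value in signals.items())

WEIGHTS = {"cpu_spike": 0.3, "unexpected_outbound": 0.4,
           "file_entropy_jump": 0.4, "process_is_whitelisted": -0.5}

observed = {"cpu_spike": 0.8, "unexpected_outbound": 1.0,
            "file_entropy_jump": 0.6, "process_is_whitelisted": 0.0}

score = danger_score(observed, WEIGHTS)
print(f"danger score = {score:.2f}",
      "-> raise alert" if score > 0.5 else "-> ignore")
```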

Relevance:

30.00%

Publisher:

Abstract:

Stakeholder engagement is important for successful management of natural resources, both to make effective decisions and to obtain support. However, in the context of coastal management, questions remain unanswered on how to effectively link decisions made at the catchment level with objectives for marine biodiversity and fisheries productivity. Moreover, there is much uncertainty about how best to elicit community input in a rigorous manner that supports management decisions. A decision support process is described that uses the adaptive management loop as its basis to elicit management objectives, priorities and management options, using two case studies in the Great Barrier Reef, Australia. The approach described is then generalised for international interest. A hierarchical engagement model of local stakeholders, regional and senior managers is used. The result is a semi-quantitative generic elicitation framework that ultimately provides a prioritised list of management options in the context of clearly articulated management objectives, and that has widespread application for coastal communities worldwide. The case studies show that demand for local input and regional management is high, but local influences affect the relative success of both engagement processes and uptake by managers. Differences between case study outcomes highlight the importance of discussing objectives prior to suggesting management actions, and of avoiding or minimising conflicts at the early stages of the process. Strong contributors to success are a) the provision of local information to the community group, and b) the early inclusion of senior managers and influencers in the group to ensure the intellectual and time investment is not compromised at the final stages of the process. The project has uncovered a conundrum in the significant gap between the way managers perceive their management actions and outcomes, and the community's perception of the effectiveness (and wisdom) of those same management actions.

Relevance:

30.00%

Publisher:

Abstract:

Aflatoxins are fungal toxins produced by toxigenic Aspergillus species (A. flavus and A. parasiticus); these secondary metabolites are considered a threat to the health of food consumers. In this research, 96 samples of feed for cultured cold-water fish (rainbow trout) were taken during spring and summer 2007 (on the fifteenth of each month) by simple and stratified random sampling, in order to determine: 1. the prevalence of aflatoxigenic Aspergillus species in feed stored at cold-water fish farms in West Azarbayjan in both seasons; 2. total aflatoxin residues in that stored feed, determined by ELISA; and 3. aflatoxin residues in feed produced at aquatic-feed factories in Tehran and West Azarbayjan provinces, determined by the same method. To study the prevalence of toxigenic Aspergillus species, the pour-plate culture method was used with general media such as Malt Extract Agar (M.E.A.) and Sabouraud Dextrose Agar (S.D.A.), following standard No. 997 of the Iranian Standard Institute, and the resulting colonies were examined microscopically. Aflatoxin residues were determined by ELISA using the AgraQuant kit (Romer Labs). The results indicated that only 8.3% of the samples were contaminated with A. flavus; A. parasiticus was not observed. There were also no significant differences in prevalence between seasons or months (at the 0.05 level). The mean aflatoxin levels were below the tolerance levels designated by the joint FAO/WHO expert committee (the mean total aflatoxin (AFT) across all samples was below 11 ppb). Furthermore, mean total AFT residues in stored feed from the various culture centres of West Azarbayjan and from the Tehran factories were comparable between spring and summer, with no significant differences (at the 0.05 level). However, total aflatoxin levels in feed from the West Azarbayjan factory differed significantly between spring and summer (P<0.05), with higher AFT residues in spring (8.6 ppb) than in summer (6.1 ppb). AFT levels in feed from the Tehran factories (9.2 ppb) were higher than in West Azarbayjan (7.4 ppb); in other words, location was a decisive factor in the total AFT levels of the samples. Moreover, total aflatoxin levels differed significantly between types of culture centre (P<0.05): mean AFT levels at embankment-dam fish farms (6.75 ppb) and multi-purpose fish farms (6.25 ppb) were higher than at individual culture ponds (4.67 ppb). In conclusion, the results of this survey indicated that the low prevalence of Aspergillus does not by itself determine the total aflatoxin levels in trout feed. Given the low aflatoxin levels (below 20 ppb), the feed produced for cultured cold-water fish (rainbow trout) in Tehran and West Azarbayjan provinces in spring and summer 2007 was safe and healthy both for the fish and for their consumers.

Relevance:

30.00%

Publisher:

Abstract:

Background: Body composition is affected by diseases, and affects responses to medical treatments, dosage of medicines, etc., while an abnormal body composition contributes to the causation of many chronic diseases. While we have reliable biochemical tests for certain nutritional parameters of body composition, such as iron or iodine status, and we have harnessed nuclear physics to estimate the body’s content of trace elements, the very basic quantification of body fat content and muscle mass remains highly problematic. Both body fat and muscle mass are vitally important, as they have opposing influences on chronic disease, but they have seldom been estimated as part of population health surveillance. Instead, most national surveys have merely reported BMI and waist, or sometimes the waist/hip ratio; these indices are convenient but do not have any specific biological meaning. Anthropometry offers a practical and inexpensive method for muscle and fat estimation in clinical and epidemiological settings; however, its use is imperfect due to many limitations, such as a shortage of reference data, misuse of terminology, unclear assumptions, and the absence of properly validated anthropometric equations. To date, anthropometric methods are not sensitive enough to detect muscle and fat loss. Aims: The aim of this thesis is to estimate adipose/fat and muscle mass in health, in disease and during weight loss by: 1. evaluating and critiquing the literature to identify the best published prediction equations for adipose/fat and muscle mass estimation; 2. deriving and validating adipose tissue and muscle mass prediction equations; and 3. evaluating the derived prediction equations, alongside anthropometric indices and the best equations retrieved from the literature, in health, in metabolic illness and during weight loss. Methods: A systematic review following the Cochrane method was used to review muscle mass estimation papers that used MRI as the reference method; fat mass estimation papers were critically reviewed. Data from participants of mixed ethnicity, age and body mass who underwent whole-body magnetic resonance imaging to quantify adipose tissue and muscle mass (dependent variables), together with anthropometry (independent variables), were used in the derivation/validation analysis. Multiple regression and Bland-Altman plots were applied to evaluate the prediction equations. To determine how well the equations identify metabolic illness, English and Scottish health surveys were studied. Multiple regression and binary logistic regression were applied to assess model fit and associations; populations were also divided into quintiles and relative risks were analysed. Finally, the prediction equations were evaluated by applying them to a pilot study of 10 subjects who underwent whole-body MRI, anthropometric measurements and muscle strength testing before and after weight loss, to determine how well the equations detect change in adipose/fat mass and muscle mass. Results: The estimation of fat mass has serious problems. Despite advances in technology and science, prediction equations for the estimation of fat mass depend on limited historical reference data and remain dependent upon assumptions that have not yet been properly validated for different population groups. Muscle mass does not have the same conceptual problems; however, its measurement is still problematic and reference data are scarce.
The derivation and validation analysis in this thesis was satisfactory; the resulting equations performed as well as or better than prediction equations from the literature. Applying the prediction equations in metabolic illness and during weight loss showed how well they identify metabolic illness, with significant associations with diabetes, hypertension, HbA1c and blood pressure, and moderate to high correlations with MRI-measured adipose tissue and muscle mass before and after weight loss. Conclusion: Adipose tissue mass, and to an extent muscle mass, can now be estimated for many purposes as population or group means; however, these equations must not be used to assess fatness or categorise individuals. Further exploration in different populations and health surveys would be valuable.
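As a minimal, hypothetical sketch of the derivation/validation workflow described above (synthetic data and invented coefficients; not the thesis's actual equations), one can derive a prediction equation by multiple regression against an MRI reference and summarise agreement Bland-Altman style:

```python
# Hypothetical sketch: derive an anthropometric prediction equation by multiple
# linear regression against an MRI-measured reference, then summarise agreement
# with mean bias and 95% limits of agreement. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 200
waist = rng.normal(90, 12, n)        # cm
weight = rng.normal(80, 15, n)       # kg
height = rng.normal(170, 9, n)       # cm
mri_fat = 0.5 * waist + 0.3 * weight - 0.2 * height + rng.normal(0, 3, n)  # toy truth

# Derivation: ordinary least squares on a training split.
X = np.column_stack([np.ones(n), waist, weight, height])
train, test = slice(0, 150), slice(150, None)
beta, *_ = np.linalg.lstsq(X[train], mri_fat[train], rcond=None)

# Validation: predict on the held-out split and summarise agreement.
pred = X[test] @ beta
diff = pred - mri_fat[test]
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"coefficients: {np.round(beta, 2)}")
print(f"bias = {bias:.2f} kg, 95% limits of agreement = "
      f"[{bias - 1.96*sd:.2f}, {bias + 1.96*sd:.2f}] kg")
```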