913 results for nature-based


Relevance: 30.00%

Abstract:

Despite football being deeply entrenched in Scottish culture, it is under-researched from a business perspective. This research develops a conceptual framework that views professional football clubs from a number of different perspectives. It draws on strategic management literature, since this views the firm as the intersection between internal competence, customer perception and competition within an industry. A review of previous sports business research highlighted five main themes that were used to create a structure for the analysis: on-field performance, attendance, finance, the playing squad and the manager. These themes were used as frames to view the firms within the industry from a number of different perspectives. Each frame allows a different aspect of the firm to be considered singly in turn and then collectively, to develop a deeper understanding of the existing frames in use within the industry. The research is based on a pragmatic philosophy that allows mixed methods to be combined to provide both an objective and a subjective view of the industry. The subjective view was drawn from five interviews with senior figures within Scottish professional football. These participants were drawn from a number of different roles and organisations within the industry to provide a balance of experiences. The views were triangulated with a descriptive analysis of secondary data from a number of industry sources to establish patterns within and between these frames. A peer group of six clubs was selected because they competed in the Scottish Premier League in each of the seasons within an eleven-year period (2000-2011). The peer group clubs selected were: Aberdeen, Dundee United, Heart of Midlothian (Hearts), Hibernian, Kilmarnock and Motherwell. By focussing on a small group of clubs with a similar on-field record, a broad study across the five frames could be carried out in detail without the findings being influenced by the impact of relegation to a lower division or sustained participation in European football. Within each of the original five frames a number of sub-components were identified and linked to the framework; this expanded the content to reflect the findings of this project. There appeared to be little link between on-field performance and attendance, although progress to the later stages of cup competitions allowed clubs to connect with fans who do not regularly attend. The relationship between a club’s income and wage bill should be expanded to include interest repayments, since this expenditure can be used to highlight future financial problems caused by increased debt levels. Although all of the interview participants spoke with pride of the players who had progressed from the club’s youth academy to success at the highest level, the peer group clubs only produced one player each season who played more than ten matches for the club. Almost half of the players signed from the youth academy left the club without playing for the first team. The importance of the relationship between the manager and club chairman was highlighted, although the speed with which managers were appointed suggests that little consideration was given to this before offering a contract. Once appointed, there appeared to be little clarity over the job description and areas of responsibility. Several of the interviewees brought experience from other businesses to football but admitted that short-term decision making and entrenched behaviour made change difficult.
The conclusion of the research is that by taking a firm-wide view of the club, longer-term decisions can be taken within football. Player development and supporter relationships were both identified as long-term processes that are impacted by the current short-termism. With greater role clarity for managers and a mixture of short- and long-term objectives, those involved in the industry are more likely to have opportunities to learn from experience, and performance across the different frames will improve as a result.

Relevance: 30.00%

Abstract:

Recent developments in automation, robotics and artificial intelligence have pushed these technologies into wider use, and driverless transport systems are already state of the art on certain legs of transportation. This has prompted the maritime industry to join the advancement. The case organisation, the AAWA initiative, is a joint industry-academia research consortium with the objective of developing readiness for the first commercial autonomous solutions, exploiting state-of-the-art autonomous and remote technology. The initiative develops both autonomous and remote operation technology for navigation, machinery, and all on-board operating systems. The aim of this study is to develop a model with which to estimate and forecast the operational costs, and thus enable comparisons between manned and autonomous cargo vessels. The building process of the model is also described and discussed. Furthermore, the model aims to track and identify the critical success factors of the chosen ship design, and to enable monitoring and tracking of the incurred operational costs as the life cycle of the vessel progresses. The study adopts the constructive research approach, as the aim is to develop a construct to meet the needs of a case organisation. Data have been collected through discussions and meetings with consortium members and researchers, as well as through written and internal communications material. The model itself is built using activity-based life cycle costing, which enables both realistic cost estimation and forecasting and the identification of critical success factors, thanks to the process orientation adopted from activity-based costing and the statistical nature of Monte Carlo simulation techniques. As the model was able to meet the multiple aims set for it, and the case organisation was satisfied with it, it could be argued that activity-based life cycle costing is the method with which to conduct cost estimation and forecasting in the case of autonomous cargo vessels. The model was able to perform the cost analysis and forecasting, as well as to trace the critical success factors. Later on, it also enabled, albeit hypothetically, monitoring and tracking of the incurred costs. By collecting costs in this way, it was argued that the activity-based LCC model is able to facilitate learning from, and continuous improvement of, the autonomous vessel. As for the building process of the model, an individual approach was chosen, while still using the implementation and model-building steps presented in existing literature. This was due to two factors: the nature of the model and, perhaps even more importantly, the nature of the case organisation. Furthermore, the loosely organised network structure means that knowing the case organisation and its aims is of great importance when conducting constructive research.
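
A minimal sketch (in Python) of how activity-based life cycle costing can be combined with Monte Carlo simulation to produce cost estimates with uncertainty bounds; the activity names, cost ranges, vessel life and discount rate are illustrative assumptions, not figures from the AAWA model.

```python
# Minimal sketch of activity-based life cycle costing with Monte Carlo
# simulation. Activity names, cost figures and distributions are
# illustrative placeholders, not values from the AAWA model.
import numpy as np

rng = np.random.default_rng(seed=1)

# Each activity: (low, mode, high) annual cost in EUR, sampled from a
# triangular distribution to reflect estimation uncertainty.
activities = {
    "remote_operation_centre": (200_000, 250_000, 320_000),
    "maintenance":             (150_000, 220_000, 400_000),
    "port_and_fairway_fees":   (80_000,  100_000, 130_000),
    "insurance":               (60_000,  90_000,  150_000),
}

YEARS = 25           # assumed vessel life cycle
DISCOUNT_RATE = 0.05
N_RUNS = 10_000      # Monte Carlo iterations

def simulate_life_cycle_cost():
    """One Monte Carlo draw of the discounted life cycle operating cost."""
    total = 0.0
    for year in range(1, YEARS + 1):
        annual = sum(rng.triangular(lo, mode, hi)
                     for lo, mode, hi in activities.values())
        total += annual / (1 + DISCOUNT_RATE) ** year
    return total

results = np.array([simulate_life_cycle_cost() for _ in range(N_RUNS)])
p5, p95 = np.percentile(results, [5, 95])
print(f"mean LCC: {results.mean():,.0f} EUR")
print(f"5th-95th percentile: {p5:,.0f} - {p95:,.0f} EUR")
```

The spread of the simulated totals is what supports the forecasting and critical-success-factor tracking described above: activities whose uncertainty dominates the spread are the ones worth monitoring as the life cycle progresses.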

Relevance: 30.00%

Abstract:

In this work, three different metallic metamaterial (MM) structures that support plasmonic resonances have been developed: asymmetric split ring resonators (A-SRRs), dipole and split H-shaped (ASH) structures. The aim of the work is the optimisation of photonic sensors based on plasmonic resonances and surface enhanced infrared absorption (SEIRA) from the MM structures. The MM structures were designed to tune their plasmonic resonance peaks in the mid-infrared region. The plasmonic resonance peaks produced are highly dependent on the structural dimensions and the polarisation of the electromagnetic (EM) source. The ASH structure in particular has the ability to produce the plasmonic resonance peak with dual polarisation of the EM source. The double resonance peaks produced by the asymmetric nature of the structures were optimised by varying the fundamental parameters of the design. These peaks occur due to hybridisation of the individual elements of the MM structure. The presence of a dip, known as a trapped mode, in between the double plasmonic peaks helps to narrow the resonances. A periodicity greater than twice the length and diameter of the metallic structure was applied to produce narrow resonances for the designed MMs. A nanoscale gap in each structure, which broadens the trapped mode to narrow the plasmonic resonances, was also used. A gold thickness of 100 nm was used to experimentally produce a high quality factor of 18 in the mid-infrared region. The optimised plasmonic resonance peaks were used for detection of an analyte, 17β-estradiol. 17β-estradiol is largely responsible for the development of human sex organs and can be found naturally in the environment through human excreta. SEIRA was the method applied to the analysis of the analyte. The work is important for the monitoring of human biology and for water treatment. Applying this method to the developed nano-engineered structures, an enhancement factor of 10^5 and a sensitivity of 2791 nm/RIU were obtained. With this high sensitivity, a figure of merit (FOM) of 9 was also achieved for the sensors. The experiments were verified using numerical simulations in which the vibrational resonances of the C-H stretch from 17β-estradiol were modelled. Lastly, A-SRRs and ASHs on waveguides were also designed and evaluated. These patterns are to be used as a basis for future work.
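
As a rough consistency check on the reported figures, the standard sensor definitions Q = λres/FWHM and FOM = S/FWHM can be combined; the sketch below assumes those definitions, and the resonance wavelength it prints is inferred from the numbers rather than stated in the abstract.

```python
# Consistency check between the reported quality factor, sensitivity and
# figure of merit, using the standard sensor definitions Q = lambda_res/FWHM
# and FOM = S/FWHM. The resonance wavelength is inferred, not quoted.
Q = 18        # reported quality factor
S = 2791.0    # reported sensitivity in nm/RIU
FOM = 9       # reported figure of merit

fwhm_from_fom = S / FOM          # resonance linewidth implied by the FOM (~310 nm)
lambda_res = Q * fwhm_from_fom   # resonance wavelength implied by Q (~5.6 um, mid-IR)

print(f"implied FWHM: {fwhm_from_fom:.0f} nm")
print(f"implied resonance wavelength: {lambda_res / 1000:.1f} um")
```

The implied resonance near 5.6 μm sits in the mid-infrared region targeted by the designs, so the three reported figures are mutually consistent under these standard definitions.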

Relevance: 30.00%

Abstract:

Normal grain growth of calcite was investigated by combining grain size analysis of calcite across the contact aureole of the Adamello pluton with grain growth modeling based on a thermal model of the surroundings of the pluton. In an unbiased model system, i.e., one with only location-dependent variations in the temperature-time path, 2/3 and 1/3 of grain growth occur during prograde and retrograde metamorphism, respectively, at all locations. In contrast to this idealized situation, in the field example three groups can be distinguished, which are characterized by variations in their grain size versus temperature relationships: Group I occurs at low temperatures and the grain size remains constant, because nano-scale second-phase particles of organic origin inhibit grain growth in the calcite aggregates under these conditions. In the presence of an aqueous fluid, these second phases decay at a temperature of about 350 °C, enabling the onset of grain growth in calcite. In the following growth period, fluid-enhanced group II and slower group III growth occur. For group II a continuous and intense grain size increase with T is typical, while for group III grain growth decreases with T. None of the observed trends correlate with experimentally based grain growth kinetics, probably due to differences between nature and experiment which have not yet been investigated (e.g., porosity, second phases). Therefore, grain growth modeling was used to iteratively improve the correlation between measured and modeled grain sizes by optimizing the activation energy (Q), pre-exponential factor (k0) and grain size exponent (n). For n = 2, Q of 350 kJ/mol and k0 of 1.7 × 10^21 μm^n s^-1 were obtained for group II, and Q of 35 kJ/mol and k0 of 2.5 × 10^-5 μm^n s^-1 for group III. With respect to future work, field-data-based grain growth modeling might be a promising tool for investigating the influences of secondary effects like porosity and second phases on grain growth in nature, and for unravelling differences between nature and experiment.
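
A minimal sketch of this kind of grain growth modelling, integrating the rate law d^n - d0^n = k0·exp(-Q/RT)·t incrementally along a temperature-time path with the parameters quoted above; the temperature pulse, its duration and the starting grain size are illustrative assumptions, not the Adamello thermal model.

```python
# Minimal sketch of grain growth modelling along a temperature-time path,
# integrating d^n = d0^n + sum(k0 * exp(-Q/(R*T)) * dt). The kinetic
# parameters are those quoted in the abstract; the temperature pulse and
# starting grain size below are illustrative, not the aureole thermal model.
import numpy as np

R = 8.314  # J/(mol K)

def grow(d0_um, n, Q_J_mol, k0, T_path_K, dt_s):
    """Return the final grain size after integrating the growth law."""
    dn = d0_um ** n
    for T in T_path_K:
        dn += k0 * np.exp(-Q_J_mol / (R * T)) * dt_s
    return dn ** (1.0 / n)

# Illustrative 100 kyr pulse: heat from 300 C to a peak of 380 C and back.
years = np.linspace(0, 1e5, 2001)
T_path = 273.15 + 300 + 80 * np.sin(np.pi * years / years[-1])
dt = (years[1] - years[0]) * 365.25 * 24 * 3600  # time step in seconds

d_II  = grow(5.0, n=2, Q_J_mol=350e3, k0=1.7e21, T_path_K=T_path, dt_s=dt)
d_III = grow(5.0, n=2, Q_J_mol=35e3,  k0=2.5e-5, T_path_K=T_path, dt_s=dt)
print(f"group II  final grain size: {d_II:.0f} um")
print(f"group III final grain size: {d_III:.0f} um")
```

With this synthetic pulse the high-activation-energy group II kinetics produce growth concentrated near peak temperature, while the weakly temperature-dependent group III kinetics grow more steadily and end up finer grained, mirroring the contrast described above.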

Relevance: 30.00%

Abstract:

The immune system is a complex biological system with a highly distributed, adaptive and self-organising nature. This paper presents an Artificial Immune System (AIS) that exploits some of these characteristics and is applied to the task of film recommendation by Collaborative Filtering (CF). Natural evolution and in particular the immune system have not been designed for classical optimisation. However, for this problem, we are not interested in finding a single optimum. Rather we intend to identify a sub-set of good matches on which recommendations can be based. It is our hypothesis that an AIS built on two central aspects of the biological immune system will be an ideal candidate to achieve this: Antigen-antibody interaction for matching and idiotypic antibody-antibody interaction for diversity. Computational results are presented in support of this conjecture and compared to those found by other CF techniques.
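
A simplified sketch of the idea: the target user's ratings act as the antigen, other users' ratings as antibodies, antigen-antibody affinity drives selection, and an idiotypic antibody-antibody term suppresses near-duplicate neighbours so the selected sub-set stays diverse. This illustrates the concept only; it is not the paper's exact algorithm.

```python
# Simplified immune-inspired collaborative filtering: antigen = target user,
# antibodies = other users, affinity = Pearson correlation, with idiotypic
# suppression to keep the neighbour pool diverse. Illustrative only.
import numpy as np

def affinity(a, b):
    """Pearson correlation over commonly rated films (NaN = unrated)."""
    mask = ~np.isnan(a) & ~np.isnan(b)
    if mask.sum() < 2:
        return 0.0
    r = np.corrcoef(a[mask], b[mask])[0, 1]
    return 0.0 if np.isnan(r) else r

def select_neighbours(target, users, pool_size=3, idiotypic_weight=0.5):
    """Greedily build a diverse pool of high-affinity neighbour users."""
    pool, candidates = [], list(range(len(users)))
    while candidates and len(pool) < pool_size:
        def score(i):
            stim = affinity(target, users[i])                               # antigen-antibody
            supp = max((affinity(users[i], users[j]) for j in pool), default=0.0)
            return stim - idiotypic_weight * supp                           # antibody-antibody
        best = max(candidates, key=score)
        pool.append(best)
        candidates.remove(best)
    return pool

# Toy ratings matrix (rows = users, columns = films, NaN = not rated).
nan = np.nan
ratings = np.array([
    [5, 4, nan, 1, 2],
    [5, 5, 4,   1, nan],
    [5, 5, 4,   2, 1],
    [1, 2, 1,   5, 5],
])
target = np.array([5, 4, nan, nan, 1])
print("neighbour pool:", select_neighbours(target, ratings))
```

Recommendations would then be drawn from films rated highly by the surviving pool; the suppression term is what keeps the pool a diverse sub-set of good matches rather than a single optimum.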

Relevance: 30.00%

Abstract:

The immune system is a complex biological system with a highly distributed, adaptive and self-organising nature. This paper presents an artificial immune system (AIS) that exploits some of these characteristics and is applied to the task of film recommendation by collaborative filtering (CF). Natural evolution and in particular the immune system have not been designed for classical optimisation. However, for this problem, we are not interested in finding a single optimum. Rather we intend to identify a sub-set of good matches on which recommendations can be based. It is our hypothesis that an AIS built on two central aspects of the biological immune system will be an ideal candidate to achieve this: antigen-antibody interaction for matching and antibody-antibody interaction for diversity. Computational results are presented in support of this conjecture and compared to those found by other CF techniques.

Relevance: 30.00%

Abstract:

The immune system is a complex biological system with a highly distributed, adaptive and self-organising nature. This paper presents an artificial immune system (AIS) that exploits some of these characteristics and is applied to the task of film recommendation by collaborative filtering (CF). Natural evolution and in particular the immune system have not been designed for classical optimisation. However, for this problem, we are not interested in finding a single optimum. Rather we intend to identify a sub-set of good matches on which recommendations can be based. It is our hypothesis that an AIS built on two central aspects of the biological immune system will be an ideal candidate to achieve this: antigen-antibody interaction for matching and antibody-antibody interaction for diversity. Computational results are presented in support of this conjecture and compared to those found by other CF techniques. Notes: Uwe Aickelin, University of the West of England, Coldharbour Lane, Bristol, BS16 1QY, UK

Relevance: 30.00%

Abstract:

The immune system is a complex biological system with a highly distributed, adaptive and self-organising nature. This paper presents an Artificial Immune System (AIS) that exploits some of these characteristics and is applied to the task of film recommendation by Collaborative Filtering (CF). Natural evolution and in particular the immune system have not been designed for classical optimisation. However, for this problem, we are not interested in finding a single optimum. Rather we intend to identify a sub-set of good matches on which recommendations can be based. It is our hypothesis that an AIS built on two central aspects of the biological immune system will be an ideal candidate to achieve this: Antigen-antibody interaction for matching and idiotypic antibody-antibody interaction for diversity. Computational results are presented in support of this conjecture and compared to those found by other CF techniques.

Relevance: 30.00%

Abstract:

Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The nondestructive FWD test drops weights on the pavement to simulate traffic loads and measures the resulting pavement deflection basins. Backcalculation of pavement geometry and layer properties using FWD deflections is a difficult inverse problem, and the solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), as well as the use of numerical analysis techniques to properly simulate the geomechanical system. A widely used layered pavement analysis program, ILLI-PAVE, was employed in the analyses of various flexible pavement types, including full-depth asphalt and conventional flexible pavements, built on either lime stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier or SOFTSYS, was developed. In SOFTSYS, ANNs were used as surrogate models to provide faster approximations of the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop SOFTSYS models. The solution of the inverse problem for multi-layered pavements is computationally hard to achieve and is often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of the deflections to some layer properties lowered SOFTSYS model performance. Still, SOFTSYS models were shown to work effectively with the synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For SOFTSYS validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some of the very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime stabilized soils and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results. The thickness data obtained from Ground Penetrating Radar testing matched reasonably well with predictions from SOFTSYS models. The differences observed in the HMA and lime stabilized soil layer thicknesses were attributed to deflection data variability from FWD tests.
The backcalculated asphalt concrete layer thickness results matched better in the case of full-depth asphalt flexible pavements built on lime stabilized soils compared to conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field constructed asphalt layer thicknesses.
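
A minimal sketch of the surrogate-plus-evolutionary-search idea behind such backcalculation: a forward model maps layer properties to a deflection basin, and an evolutionary optimiser searches for the properties whose predicted basin matches the measured one. The forward model below is a crude placeholder standing in for a trained ANN surrogate of ILLI-PAVE, and scipy's differential evolution stands in for the GA; this is not SOFTSYS.

```python
# Minimal sketch of surrogate-based backcalculation of pavement properties
# from an FWD deflection basin. The forward model is a crude placeholder,
# not ILLI-PAVE or an ANN trained on it, and scipy's differential evolution
# stands in for the genetic algorithm.
import numpy as np
from scipy.optimize import differential_evolution

SENSOR_OFFSETS_MM = np.array([0, 300, 600, 900, 1200, 1500])

def forward_deflections(params):
    """Placeholder forward model: basin amplitude falls with composite stiffness."""
    e_hma, e_base, e_subgrade, hma_thickness_mm = params  # MPa, MPa, MPa, mm
    composite = e_hma * hma_thickness_mm / 1000.0 + 0.5 * e_base + 2.0 * e_subgrade
    return 2.0e5 / composite * np.exp(-SENSOR_OFFSETS_MM / 900.0)  # microns

# "Measured" basin, generated here from known target parameters.
target_params = np.array([3000.0, 300.0, 80.0, 200.0])
measured = forward_deflections(target_params)

def misfit(params):
    """Sum of squared differences between predicted and measured deflections."""
    return float(np.sum((forward_deflections(params) - measured) ** 2))

bounds = [(1000, 6000), (100, 600), (20, 200), (100, 400)]
result = differential_evolution(misfit, bounds, seed=0, tol=1e-10)
print("backcalculated parameters:", result.x.round(1))
print("final misfit:", round(result.fun, 8))
```

Because the placeholder model lumps the layers into one composite stiffness, several parameter combinations reproduce the same basin; the optimiser returns only one of them, which loosely mirrors the non-uniqueness and insensitivity issues discussed in the abstract.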

Relevance: 30.00%

Abstract:

With the ever-growing number of connected sensors (IoT), making sense of sensed data becomes even more important. Pervasive computing is a key enabler for sustainable solutions; prominent examples are smart energy systems and decision support systems. A key feature of pervasive systems is situation awareness, which allows a system to thoroughly understand its environment. It is based on external interpretation of data and thus relies on expert knowledge. Due to the distinct nature of situations in different domains and applications, the development of situation-aware applications remains a complex process. This thesis is concerned with a general framework for situation awareness which simplifies the development of applications. It is based on the Situation Theory Ontology to provide a foundation for situation modelling which allows knowledge reuse. Concepts of Situation Theory are mapped to Context Space Theory, which is used for situation reasoning. Situation Spaces in the Context Space are automatically generated from the defined knowledge. For the acquisition of sensor data, the IoT standards O-MI/O-DF are integrated into the framework. These allow a peer-to-peer data exchange between data publishers and the proposed framework, and thus a platform-independent subscription to sensed data. The framework is then applied to a use case to reduce food waste. The use case validates the applicability of the framework and furthermore serves as a showcase for a pervasive system contributing to the sustainability goals. Leading institutions, e.g. the United Nations, stress the need for a more resource-efficient society and acknowledge the capability of ICT systems. The use case scenario is based on a smart neighbourhood in which the system recommends the most efficient use of food items through situation awareness to reduce food waste at the consumption stage.
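
A minimal sketch of situation reasoning in the spirit of Context Space Theory, in which a situation space is a set of acceptable regions over context attributes and a context state is assigned a confidence of matching that situation; the attribute names, ranges and weights below are illustrative and are not taken from the thesis or its ontology.

```python
# Minimal sketch of Context-Space-style situation reasoning: a situation
# space is a set of acceptable regions over context attributes; confidence
# is a weighted average of per-attribute memberships. Names, ranges and
# weights are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class AcceptableRegion:
    low: float
    high: float
    weight: float = 1.0

    def contains(self, value: float) -> float:
        """Crisp membership: 1.0 inside the region, 0.0 outside."""
        return 1.0 if self.low <= value <= self.high else 0.0

# Hypothetical situation space: "food item at risk of being wasted".
FOOD_AT_RISK = {
    "days_until_expiry":  AcceptableRegion(0, 2, weight=2.0),
    "fridge_temperature": AcceptableRegion(7, 25, weight=1.0),
    "household_demand":   AcceptableRegion(0, 0.3, weight=1.0),
}

def situation_confidence(state: dict, situation: dict) -> float:
    """Weighted average membership of the context state in the situation space."""
    total_weight = sum(r.weight for r in situation.values())
    score = sum(r.weight * r.contains(state[attr]) for attr, r in situation.items())
    return score / total_weight

# Context state as it might be assembled from (e.g. O-MI/O-DF) subscriptions.
state = {"days_until_expiry": 1, "fridge_temperature": 9, "household_demand": 0.1}
print("confidence:", situation_confidence(state, FOOD_AT_RISK))
```

An application would trigger a recommendation (e.g. "use this item today") once the confidence for the relevant situation space exceeds a chosen threshold, which is how situation awareness feeds the food-waste use case described above.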

Relevance: 30.00%

Abstract:

Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past 5 years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single cell functional proteomics, focusing on the development of the single cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.

The thesis begins by discussing a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells; this chip is the prototype for subsequent proteomic microchips with more sophisticated designs used in preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape the way of thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, will follow.

The SCBC is a powerful tool to resolve the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We will demonstrate this point by applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).

The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn confer robustness on the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can plausibly be applied, using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we will focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.

The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipate therapy resistance and to identify effective therapy combinations will be discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by targeted inhibitors. Strongly coupled protein-protein interactions constitute most signaling cascades. A physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Similar to decomposing the atomic interactions into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can also be achieved by diagonalizing protein-protein correlation or covariance matrices to decompose the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e. independent signaling modes). By doing so, two independent signaling modes have been resolved, one associated with mTOR signaling and a second associated with ERK/Src signaling; these in turn allow us to anticipate resistance and to design combination therapies that are effective, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out.
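
A minimal sketch of the mode-decomposition step: the covariance matrix of single-cell phosphoprotein measurements is diagonalized, and each dominant eigenvector is read as an independent signaling mode (a linear combination of proteins). The data here are random placeholders, not SCBC measurements.

```python
# Minimal sketch of decomposing single-cell phosphoprotein data into
# independent signaling modes by diagonalizing the protein-protein
# covariance matrix, analogous to normal-mode analysis of a coupled system.
# The data are random placeholders, not SCBC measurements.
import numpy as np

rng = np.random.default_rng(0)
proteins = ["p-mTOR", "p-S6K", "p-ERK", "p-Src"]

# Rows = single cells, columns = proteins; two hidden correlated groups.
n_cells = 500
mtor_axis = rng.normal(size=n_cells)
erk_axis = rng.normal(size=n_cells)
data = np.column_stack([
    mtor_axis + 0.1 * rng.normal(size=n_cells),   # p-mTOR
    mtor_axis + 0.1 * rng.normal(size=n_cells),   # p-S6K
    erk_axis  + 0.1 * rng.normal(size=n_cells),   # p-ERK
    erk_axis  + 0.1 * rng.normal(size=n_cells),   # p-Src
])

cov = np.cov(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)   # ascending eigenvalues

# The largest eigenvalues dominate; each eigenvector is a signaling mode.
for idx in np.argsort(eigenvalues)[::-1][:2]:
    loadings = ", ".join(f"{p}: {w:+.2f}" for p, w in zip(proteins, eigenvectors[:, idx]))
    print(f"mode (variance {eigenvalues[idx]:.2f}): {loadings}")
```

In this toy example the two recovered modes separate the mTOR-like and ERK/Src-like groups of proteins, which is the same kind of separation the thesis uses to anticipate resistance and design combination therapies.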

In the last part, some preliminary results on the clinical translation of single-cell proteomics chips will be presented. The successful demonstration of our work on human-derived xenografts provides the rationale to extend our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of the clinical translation will be presented, and our solutions to address these challenges will be discussed as well. A clinical case study will then follow, in which some preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor will be presented to demonstrate the general protocol and the workflow of the proposed clinical studies.

Relevance: 30.00%

Abstract:

Ligand-protein docking is an optimization problem based on predicting the position of the ligand with the lowest binding energy in the active site of the receptor. Molecular docking problems are traditionally tackled with single-objective, as well as with multi-objective, approaches to minimize the binding energy. In this paper, we propose a novel multi-objective formulation that considers the Root Mean Square Deviation (RMSD) of the ligand coordinates and the binding (intermolecular) energy as two objectives for evaluating the quality of ligand-protein interactions. To determine the kind of Pareto front approximations that can be obtained, we selected a set of representative multi-objective algorithms: NSGA-II, SMPSO, GDE3, and MOEA/D. Their performance has been assessed by applying two main quality indicators intended to measure the convergence and diversity of the fronts. In addition, a comparison with LGA, a reference single-objective evolutionary algorithm for molecular docking (AutoDock), is carried out. In general, SMPSO shows the best overall results in terms of energy and RMSD (values lower than 2 Å for successful docking results). This new multi-objective approach shows an improvement in ligand-protein docking predictions that could be promising for in silico docking studies to select new anticancer compounds for therapeutic targets that are multidrug resistant.
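
A minimal sketch of the bi-objective view: each candidate pose is scored by intermolecular energy and RMSD, and the non-dominated (Pareto-optimal) subset is extracted. The poses here are synthetic placeholders; in a real study they would come from a docking engine such as AutoDock and an optimiser such as SMPSO or NSGA-II.

```python
# Minimal sketch of the bi-objective docking view: candidate poses scored by
# (intermolecular energy, RMSD), both to be minimised, with a simple Pareto
# filter. Poses and scores are synthetic placeholders, not docking output.
import numpy as np

rng = np.random.default_rng(42)

# Column 0: binding energy (kcal/mol, lower is better).
# Column 1: RMSD to reference coordinates (Angstrom, lower is better).
poses = np.column_stack([
    rng.uniform(-12.0, -4.0, size=50),
    rng.uniform(0.5, 6.0, size=50),
])

def pareto_front(objectives):
    """Indices of non-dominated solutions when minimising both objectives."""
    front = []
    for i, p in enumerate(objectives):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(objectives) if j != i
        )
        if not dominated:
            front.append(i)
    return front

for i in sorted(pareto_front(poses), key=lambda i: poses[i, 0]):
    energy, rmsd = poses[i]
    tag = "(success)" if rmsd < 2.0 else ""   # < 2 A counted as successful docking
    print(f"energy {energy:6.2f} kcal/mol, RMSD {rmsd:4.2f} A {tag}")
```

The front exposes the trade-off the paper exploits: low-energy poses with RMSD below 2 Å are the ones counted as successful docking results.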

Relevance: 30.00%

Abstract:

Current institutions, research, and legislation have not yet been sufficient to achieve the level of conservation of Nature required by society. One of the reasons that explain this relative failure is the lack of incentives to motivate local individuals, and Nature users in general, to adopt behaviour compliant with sustainable uses of Nature. Economists believe that, from the welfare point of view, pricing is the most efficient way to make economic actors take more environmentally friendly decisions. In this paper we will discuss how efficient, in terms of maximising welfare, the act of pricing the recreational use of a specific natural area can be. The main conservation issues involved in pricing recreational use, as well as the conditions under which pricing will be an efficient and fair instrument for the natural area, will be outlined. We will conclude two things. Firstly, from the rational utilitarian economic behaviour point of view, economic efficiency can only be achieved if the natural area has positive and known marginal recreation costs over the relevant range of the Marshallian recreation demand curve and if managing the price system is not costly. Secondly, in order to guarantee equity for the different types of visitors when charging the fee, it is necessary to discuss differential price systems. We shall see that even if marginal recreation costs exist but are unknown, pricing recreation is still an equity instrument and a useful one from the conservation perspective, as we shall demonstrate through an empirical application to the Portuguese National Park. An individual Travel Cost Method approach will be used to estimate the recreation price, which will be set equal to the visitor’s marginal willingness to pay for a day of visiting the national park. Although not efficient, under certain conditions this can be considered a fair pricing practice, because some of the negative recreation externalities will be internalised. We shall discuss the conditions that guarantee equity in charging for the Portuguese case.
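
A minimal sketch of the individual travel cost method under a standard semilog (Poisson) demand specification, in which the per-trip consumer surplus equals -1/β on travel cost and is read as the marginal willingness to pay for a visit day; the data are synthetic placeholders, not the Portuguese National Park survey.

```python
# Minimal sketch of the individual travel cost method: trips per visitor are
# regressed on travel cost with a Poisson (semilog) demand model, and the
# per-trip consumer surplus -1/beta_travel_cost is read as the marginal
# willingness to pay for a visit day. Data are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 400
travel_cost = rng.uniform(5, 80, size=n)    # EUR per trip
income = rng.uniform(800, 3000, size=n)     # EUR per month

# Synthetic semilog demand: trips fall with travel cost, rise with income.
mean_trips = np.exp(1.2 - 0.03 * travel_cost + 0.0002 * income)
trips = rng.poisson(mean_trips)

X = sm.add_constant(np.column_stack([travel_cost, income]))
model = sm.GLM(trips, X, family=sm.families.Poisson()).fit()

beta_tc = model.params[1]                   # coefficient on travel cost
consumer_surplus_per_trip = -1.0 / beta_tc
print(f"estimated travel-cost coefficient: {beta_tc:.4f}")
print(f"consumer surplus (WTP) per visit:  {consumer_surplus_per_trip:.1f} EUR")
```

Setting the entrance fee at or below this estimated marginal willingness to pay is the kind of pricing rule the empirical application discusses; whether it is also efficient depends on the marginal recreation costs described above.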

Relevance: 30.00%

Abstract:

Obstacle courses are an activity that brings a group of people together, in which teams are organized to navigate an established route during a set period of time, completing tasks (clues, riddles, or challenges) to meet a defined goal. Rules and safety norms are explained to all participants; however, they are not informed of the location of the clues, riddles, or challenges. The following should be considered when organizing an obstacle course: the objective, the topic to be developed, the location, the materials needed, the clues, riddles or challenges that may be included, and how to verify that all teams pass the checkpoints. As in any other activity with a touch of competitiveness, fair play and respect should be above any other interest. If, for any reason, any of the teams has an emergency, solidarity should prevail, and the activity can be used to teach values. An adventurous spirit is also essential in this activity: the desire for the unknown and for the new challenges both individuals and groups. This activity helps groups of friends, children, adults, families, etc. share a nice and healthy day together in contact with nature, rescuing concepts such as cooperation, cleverness and, particularly, teamwork.

Relevance: 30.00%

Abstract:

CONTEXT: No study has documented how symptomatic morbidity varies across the body mass index (BMI) spectrum (underweight, normal weight, overweight and obese) or across the entire child and adolescent age range. OBJECTIVE: To (1) quantify physical and psychosocial morbidities experienced by 2-18-year-olds according to BMI status and (2) explore morbidity patterns by age. DESIGN, SETTING AND PARTICIPANTS: Cross-sectional data from two Australian population studies (the Longitudinal Study of Australian Children and the Health of Young Victorians Study) were collected during 2000-2006. Participants were grouped into five age bands: 2-3 (n=4606), 4-5 (n=4983), 6-7 (n=4464), 8-12 (n=1541) and 13-18 (n=928) years. MAIN MEASURES: Outcomes: parent- and self-reported global health; physical, psychosocial and mental health; special health-care needs; wheeze; asthma and sleep problems. Exposure: measured BMI (kg m^-2) categorised using standard international cutpoints. ANALYSES: The variation in comorbidities across BMI categories within and between age bands was examined using linear and logistic regression models. RESULTS: Comorbidities varied with BMI category for all outcomes except sleep problems, generally showing the highest levels for the obese category. However, patterns differed markedly between age groups. In particular, poorer global health and special health-care needs were associated with underweight in young children, but with obesity in older children. Prevalence of poorer physical health varied little by BMI in 2-5-year-olds, but from 6 to 7 years was increasingly associated with obesity. Normal-weight children tended to experience the best psychosocial and mental health, with little evidence that the U-shaped associations of these variables with BMI status varied by age. Wheeze and asthma increased slightly with BMI at all ages. CONCLUSIONS: Deviation from normal weight is associated with health differences in children and adolescents that vary by morbidity and age. As well as lowering risks for later disease, promoting normal body weight appears central to improving the health and well-being of the young.