991 results for Correlated inventory models


Relevance:

30.00%

Publisher:

Abstract:

In recent years, UK industry has seen an explosive growth in the number of 'Computer Aided Production Management' (CAPM) system installations. Of the many CAPM systems, materials requirement planning/manufacturing resource planning (MRP/MRPII) is the most widely implemented. Despite the huge investments in MRP systems, over 80 percent are said to have failed within 3 to 5 years of installation. Many people now assume that Just-In-Time (JIT) is the best manufacturing technique. However, those who have implemented JIT have found that it also has many problems. The author argues that the success of a manufacturing company will not be due to a system which complies with a single technique, but to the integration of many techniques and the ability to make them complement each other in a specific manufacturing environment. This dissertation examines the potential for integrating MRP with JIT and Two-Bin systems to reduce the operational costs involved in managing bought-out inventory. Within this framework it shows that controlling MRP is essential to facilitate the integration process. The behaviour of MRP systems depends on the complex interactions between the numerous control parameters used. Methodologies/models are developed to set these parameters. The models are based on the Pareto principle. The idea is to use business targets to set a coherent set of parameters, which not only enables those business targets to be realised, but also facilitates JIT implementation. The approach is illustrated in the context of an actual manufacturing plant, IBM Havant (a high volume electronics assembly plant with the majority of its materials bought out). The parameter-setting models are applicable to the control of bought-out items in a wide range of industries and are not dependent on specific MRP software. The models have produced successful results in several companies and are now being developed as commercial products.
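
As a simple illustration of the Pareto idea behind these parameter-setting models, the sketch below classifies bought-out items into A/B/C bands by annual usage value and maps each band to an assumed order frequency. The breakpoints, the class-to-frequency mapping and all item data are invented for demonstration; this is not the dissertation's model.

    # Illustrative ABC (Pareto) classification of bought-out items by annual
    # usage value, suggesting how business targets might drive MRP parameters.
    # The 80%/95% breakpoints and the class-to-order-frequency mapping are
    # assumptions for demonstration only.

    def abc_classify(items):
        """items: list of (part_number, annual_usage_value). Returns part -> class."""
        total = sum(value for _, value in items)
        ranked = sorted(items, key=lambda x: x[1], reverse=True)
        classes, cumulative = {}, 0.0
        for part, value in ranked:
            cumulative += value
            share = cumulative / total
            classes[part] = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")
        return classes

    # Hypothetical mapping from class to an MRP control parameter (orders per year).
    ORDER_FREQUENCY = {"A": 12, "B": 4, "C": 1}

    if __name__ == "__main__":
        demo = [("P1", 50000.0), ("P2", 30000.0), ("P3", 15000.0),
                ("P4", 4000.0), ("P5", 1000.0)]
        for part, cls in abc_classify(demo).items():
            print(part, cls, ORDER_FREQUENCY[cls], "orders/year")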

Relevance:

30.00%

Publisher:

Abstract:

This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order and in what quantity are the controllable or independent variables in cost expressions which are minimised. The four systems considered are referred to as (Q,R), (nQ,R,T), (M,T) and (M,R,T). With (Q,R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus stock on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ,R,T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. With (M,T), each order increases the order cover to M. Finally, with (M,R,T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q,R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models could be compared with the increases in computational costs. Since the exact model was preferable for the (Q,R) system, only exact models were derived for the other three systems. Several methods of optimisation were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. The shortage cost is a function of three items, one of which, the backorder cost, may be a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is allowed to follow a probability distribution. All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
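
For orientation, the ordering decisions implied by the four policies can be sketched as follows; the function names and parameter values are illustrative assumptions, not code from the thesis.

    # Minimal sketch of the ordering decisions under the four review policies
    # described above: (Q,R), (nQ,R,T), (M,T) and (M,R,T). "cover" is stock in
    # hand plus stock on order; all parameter values below are placeholders.
    import math

    def order_qr(cover, Q, R):
        """(Q,R): continuous review; order Q whenever cover <= R."""
        return Q if cover <= R else 0

    def order_nqrt(cover, Q, R):
        """(nQ,R,T): at a review, order the smallest integer multiple of Q
        that raises cover strictly above R (nothing if cover already exceeds R)."""
        if cover > R:
            return 0
        n = math.floor((R - cover) / Q) + 1
        return n * Q

    def order_mt(cover, M):
        """(M,T): at every review, order up to M."""
        return max(M - cover, 0)

    def order_mrt(cover, M, R):
        """(M,R,T): at a review, order up to M only if cover <= R."""
        return M - cover if cover <= R else 0

    if __name__ == "__main__":
        cover = 35
        print(order_qr(cover, Q=50, R=40),
              order_nqrt(cover, Q=50, R=40),
              order_mt(cover, M=120),
              order_mrt(cover, M=120, R=40))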

Relevance:

30.00%

Publisher:

Abstract:

Analysis of the use of ICT in the aerospace industry has prompted the detailed investigation of an inventory-planning problem. There is a special class of inventory, consisting of expensive repairable spares for use in support of aircraft operations. These items, called rotables, are not well served by conventional theory and systems for inventory management. The context of the problem, the aircraft maintenance industry sector, is described in order to convey some of its special characteristics in the context of operations management. A literature review is carried out to seek existing theory that can be applied to rotable inventory and to identify a potential gap into which newly developed theory could contribute. Current techniques for rotable planning are identified in industry and the literature: these methods are modelled and tested using inventory and operational data obtained in the field. In the expectation that current practice leaves much scope for improvement, several new models are proposed. These are developed and tested on the field data for comparison with current practice. The new models are revised following testing to give improved versions. The best model developed and tested here comprises a linear programming optimisation, which finds an optimal level of inventory for multiple test cases, reflecting changing operating conditions. The new model offers an inventory plan that is up to 40% less expensive than that determined by current practice, while maintaining required performance.
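
To illustrate the kind of linear-programming formulation referred to above, the toy sketch below chooses continuous stock levels for a few rotable parts so as to minimise inventory spend while covering the repair pipeline in each of several operating scenarios. The part data, the scenarios and the aggregate-cover target are invented, and the formulation is deliberately simplified; it is not the optimisation model developed in the thesis.

    # Toy linear-programming sketch in the spirit of the rotable-planning model
    # described above. All data and constraints are invented for illustration.
    import numpy as np
    from scipy.optimize import linprog

    unit_cost = np.array([120.0, 45.0, 80.0])       # cost per unit, 3 rotable parts
    pipeline = np.array([[4, 2, 5],                 # scenario 1 pipeline demand
                         [6, 1, 3],                 # scenario 2
                         [3, 3, 4]])                # scenario 3

    n_parts = unit_cost.size
    # (a) per-part cover in every scenario: s_i >= max_j pipeline[j, i]
    lower_bounds = pipeline.max(axis=0)
    bounds = [(lb, None) for lb in lower_bounds]
    # (b) aggregate fleet cover: total stock >= 15 units (assumed target),
    #     written in linprog's A_ub @ s <= b_ub form.
    A_ub = np.array([[-1.0] * n_parts])
    b_ub = np.array([-15.0])

    res = linprog(c=unit_cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print("optimal stock levels:", res.x, "cost:", res.fun)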

Relevance:

30.00%

Publisher:

Abstract:

Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organisation. Achieving the desired level of commitment from top management can, however, be difficult. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution, in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues. It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry, where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both the design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed.
This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers, they develop an analytical model which is applicable to the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with, and can be instrumental in, moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars, who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern.
Together, however, they also argue for or demonstrate the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.

Relevance:

30.00%

Publisher:

Abstract:

Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators to act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models. © 2013 Elsevier Inc. All rights reserved.
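
As a point of reference only (standard results, not equations reproduced from the article), the design problem can be framed through the Fisher information of the covariance parameters; for a zero-mean Gaussian process with covariance matrix \(K_\theta\) evaluated at the design points,

\[
  \mathcal{I}(\theta)_{ij}
  \;=\; \tfrac{1}{2}\,\operatorname{tr}\!\left(
    K_{\theta}^{-1}\,\frac{\partial K_{\theta}}{\partial \theta_i}\,
    K_{\theta}^{-1}\,\frac{\partial K_{\theta}}{\partial \theta_j}
  \right),
  \qquad
  \operatorname{Var}(\hat{\theta}) \;\succeq\; \mathcal{I}(\theta)^{-1}
  \quad \text{(Cram\'er-Rao, asymptotically)},
\]

so a design, including any replicate points, can be chosen to minimise, for example, \(\operatorname{tr}\!\big[\mathcal{I}(\theta)^{-1}\big]\), an A-optimality criterion. Broadly, the heteroscedastic setting studied in the article lets the noise contribution to \(K_\theta\) depend on the inputs.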

Relevance:

30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: Primary 90C31. Secondary 62C12, 62P05, 93C41.

Relevance:

30.00%

Publisher:

Abstract:

Analysis of risk measures associated with price series data movements and their prediction is of strategic importance in the financial markets as well as to policy makers, in particular for short- and long-term planning in setting economic growth targets. For example, oil-price risk management focuses primarily on when and how an organization can best prevent costly exposure to price risk. Value-at-Risk (VaR) is the commonly practised instrument to measure risk and is evaluated by analysing the negative/positive tail of the probability distributions of the returns (profit or loss). In modelling applications, least-squares estimation (LSE)-based linear regression models are often employed for modelling and analysing correlated data. These linear models are optimal and perform relatively well under conditions such as errors following normal or approximately normal distributions, being free of large outliers and satisfying the Gauss-Markov assumptions. However, often in practical situations the LSE-based linear regression models fail to provide optimal results, for instance in non-Gaussian situations, especially when the errors follow distributions with fat tails and the error terms possess a finite variance. This is the situation in risk analysis, which involves analysing tail distributions. Thus, applications of the LSE-based regression models may be questioned for appropriateness and may have limited applicability. We have carried out a risk analysis of Iranian crude oil price data based on Lp-norm regression models and have noted that the LSE-based models do not always perform best. We discuss results from the L1, L2 and L∞-norm based linear regression models. ACM Computing Classification System (1998): B.1.2, F.1.3, F.2.3, G.3, J.2.
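
A minimal sketch of the Lp-norm comparison is given below, fitting a straight line under L1, L2 and L∞ losses to synthetic fat-tailed data; the data, the line model and the use of a general-purpose optimiser are illustrative assumptions rather than the study's estimation procedure.

    # Sketch of fitting a straight line under different Lp-norm loss functions
    # (L1, L2 and L-infinity). Synthetic fat-tailed data, for illustration only.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 60)
    y = 2.0 + 0.5 * x + rng.standard_t(df=3, size=x.size)   # fat-tailed noise

    def lp_loss(beta, p):
        resid = y - (beta[0] + beta[1] * x)
        if np.isinf(p):
            return np.max(np.abs(resid))        # L-infinity: minimise worst residual
        return np.sum(np.abs(resid) ** p)       # L1 or L2 (or any finite p)

    beta0 = np.array([0.0, 0.0])
    fits = {p: minimize(lp_loss, beta0, args=(p,), method="Nelder-Mead").x
            for p in (1, 2, np.inf)}
    for p, beta in fits.items():
        print(f"L{p}: intercept={beta[0]:.3f}, slope={beta[1]:.3f}")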

Relevance:

30.00%

Publisher:

Abstract:

The distinctive karstic, freshwater wetlands of the northern Caribbean and Central American region support the prolific growth of calcite-rich periphyton mats. Aside from the Everglades, very little research has been conducted in these karstic wetlands, which are increasingly threatened by eutrophication. This study sought to (i) test the hypothesis that water depth and periphyton total phosphorus (TP) content are both drivers of periphyton biomass in karstic wetland habitats in Belize, Mexico and Jamaica, (ii) provide a taxonomic inventory of the periphytic diatom species in these wetlands and (iii) examine the relationship between periphyton mat TP concentration and diatom assemblage at Everglades and Caribbean locations.

Periphyton biomass, nutrient and diatom assemblage data were generated from periphyton mat samples collected from shallow, marl-based wetlands in Belize, Mexico and Jamaica. These data were compared to a larger dataset collected from comparable sites within Everglades National Park. A diatom taxonomic inventory was conducted on the Caribbean samples, and a combination of ordination and weighted-averaging modelling techniques was used to compare relationships between periphyton TP concentration, periphyton biomass and diatom assemblage composition among the locations.

Within the Everglades, periphyton biomass showed a negative correlation with water depth and mat TP, while periphyton mat percent organic content was positively correlated with these two variables. These patterns were also exhibited within the Belize, Mexico and Jamaica locations, suggesting that water depth and periphyton TP content are both drivers of periphyton biomass in karstic wetland systems within the northern Caribbean region.

A total of 146 diatom species representing 39 genera were recorded from the three Caribbean locations, including a distinct core group of species that may be endemic to this habitat type. Weighted averaging models were produced that effectively predicted mat TP concentration from diatom assemblages for both Everglades (R² = 0.56) and Caribbean (R² = 0.85) locations. There were, however, significant differences among Everglades and Caribbean locations with respect to species TP optima and indicator species. This suggests that although diatoms are effective indicators of water quality in these wetlands, differences in species response to water quality changes can reduce the predictive power of these indices when applied across systems.
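
The weighted-averaging transfer-function idea used above can be sketched in a few lines: taxon optima are abundance-weighted means of the observed TP values, and TP at a new site is inferred as the abundance-weighted mean of those optima. The toy counts and TP values below are invented, and real applications also include deshrinking and cross-validation.

    # Minimal sketch of weighted-averaging (WA) regression and calibration,
    # the class of transfer-function technique referred to above.
    import numpy as np

    abund = np.array([[35, 10,  5],     # site 1: relative abundance of 3 taxa
                      [10, 40, 20],     # site 2
                      [ 5, 15, 60]])    # site 3
    tp = np.array([120.0, 300.0, 520.0])    # observed mat TP at each site (toy values)

    # WA regression: each taxon's TP optimum is the abundance-weighted mean TP.
    optima = (abund * tp[:, None]).sum(axis=0) / abund.sum(axis=0)

    # WA calibration: infer TP at a new site from its assemblage and the optima.
    new_site = np.array([20, 30, 10])
    inferred_tp = (new_site * optima).sum() / new_site.sum()
    print("taxon optima:", optima.round(1), "inferred TP:", round(inferred_tp, 1))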

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to explore the use of automated inventory management systems (IMS) and identify the stage of technology adoption for restaurants in Aruba. A case study analysis involving twelve members of the Aruba Gastronomic Association was conducted using a qualitative research design to gather information on the approaches currently used, as well as the reasons and perceptions managers/owners have for using or not using automated systems in their facilities. This is the first study conducted using the Aruba restaurant market; therefore, two technology adoption models were applied to integrate critical factors relevant to the study. Major findings indicated that the use of an automated IMS in restaurants is limited, thus underscoring the lack of adoption of technology in this area. The results also indicated two major reasons why restaurants are not adopting IMS technology: budgetary constraints and service support. This study is important for two reasons: (1) its results can be used as a comparison for future IMS adoption, not only for Aruba's restaurant industry but also for other Caribbean destinations and the U.S.; and (2) it provides insight into the additional training and support needed in hospitality technology services.

Relevance:

30.00%

Publisher:

Abstract:

The role of spirituality in leadership in business and other organizations has gained growing recognition. The purpose of this study was to explore the relationship between spirituality and nine selected transformational leadership practices. Community leaders (N = 138) in business, education, and other professions who were graduates of a 10-week leadership program, Leadership Fort Lauderdale, from 1994 to 2004 completed the Spirituality Assessment Scale (SAS), the Leadership Practices Inventory (LPI), and four transformational leadership items of the Multifactor Leadership Questionnaire (MLQ).

The predictor variables were participants' scores on the LPI and MLQ. The criterion variable was their score on the SAS. Stepwise multiple regression analysis was used to test the hypothesis that a combination of the nine selected transformational leadership practices would account for a significant portion of the variance of each of two spirituality measures. The Definitive and Correlated dimensions and the Total spirituality score of the SAS were used in the analysis.

Results showed that two of the LPI leadership practices were significantly related to spirituality. The variable Inspiring a Shared Vision accounted for 10% of the variance of the SAS Definitive dimension. The variable Encouraging the Heart accounted for 30% of the variance of the Correlated dimension. For the Total spirituality score, two models were revealed. In the first model, Encouraging the Heart accounted for 28% of the variance of the Total spirituality score. In the second model, Encouraging the Heart and Inspiring a Shared Vision together accounted for 31% of that variance. None of the transformational leadership practices from the MLQ were significantly related to spirituality.

The data partially support the hypothesis: two of the nine leadership variables did in combination correlate with leaders' spirituality. The results also support at least a partial relationship between spirituality and certain transformational leadership practices among leaders in various spheres, such as education, business, and other professions.

Relevance:

30.00%

Publisher:

Abstract:

Quantitative Structure-Activity Relationship (QSAR) modelling has been applied extensively in predicting the toxicity of Disinfection By-Products (DBPs) in drinking water. Among many toxicological properties, acute and chronic toxicities of DBPs have been widely used in health risk assessment of DBPs. These toxicities are correlated with molecular properties, which are in turn correlated with molecular descriptors. The primary goals of this thesis are: (1) to investigate the effects of molecular descriptors (e.g., chlorine number) on molecular properties such as the energy of the lowest unoccupied molecular orbital (ELUMO) via QSAR modelling and analysis; (2) to validate the models by using internal and external cross-validation techniques; and (3) to quantify the model uncertainties through Taylor and Monte Carlo simulation. One of the most important ways to predict molecular properties such as ELUMO is QSAR analysis. In this study, the number of chlorine atoms (NCl) and the number of carbon atoms (NC), as well as the energy of the highest occupied molecular orbital (EHOMO), are used as molecular descriptors. There are typically three approaches used in QSAR model development: (1) Linear or Multi-Linear Regression (MLR); (2) Partial Least Squares (PLS); and (3) Principal Component Regression (PCR). In QSAR analysis, a very critical step is model validation, after QSAR models are established and before applying them to toxicity prediction. The DBPs studied include five chemical classes: chlorinated alkanes, alkenes, and aromatics. In addition, validated QSARs are developed to describe the toxicity of selected groups (i.e., chloro-alkane and aromatic compounds with a nitro or cyano group) of DBP chemicals to three types of organisms (fish, T. pyriformis, and P. phosphoreum) based on experimental toxicity data from the literature. The results show that: (1) QSAR models to predict molecular properties built by MLR, PLS or PCR can be used either to select valid data points or to eliminate outliers; (2) the Leave-One-Out cross-validation procedure by itself is not enough to give a reliable representation of the predictive ability of QSAR models, but Leave-Many-Out/K-fold cross-validation and external validation can be applied together to achieve more reliable results; (3) ELUMO is shown to correlate highly with NCl for several classes of DBPs; and (4) according to uncertainty analysis using the Taylor method, the uncertainty of the QSAR models is contributed mostly by NCl for all DBP classes.
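
The cross-validation comparison described in result (2) can be sketched as follows for an MLR model; the synthetic descriptor matrix and response are placeholders, not DBP data, and Q² is computed from out-of-sample predictions.

    # Sketch of leave-one-out versus k-fold cross-validation for a multi-linear
    # regression (MLR) model predicting ELUMO from molecular descriptors.
    # Descriptor values are random placeholders, for illustration only.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, KFold, cross_val_predict

    rng = np.random.default_rng(1)
    n = 30
    X = np.column_stack([
        rng.integers(1, 6, n),               # NCl (number of chlorine atoms)
        rng.integers(1, 8, n),               # NC (number of carbon atoms)
        rng.normal(-9.0, 0.5, n),            # EHOMO (eV), synthetic
    ])
    # Synthetic "ELUMO" response correlated with NCl, plus noise.
    y = -0.3 * X[:, 0] + 0.05 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.1, n)

    def q2(y_true, y_pred):
        """Predictive Q^2 = 1 - PRESS / total sum of squares."""
        press = np.sum((y_true - y_pred) ** 2)
        tss = np.sum((y_true - y_true.mean()) ** 2)
        return 1.0 - press / tss

    model = LinearRegression()
    pred_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
    pred_k5 = cross_val_predict(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))
    print("LOO Q2:", round(q2(y, pred_loo), 3), " 5-fold Q2:", round(q2(y, pred_k5), 3))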

Relevance:

30.00%

Publisher:

Abstract:

We developed diatom-based prediction models of hydrology and periphyton abundance to inform assessment tools for a hydrologically managed wetland. Because hydrology is an important driver of ecosystem change, hydrologic alterations by restoration efforts could modify biological responses, such as periphyton characteristics. In karstic wetlands, diatoms are particularly important components of mat-forming calcareous periphyton assemblages that both respond and contribute to the structural organization and function of the periphyton matrix. We examined the distribution of diatoms across the Florida Everglades landscape and found hydroperiod and periphyton biovolume were strongly correlated with assemblage composition. We present species optima and tolerances for hydroperiod and periphyton biovolume, for use in interpreting the directionality of change in these important variables. Predictions of these variables were mapped to visualize landscape-scale spatial patterns in a dominant driver of change in this ecosystem (hydroperiod) and an ecosystem-level response metric of hydrologic change (periphyton biovolume). Specific diatom assemblages inhabiting periphyton mats of differing abundance can be used to infer past conditions and inform management decisions based on how assemblages are changing. This study captures diatom responses to wide gradients of hydrology and periphyton characteristics to inform ecosystem-scale bioassessment efforts in a large wetland.

Relevance:

30.00%

Publisher:

Abstract:

Preterm birth is a public health problem worldwide, with a growing global incidence, high mortality rates and a risk of long-term sequelae in the newborn. It also places a burden on the family and on society. Mothers of very low birth weight (VLBW) preterm infants may develop psychological disorders and impaired quality of life (QoL). Factors related to mothers and children in the postpartum period may be negatively associated with the QoL of these mothers. The aim of this study was to assess factors possibly associated with the QoL of mothers of VLBW preterm newborns during the first three years after birth. Mothers of VLBW preterm infants answered the World Health Organization Quality of Life (WHOQOL)-bref and the Beck Depression Inventory (BDI) at five time points up to 36 months postpartum, totalling 260 observations. The WHOQOL-bref scores were compared and correlated with sociodemographic and clinical variables of mothers and children at discharge (T0) and at six (T1), twelve (T2), 24 (T3) and 36 (T4) months after delivery. We used the Kruskal-Wallis test to compare scores across time points and correlated WHOQOL-bref scores with the sociodemographic and clinical variables of mothers and preterm infants. Multiple linear regression models were used to evaluate the contribution of these variables to the QoL of mothers. The WHOQOL-bref scores at T1 and T2 were higher than those at T0 in the physical health dimension (p = 0.013). BDI scores were also higher at T1 and T2 than at T0 (p = 0.027). Among the variables that contributed most to the QoL of mothers were: at T0, stable marital union (b = 13.60; p = 0.000) in the social relationships dimension, gestational age (b = 2.38; p = 0.010) in the physical health dimension, and post-hemorrhagic hydrocephalus (b = -10.05; p = 0.010 and b = -12.18; p = 0.013, respectively) in the psychological dimension; at T1 and T2, bronchopulmonary dysplasia (BPD) (b = -7.41; p = 0.005) and female sex (b = 8.094; p = 0.011) in the physical health and environment dimensions, respectively; at T3, family income (b = -12.75; p = 0.001) in the environment dimension and the SNAPPE neonatal severity score (b = -0.23; p = 0.027) in the social relationships dimension; and at T4, evangelical religion (b = 8.11; p = 0.019) and post-hemorrhagic hydrocephalus (b = -18.84; p = 0.001) in the social relationships dimension. BDI scores were negatively associated with WHOQOL scores in all dimensions and at all time points (-1.42 ≤ b ≤ -0.36; T0 to T4). We conclude that mothers of VLBW preterm infants tend to show a transient improvement in physical well-being during the first postpartum year, and that their quality of life seems to return to discharge levels between two and three years after delivery. Maternal depressive symptoms and a diagnosis of post-hemorrhagic hydrocephalus or BPD are factors negatively associated with the QoL of mothers, while social, religious and economic variables are positively associated with the QoL of mothers of VLBW preterm infants.

Relevance:

30.00%

Publisher:

Abstract:

Forests change with changes in their environment based on the physiological responses of individual trees. These short-term reactions have cumulative impacts on long-term demographic performance. For a tree in a forest community, success depends on biomass growth to capture above- and belowground resources and reproductive output to establish future generations. Here we examine aspects of how forests respond to changes in moisture and light availability and how these responses are related to tree demography and physiology.

First we address the long-term pattern of tree decline before death and its connection with drought. Increasing drought stress and chronic morbidity could have pervasive impacts on forest composition in many regions. We use long-term, whole-stand inventory data from southeastern U.S. forests to show that trees exposed to drought experience multiyear declines in growth prior to mortality. Following a severe, multiyear drought, 72% of trees that did not recover their pre-drought growth rates died within 10 years. This pattern was mediated by local moisture availability. As an index of morbidity prior to death, we calculated the difference in cumulative growth after drought relative to surviving conspecifics. The strength of drought-induced morbidity varied among species and was correlated with species drought tolerance.

Next, we investigate differences among tree species in reproductive output relative to biomass growth with changes in light availability. Previous studies reach conflicting conclusions about the constraints on reproductive allocation relative to growth and how they vary through time, across species, and between environments. We test the hypothesis that canopy exposure to light, a critical resource, limits reproductive allocation by comparing long-term relationships between reproduction and growth for trees from 21 species in forests throughout the southeastern U.S. We found that species had divergent responses to light availability, with shade-intolerant species experiencing an alleviation of trade-offs between growth and reproduction at high light. Shade-tolerant species showed no changes in reproductive output across light environments.

Given that the above patterns depend on the maintenance of transpiration, we next developed an approach for predicting whole-tree water use from sap flux observations. Accurately scaling these observations to tree- or stand-levels requires accounting for variation in sap flux between wood types and with depth into the tree. We compared different models with sap flux data to test the hypotheses that radial sap flux profiles differ by wood type and tree size. We show that radial variation in sap flux is dependent on wood type but independent of tree size for a range of temperate trees. The best-fitting model predicted out-of-sample sap flux observations and independent estimates of sapwood area with small errors, suggesting robustness in new settings. We outline a method for predicting whole-tree water use with this model and include computer code for simple implementation in other studies.
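
A minimal sketch of the scaling step described above is given below: a radial profile of sap flux density is assumed (here Gaussian in depth below the cambium) and integrated over concentric sapwood annuli to give whole-tree water use. The profile shape, probe depth and parameter values are illustrative assumptions, not the fitted model or the code accompanying the study.

    # Illustrative scaling of a point measurement of sap flux density to
    # whole-tree water use by integrating an assumed radial profile over
    # the sapwood annulus. All parameter values are placeholders.
    import numpy as np

    def radial_profile(depth, flux_at_probe, peak_depth=0.01, spread=0.02):
        """Sap flux density (g m^-2 s^-1) as a Gaussian function of depth (m)
        below the cambium, scaled to match the measurement at the probe (2 cm)."""
        shape = np.exp(-0.5 * ((depth - peak_depth) / spread) ** 2)
        probe_shape = np.exp(-0.5 * ((0.02 - peak_depth) / spread) ** 2)
        return flux_at_probe * shape / probe_shape

    def whole_tree_water_use(flux_at_probe, stem_radius, sapwood_depth, n=200):
        """Integrate flux density over concentric sapwood annuli; returns g s^-1."""
        depths = np.linspace(0.0, sapwood_depth, n)
        dz = depths[1] - depths[0]
        density = radial_profile(depths, flux_at_probe)        # g m^-2 s^-1
        circumference = 2.0 * np.pi * (stem_radius - depths)   # ring length (m) at each depth
        return float(np.sum(density * circumference) * dz)     # g s^-1

    # Example: 0.25 m stem radius, 0.06 m sapwood depth, 20 g m^-2 s^-1 at the probe.
    print(round(whole_tree_water_use(20.0, 0.25, 0.06), 2), "g s^-1")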

Finally, we estimated tree water balances during drought with a statistical time-series analysis. Moisture limitation in forest stands comes predominantly from water use by the trees themselves, a drought-stand feedback. We show that drought impacts on tree fitness and forest composition can be predicted by tracking the moisture reservoir available to each tree in a mass balance. We apply this model to multiple seasonal droughts in a temperate forest with measurements of tree water use to demonstrate how species and size differences modulate moisture availability across landscapes. As trees deplete their soil moisture reservoir during droughts, a transpiration deficit develops, leading to reduced biomass growth and reproductive output.

This dissertation draws connections between the physiological condition of individual trees and their behavior in crowded, diverse, and continually changing forest stands. The analyses take advantage of growing data sets on both the physiology and demography of trees as well as novel statistical techniques that allow us to link these observations to realistic quantitative models. The results can be used to scale up tree measurements to entire stands and address questions about the future composition of forests and the land's balance of water and carbon.

Relevance:

30.00%

Publisher:

Abstract:

Key life history traits such as breeding time and clutch size are frequently both heritable and under directional selection, yet many studies fail to document micro-evolutionary responses. One general explanation is that selection estimates are biased by the omission of correlated traits that have causal effects on fitness, but few valid tests of this exist. Here we show, using a quantitative genetic framework and six decades of life-history data on two free-living populations of great tits Parus major, that selection estimates for egg-laying date and clutch size are relatively unbiased. Predicted responses to selection based on the Robertson-Price Identity were similar to those based on the multivariate breeder’s equation, indicating that unmeasured covarying traits were not missing from the analysis. Changing patterns of phenotypic selection on these traits (for laying date, linked to climate change) therefore reflect changing selection on breeding values, and genetic constraints appear not to limit their independent evolution. Quantitative genetic analysis of correlational data from pedigreed populations can be a valuable complement to experimental approaches to help identify whether apparent associations between traits and fitness are biased by missing traits, and to parse the roles of direct versus indirect selection across a range of environments.
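
For orientation (standard quantitative-genetics expressions, not equations quoted from the paper), the two prediction routes being compared are the multivariate breeder's equation and the Robertson-Price identity (the secondary theorem of selection):

\[
  \Delta \bar{\mathbf{z}} \;=\; \mathbf{G}\,\boldsymbol{\beta}
  \qquad \text{and} \qquad
  \Delta \bar{z} \;=\; \operatorname{cov}_{A}(w,\, z),
\]

where \(\mathbf{G}\) is the additive genetic (co)variance matrix, \(\boldsymbol{\beta}\) the vector of selection gradients, \(w\) relative fitness, and \(\operatorname{cov}_{A}(w, z)\) the additive genetic covariance between relative fitness and the trait. Close agreement between the two predictions, as reported here, indicates that unmeasured correlated traits are not biasing the selection estimates.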