127 results for APPORTIONMENT
Abstract:
We investigate the coevolution between philopatry and altruism in island-model populations when kin recognition occurs through phenotype matching. In saturated environments, a good discrimination ability is a necessary prerequisite for the emergence of sociality. Discrimination decreases not only with the average phenotypic similarity between immigrants and residents (i.e., with environmental homogeneity and past gene flow) but also with the sampling variance of similarity distributions (a negative function of the number of traits sampled). Whether discrimination should rely on genetically or environmentally determined traits depends on the apportionment of phenotypic variance and, in particular, on the relative values of e (the among-group component of environmental variance) and r (the among-group component of genetic variance, which also measures relatedness among group members). If r exceeds e, highly heritable cues do better. Discrimination and altruism, however, remain low unless philopatry is enforced by ecological constraints. If e exceeds r, by contrast, nonheritable traits do better. High e values improve discrimination drastically and thus have the potential to drive sociality, even in the absence of ecological constraints. The emergence of sociality thus can be facilitated by enhancing e, which we argue is the main purpose of cue standardization within groups, as observed in many social insects, birds, and mammals, including humans.
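To make the variance apportionment concrete, here is a minimal sketch (illustrative only; the simulation parameters and the function are not from the paper) that estimates the among-group fraction of phenotypic variance for two cue types and applies the abstract's decision rule, comparing r (heritable cue) against e (environmentally standardized cue):

```python
import numpy as np

def among_group_fraction(values, groups):
    """Among-group component of variance as a fraction of total variance
    (analogous to r for genetic cues and e for environmental ones)."""
    group_ids = np.unique(groups)
    group_means = np.array([values[groups == g].mean() for g in group_ids])
    among = ((group_means - values.mean()) ** 2).mean()
    return among / values.var()

rng = np.random.default_rng(0)
n_groups, n_per_group = 20, 30
groups = np.repeat(np.arange(n_groups), n_per_group)

# Heritable cue: modest among-group differentiation (hypothetical effect sizes)
heritable = rng.normal(0, 0.5, n_groups)[groups] + rng.normal(0, 1, groups.size)
# Standardized environmental cue: strong shared group signature
environmental = rng.normal(0, 2.0, n_groups)[groups] + rng.normal(0, 1, groups.size)

r = among_group_fraction(heritable, groups)
e = among_group_fraction(environmental, groups)
print(f"r = {r:.2f}, e = {e:.2f} ->",
      "nonheritable cues discriminate better" if e > r
      else "heritable cues discriminate better")
```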
Abstract:
Summary of the article: Koskela, L., Sinha, B. K. & Nummi, T. 2007. Some aspects of the sampling distribution of the apportionment index and related inference. Silva Fennica 41(4): 699–715.
Abstract:
The processes and sources that regulate the elemental composition of aerosol particles were investigated in both fine and coarse modes during the dry and wet seasons. One hundred and nine samples were collected at the Cuieiras biological reserve near Manaus from February to October 2008 and analyzed together with 668 samples previously collected at Balbina from 1998 to 2002. The particle-induced X-ray emission (PIXE) technique was used to determine the elemental composition, while black carbon concentrations were obtained from optical reflectance measurements. Absolute principal factor analysis and positive matrix factorization were performed for source apportionment, complemented by back-trajectory analysis. A regional identity for the natural biogenic aerosol was found for the Central Amazon Basin, which can be used in regional chemical-dynamical models.
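As an illustration of the receptor-modelling step, the sketch below uses non-negative matrix factorization as a simplified stand-in for positive matrix factorization (PMF additionally weights each entry by its measurement uncertainty); the species list, factor count, and synthetic data are invented for the example:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
elements = ["Al", "Si", "P", "S", "K", "Ca", "Fe", "BC"]  # illustrative species

# Synthetic samples-by-species matrix built from three hypothetical sources
profiles = rng.dirichlet(np.ones(len(elements)), size=3)   # source profiles
contributions = rng.gamma(2.0, 1.0, size=(109, 3))         # per-sample strengths
X = contributions @ profiles + rng.normal(0, 0.01, (109, len(elements))).clip(0)

# Plain NMF is the unweighted analogue of PMF, used here for brevity.
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)   # source contributions per sample
F = model.components_        # source profiles (species signatures)
print("Recovered profiles (rows normalized to relative composition):")
print(np.round(F / F.sum(axis=1, keepdims=True), 2))
```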
Abstract:
This thesis studies the use of heuristic algorithms in a number of combinatorial problems that occur in various resource-constrained environments. Such problems occur, for example, in manufacturing, where a restricted number of resources (tools, machines, feeder slots) are needed to perform some operations. Many of these problems turn out to be computationally intractable, and heuristic algorithms are used to provide efficient, yet sub-optimal solutions. The main goal of the present study is to build upon existing methods to create new heuristics that provide improved solutions for some of these problems. All of these problems occur in practice, and one of the motivations of our study was the request for improvements from industrial sources. We approach three different resource-constrained problems. The first is the tool switching and loading problem, which occurs especially in the assembly of printed circuit boards. This problem has to be solved when an efficient, yet small, primary storage is used to access resources (tools) from a less efficient (but unlimited) secondary storage area. We study various forms of the problem and provide improved heuristics for its solution. Second, the nozzle assignment problem is concerned with selecting a suitable set of vacuum nozzles for the arms of a robotic assembly machine. It turns out to be a special case of the MINMAX resource allocation formulation of the apportionment problem, and it can be solved efficiently and optimally. We construct an exact algorithm specialized for nozzle selection and provide a proof of its optimality. Third, the problem of feeder assignment and component tape construction arises when electronic components are inserted and certain component types cause tape movement delays that can significantly impact the efficiency of printed circuit board assembly. Here, careful selection of component slots in the feeder improves the tape movement speed. We provide a formal proof that this problem is of the same complexity as the turnpike problem (a well-studied geometric optimization problem), and give a heuristic algorithm for it.
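As a sketch of the MINMAX apportionment connection (workload figures and names are invented, and this generic greedy is not necessarily the exact algorithm constructed in the thesis), the following assigns a fixed number of placement arms among nozzle types so that the maximum per-arm workload is minimized, repeatedly granting the next arm to the currently most loaded type:

```python
import heapq

def minmax_apportion(workloads, n_arms):
    """Assign n_arms among nozzle types so that the maximum workload per
    arm is minimized. Each type gets at least one arm; every extra arm
    goes to the type whose per-arm workload is currently largest."""
    assert n_arms >= len(workloads)
    # heap entries: (-per_arm_load, type_index, arms_assigned_so_far)
    heap = [(-w, i, 1) for i, w in enumerate(workloads)]
    heapq.heapify(heap)
    for _ in range(n_arms - len(workloads)):
        load, i, a = heapq.heappop(heap)
        w = -load * a  # recover the type's total workload
        heapq.heappush(heap, (-(w / (a + 1)), i, a + 1))
    arms = [0] * len(workloads)
    while heap:
        _, i, a = heapq.heappop(heap)
        arms[i] = a
    return arms

# Hypothetical placement workloads (seconds) for four nozzle types
workloads = [120.0, 75.0, 30.0, 15.0]
arms = minmax_apportion(workloads, n_arms=8)
print(arms, "max per-arm load:", max(w / a for w, a in zip(workloads, arms)))
```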
Abstract:
Fresh egg-weights and feeding rates to chicks were related to chick survival as one means of quantifying apportionment of parental investment within broods of Caspian Terns (Sterna caspia) at a colony in Georgian Bay, Lake Huron, during 1978 and 1979. First-laid eggs from 2-egg clutches were significantly heavier and usually hatched one to three days earlier than second-laid eggs in both years of the study. In both years, first-hatched chicks were larger and generally better fed than second-hatched siblings. The disparity between feeding rates of first- and second-hatched chicks was greater in 1979. Brood feeding rates correlated positively with the percentage of food fed to the least-fed sibling through the period of B-chick ages zero to 10 days in 1978. I suggest that after this age period, parental control over which chick was fed diminished. In 1978, 10 of 16 second-hatched chicks were fed more than their older siblings during their first 5 days. This is interpreted as a parental response to reduce the competitive advantage of the larger first-hatched chicks. Most chick losses were apparently caused by starvation or predation. In 1979, second-hatched chick disappearance (due to predation) was related to low feeding rates, whereas first-hatched chick disappearance was related to low fresh egg-weights. First-hatched chicks survived better than second-hatched chicks in both years, and more pairs fledged two chicks in 1978. Maximum estimated feeding rates at the nest and fledging ages suggested that food was more available in 1978 than in 1979. In 1979, second eggs apparently functioned as "insurance" eggs. When the first-laid egg failed to hatch, or the first-hatched chick died, the second-hatched chick was often successfully fledged. When first-hatched chicks survived, the second-hatched chick usually starved or was preyed upon, reducing the brood to one chick. Parental investment patterns favored first-hatched chicks. Brood reduction, when employed, discouraged total nest failure; however, under appropriate conditions, brood reduction was avoided and full broods (of two chicks) were fledged.
Abstract:
Emergent molecular measurement methods, such as DNA microarray, qRT-PCR, and many others, offer tremendous promise for the personalized treatment of cancer. These technologies measure the amount of specific proteins, RNA, DNA or other molecular targets from tumor specimens with the goal of “fingerprinting” individual cancers. Tumor specimens are heterogeneous; an individual specimen typically contains unknown amounts of multiple tissue types. Thus, the measured molecular concentrations result from an unknown mixture of tissue types and must be normalized to account for the composition of the mixture. For example, a breast tumor biopsy may contain normal, dysplastic and cancerous epithelial cells, as well as stromal components (fatty and connective tissue) and blood and lymphatic vessels. Our diagnostic interest focuses solely on the dysplastic and cancerous epithelial cells. The remaining tissue components serve to “contaminate” the signal of interest. The proportion of each of the tissue components changes as a function of patient characteristics (e.g., age), and varies spatially across the tumor region. Because each of the tissue components produces a different molecular signature, and the amount of each tissue type is specimen dependent, we must estimate the tissue composition of the specimen and adjust the molecular signal for this composition. Using the idea of a chemical mass balance, we consider the total measured concentrations to be a weighted sum of the individual tissue signatures, where the weights are determined by the relative amounts of the different tissue types. We develop a compositional source apportionment model to estimate the relative amounts of tissue components in a tumor specimen. We then use these estimates to infer the tissue-specific concentrations of key molecular targets for sub-typing individual tumors. We anticipate these specific measurements will greatly improve our ability to discriminate between different classes of tumors, and allow more precise matching of each patient to the appropriate treatment.
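A minimal sketch of the mass-balance idea (tissue signatures, target labels, and proportions are all invented): solve for non-negative tissue weights that sum to one so that the weighted sum of signatures best reproduces the measured mixture, then recover a tissue-specific signal:

```python
import numpy as np
from scipy.optimize import nnls

# Rows: molecular targets; columns: tissue types (illustrative signatures)
signatures = np.array([
    [5.0, 1.0, 0.2],   # target A in cancerous / normal / stromal tissue
    [0.5, 3.0, 0.1],   # target B
    [1.0, 1.0, 4.0],   # target C
    [2.0, 0.5, 0.5],   # target D
])
true_props = np.array([0.6, 0.3, 0.1])
measured = signatures @ true_props  # mixed signal observed in the specimen

# Non-negative least squares; the sum-to-one constraint is imposed by
# appending a heavily weighted row of ones (a common soft-constraint trick).
rho = 1e3
A = np.vstack([signatures, rho * np.ones(3)])
b = np.append(measured, rho * 1.0)
props, _ = nnls(A, b)
print("estimated tissue proportions:", np.round(props, 3))

# Deconvolved, tissue-specific signal for the cancerous component:
print("cancer-only signal per target:", np.round(signatures[:, 0] * props[0], 2))
```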
Abstract:
The Integrated Catchment Model of Nitrogen (INCA-N) was applied to the River Lambourn, a Chalk river-system in southern England. The model's abilities to simulate the long-term trend and seasonal patterns in observed stream water nitrate concentrations from 1920 to 2003 were tested. This is the first time a semi-distributed, daily time-step model has been applied to simulate such a long time period and then used to calculate detailed catchment nutrient budgets which span the conversion of pasture to arable during the late 1930s and 1940s. Thus, this work goes beyond source apportionment and looks to demonstrate how such simulations can be used to assess the state of the catchment and develop an understanding of system behaviour. The mass-balance results from 1921, 1922, 1991, 2001 and 2002 are presented, and those for 1991 are compared to other modelled and literature values of loads associated with nitrogen soil processes and export. The variations highlighted the problem of comparing modelled fluxes with point measurements but proved useful for identifying the most poorly understood inputs and processes, thereby providing an assessment of input data and model structural uncertainty. The modelled terrestrial and instream mass-balances also highlight the importance of hydrological conditions in pollutant transport. Between 1922 and 2002, increased inputs of nitrogen from fertiliser, livestock and deposition altered the nitrogen balance: from a possible reduction in soil fertility but little environmental impact in 1922, to nitrogen accumulation in the soil, groundwater and instream biota in 2002. In 1922 and 2002 it was estimated that approximately 2 and 18 kg N ha⁻¹ yr⁻¹, respectively, were exported from the land to the stream. The utility of the approach and further considerations for the best use of models are discussed.
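The catchment budgets described reduce to mass-balance arithmetic in which storage change is the residual of inputs, losses, and export; in the sketch below only the stream exports of roughly 2 and 18 kg N ha⁻¹ yr⁻¹ come from the abstract, and every other term is an invented placeholder:

```python
# Illustrative annual nitrogen balance for a catchment (kg N per ha per yr).
def n_storage_change(inputs, gaseous_losses, crop_offtake, stream_export):
    """Residual accumulation in soil, groundwater and instream biota."""
    return inputs - gaseous_losses - crop_offtake - stream_export

for year, inputs, losses, offtake, export in [
    (1922, 25.0, 8.0, 20.0, 2.0),     # low-fertiliser era (invented terms)
    (2002, 150.0, 30.0, 80.0, 18.0),  # post-intensification (invented terms)
]:
    delta = n_storage_change(inputs, losses, offtake, export)
    print(f"{year}: storage change = {delta:+.0f} kg N/ha/yr")
```

Negative storage change corresponds to the possible reduction in soil fertility described for 1922, and a positive residual to the accumulation described for 2002.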
Abstract:
Formal and analytical risk models prescribe how risk should be incorporated in construction bids. However, the actual process by which contractors and their clients negotiate and agree on price is complex, and not clearly articulated in the literature. Using participant observation, the entire tender process was shadowed in two leading UK construction firms. This was compared to propositions in analytical models and significant differences were found. In total, 670 hours of observed work across the two firms revealed three stages of the bidding process. Bidding activities were categorized and their extent estimated as deskwork (32%), calculations (19%), meetings (14%), documents (13%), off-days (11%), conversations (7%), correspondence (3%) and travel (1%). Risk allowances of 1-2% were priced into some bids, and three tiers of risk apportionment in bids were identified. However, priced risks may sometimes be excluded from the final bidding price to enhance competitiveness. Thus, although risk apportionment affects a contractor’s pricing strategy, other complex, microeconomic factors also affect price. Instead of pricing in contingencies, risk was handled mostly through contractual rather than price mechanisms, to reflect commercial imperatives. The findings explain why some assumptions underpinning analytical models may not be sustainable in practice, and why what actually happens in practice matters for those who seek to model the pricing of construction bids.
Abstract:
Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice. Introducing more models would, therefore, not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and of how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 risk models for contractors proposed in the journal literature were examined and classified. Exploratory interviews with five UK contractors, together with documentary analyses of how contractors price work generally and risk specifically, were then carried out to compare the propositions from the literature with what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic. Hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways. They acknowledge the risk that they should price. However, the final settlement depends on a set of complex, micro-economic factors. Hence, risk accountability may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process. However, analytical approaches tend not to incorporate this, although they could.
Abstract:
One of the aims of a broad ethnographic study into how the apportionment of risk influences the pricing levels of contractors was to ascertain the significant risks affecting contractors in Ghana, and their impact on prices. To do this, in the context of contractors, the difference between expected and realized return on a project is the key dependent variable, examined using documentary analyses and semi-structured interviews. Most work in this area has focused on identifying and prioritising risks using relative importance indices generated from the analysis of questionnaire survey responses. However, this approach may be argued to capture perceptions rather than direct measures of project risk. Here, instead, project risk is investigated by examining two measures of the same quantity: one ‘before’ and one ‘after’ construction of a project has taken place. Risk events are identified by ascertaining the independent variables causing deviations between expected and actual rates of return. Risk impact is then measured by ascertaining additions or reductions to expected costs due to the occurrence of risk events. So far, data from eight substantially complete building projects indicate that consultants’ inefficiency, payment delays, subcontractor-related problems and changes in macroeconomic factors are the significant risks affecting contractors in Ghana.
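The before/after comparison amounts to simple arithmetic on expected versus realized returns; the sketch below (all figures are invented placeholders, with event labels echoing the risks named above) attributes the deviation to identified risk events:

```python
# Invented figures for one hypothetical building project (GH¢ thousands).
expected_cost, expected_revenue = 900.0, 1000.0
expected_return = (expected_revenue - expected_cost) / expected_cost

# Cost additions attributed to risk events identified after completion.
risk_events = {
    "payment delays (financing charges)": 25.0,
    "consultants' inefficiency (rework)": 15.0,
    "subcontractor-related problems": 10.0,
    "macroeconomic changes (materials inflation)": 30.0,
}
actual_cost = expected_cost + sum(risk_events.values())
actual_return = (expected_revenue - actual_cost) / actual_cost

print(f"expected return: {expected_return:.1%}, realized: {actual_return:.1%}")
for event, cost in sorted(risk_events.items(), key=lambda kv: -kv[1]):
    print(f"  {event}: +{cost:.0f}")
```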
Abstract:
The Water Framework Directive has caused a paradigm shift towards the integrated management of recreational water quality through the development of drainage basin-wide programmes of measures. This has increased the need for a cost-effective diagnostic tool capable of accurately predicting riverine faecal indicator organism (FIO) concentrations. This paper outlines the application of models developed to fulfil this need, which represent the first transferable generic FIO models to be developed for the UK to incorporate direct measures of key FIO sources (namely human and livestock population data) as predictor variables. We apply a recently developed transfer methodology, which enables the quantification of geometric mean presumptive faecal coliforms and presumptive intestinal enterococci concentrations for base- and high-flow during the summer bathing season in unmonitored UK watercourses, to predict FIO concentrations in the Humber river basin district. Because the FIO models incorporate explanatory variables which allow the effects of policy measures which influence livestock stocking rates to be assessed, we carry out empirical analysis of the differential effects of seven land use management and policy instruments (fiscal constraint, production constraint, cost intervention, area intervention, demand-side constraint, input constraint, and micro-level land use management), all of which can be used to reduce riverine FIO concentrations. This research provides insights into FIO source apportionment, explores a selection of pollution remediation strategies and the spatial differentiation of land use policies which could be implemented to deliver river quality improvements. All of the policy tools we model reduce FIO concentrations in rivers, but our research suggests that the installation of streamside fencing in intensive milk producing areas may be the single most effective land management strategy to reduce riverine microbial pollution.
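In form, the transferable models are regressions of log-transformed (geometric mean) FIO concentrations on catchment characteristics; the sketch below (coefficients, predictors, and values are all invented placeholders, not the fitted UK models) shows how a stocking-rate policy lever would propagate to predicted concentrations:

```python
import numpy as np

# Hypothetical fitted log10 model for geometric-mean presumptive faecal
# coliforms (cfu per 100 ml) at high flow; all coefficients are invented.
def log10_fio(human_pop_density, dairy_cattle_density, pct_improved_pasture):
    return (1.2
            + 0.4 * np.log10(1 + human_pop_density)
            + 0.5 * np.log10(1 + dairy_cattle_density)
            + 0.01 * pct_improved_pasture)

baseline = 10 ** log10_fio(150, 120, 60)
# A production-constraint measure cutting dairy stocking rates by 30%:
with_policy = 10 ** log10_fio(150, 120 * 0.7, 60)
print(f"baseline GM: {baseline:.0f} cfu/100ml; "
      f"after policy: {with_policy:.0f} cfu/100ml "
      f"({100 * (1 - with_policy / baseline):.0f}% reduction)")
```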
The Impact of office productivity cloud computing on energy consumption and greenhouse gas emissions
Abstract:
Cloud computing is usually regarded as being energy efficient and thus emitting less greenhouse gas (GHG) than traditional forms of computing. When the energy consumption of Microsoft’s cloud computing Office 365 (O365) and traditional Office 2010 (O2010) software suites was tested and modeled, some cloud services were found to consume more energy than the traditional form. The model developed in this research considered energy consumption at the three main stages of data transmission: data center, network, and end-user device. Comparable products from each suite were selected and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the networking and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be directly measured for the user device stage. Results indicated that cloud computing is more energy efficient for Outlook and Excel, which consumed less energy and emitted less GHG than their standalone counterparts. The power consumption of the cloud-based Outlook (8%) and Excel (17%) was lower than their traditional counterparts. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent. A third, mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG. Direct conversion from the standalone package into the cloud provision platform can now consider energy and GHG emissions at the software development and cloud service design stage using the methods described in this research.
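Structurally, the three-stage model sums energy across data center, network, and end-user device and converts the total to GHG with an emission factor; in this sketch every figure, including the emission factor, is an invented placeholder (the study's data-center data were confidential):

```python
# Energy per one-hour editing session, in Wh; every number is a placeholder.
EMISSION_FACTOR = 0.5  # kg CO2e per kWh, illustrative grid average

def session_ghg(datacenter_wh, network_wh, device_wh):
    """Sum energy over the three transmission stages, convert to kg CO2e."""
    total_kwh = (datacenter_wh + network_wh + device_wh) / 1000
    return total_kwh * EMISSION_FACTOR

cloud = session_ghg(datacenter_wh=3.0, network_wh=2.0, device_wh=30.0)
standalone = session_ghg(datacenter_wh=0.0, network_wh=0.0, device_wh=38.0)
change = 100 * (cloud - standalone) / standalone
print(f"cloud: {cloud * 1000:.1f} g CO2e, "
      f"standalone: {standalone * 1000:.1f} g CO2e ({change:+.0f}%)")
```

Whether the cloud variant comes out ahead depends entirely on how much device-stage energy the cloud client saves relative to the added data-center and network stages, which is why the study found different signs for different products.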
Abstract:
The REgents PARk and Tower Environmental Experiment (REPARTEE) comprised two campaigns in London in October 2006 and October/November 2007. The experiment design involved measurements at a heavily trafficked roadside site, two urban background sites and an elevated site at 160–190 m above ground on the BT Tower, supplemented in the second campaign by Doppler lidar measurements of atmospheric vertical structure. A wide range of measurements of airborne particle physical metrics and chemical composition were made as well as measurements of a considerable range of gas phase species and the fluxes of both particulate and gas phase substances. Significant findings include (a) demonstration of the evaporation of traffic-generated nanoparticles during both horizontal and vertical atmospheric transport; (b) generation of a large base of information on the fluxes of nanoparticles, accumulation mode particles and specific chemical components of the aerosol and a range of gas phase species, as well as the elucidation of key processes and comparison with emissions inventories; (c) quantification of vertical gradients in selected aerosol and trace gas species which has demonstrated the important role of regional transport in influencing concentrations of sulphate, nitrate and secondary organic compounds within the atmosphere of London; (d) generation of new data on the atmospheric structure and turbulence above London, including the estimation of mixed layer depths; (e) provision of new data on trace gas dispersion in the urban atmosphere through the release of purposeful tracers; (f) the determination of spatial differences in aerosol particle size distributions and their interpretation in terms of sources and physico-chemical transformations; (g) studies of the nocturnal oxidation of nitrogen oxides and of the diurnal behaviour of nitrate aerosol in the urban atmosphere, and (h) new information on the chemical composition and source apportionment of particulate matter size fractions in the atmosphere of London derived both from bulk chemical analysis and aerosol mass spectrometry with two instrument types.
Abstract:
Increasing optical depth poleward of 45° is a robust response to warming in global climate models. Much of this cloud optical depth increase has been hypothesized to be due to transitions from ice-dominated to liquid-dominated mixed-phase cloud. In this study, the importance of liquid-ice partitioning for the optical depth feedback is quantified for 19 Coupled Model Intercomparison Project Phase 5 models. All models show a monotonic partitioning of ice and liquid as a function of temperature, but the temperature at which ice and liquid are equally mixed (the glaciation temperature) varies by as much as 40 K across models. Models that have a higher glaciation temperature are found to have a smaller climatological liquid water path (LWP) and condensed water path and experience a larger increase in LWP as the climate warms. The ice-liquid partitioning curve of each model may be used to calculate the response of LWP to warming. It is found that the repartitioning between ice and liquid in a warming climate contributes at least 20% to 80% of the increase in LWP as the climate warms, depending on model. Intermodel differences in the climatological partitioning between ice and liquid are estimated to contribute at least 20% to the intermodel spread in the high-latitude LWP response in the mixed-phase region poleward of 45°S. It is hypothesized that a more thorough evaluation and constraint of global climate model mixed-phase cloud parameterizations and validation of the total condensate and ice-liquid apportionment against observations will yield a substantial reduction in model uncertainty in the high-latitude cloud response to warming.
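The partitioning curve can be summarized as a monotonic function of temperature that equals one half at the glaciation temperature; the sketch below (the sigmoid form and all parameter values are illustrative, not taken from any CMIP5 model) shows how a uniform warming repartitions a fixed condensed water path toward liquid:

```python
import numpy as np

def liquid_fraction(T, T_glac=250.0, width=10.0):
    """Illustrative monotonic partitioning: fraction of condensate that is
    liquid, equal to 0.5 at the glaciation temperature T_glac (K)."""
    return 1.0 / (1.0 + np.exp(-(T - T_glac) / width))

T = np.linspace(235, 270, 8)           # mixed-phase temperature range (K)
condensate = np.full_like(T, 100.0)    # total condensed water path, g/m^2

lwp_now = condensate * liquid_fraction(T)
lwp_warm = condensate * liquid_fraction(T + 2.0)  # uniform 2 K warming
print("LWP increase from repartitioning alone (g/m^2):",
      np.round(lwp_warm - lwp_now, 1))
```

A model with a higher T_glac sits further left on this curve at present-day temperatures, so it holds less liquid climatologically and gains more liquid per degree of warming, consistent with the intermodel spread described above.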