223 results for provisioning
Abstract:
A key aspect underpinning life-history theory is the existence of trade-offs. Trade-offs occur because resources are limited, meaning that individuals cannot invest in all traits simultaneously, leading to costs for traits such as growth and reproduction. Such costs may be the reason for the sub-maximal growth rates that are often observed in nature, though the fitness consequences of these costs would depend on the effects on lifetime reproductive success. Recently, much attention has been given to the physiological mechanisms that might underlie these life-history trade-offs, with oxidative stress (OS) playing a key role. OS is characterised by a build-up of oxidative damage to tissues (e.g. proteins, lipids and DNA) from attack by reactive species (RS). RS, the majority of which are by-products of metabolism, are usually neutralised by antioxidants; however, OS occurs when there is an imbalance between the two. There are two main theories linking OS with growth and reproduction. The first is that traits like growth and reproduction, being metabolically demanding, lead to an increase in RS production. The second involves the diversion of resources away from self-maintenance processes (e.g. the redox system) when individuals are faced with enhanced growth or reproductive expenditure. Previous research investigating trade-offs involving growth or reproduction and self-maintenance has been equivocal. One reason for this could be that associations among redox biomarkers can vary greatly, so that the biomarker selected for analysis can influence the conclusion reached about an individual's oxidative status. Therefore, the first aim of my thesis was to explore the strength and pattern of integration of five biomarkers of OS (three antioxidants, one damage measure and one general oxidation measure) in wild blue tit (Cyanistes caeruleus) adults and nestlings (Chapter 2).
In doing so, I established that all five biomarkers should be included in future analyses; using this collection of biomarkers, I then explored my next aims: whether enhanced growth (Chapters 3 and 4) or reproductive effort (Chapter 5) can lead to increased OS levels, if these traits are traded off against self-maintenance. I accomplished these aims using both a meta-analytic and an experimental approach, the latter involving manipulation of brood size in wild blue tits in order to experimentally alter the growth rate of nestlings and the provisioning rate (a proxy for reproductive expenditure) of adults. I also investigated the potential for redox integration to be used as an index of body condition (Chapter 2), allowing predictions to be made about the future fitness consequences of changes to oxidative state. A growth–self-maintenance trade-off was supported by my meta-analytic results (Chapter 4), which found OS to be a constraint on growth. However, when faced with experimentally enhanced growth, animals were typically not able to adjust this trade-off, so that oxidative damage resulted. This might support the idea that energetically expensive growth causes resources to be diverted away from the redox system; however, antioxidants did not show an overall reduction in response to growth in the meta-analysis, suggesting that oxidative costs of growth may instead result from increased RS production due to the greater metabolism needed for enhanced growth. My experimental data (Chapter 3) showed a similar pattern, with raised protein damage levels (protein carbonyls; PCs) in the fastest-growing blue tit chicks in a brood compared with their slower-growing siblings. These within-brood differences in OS levels likely resulted from within-brood hierarchies and might have masked any between-brood differences, which were not observed here.
Despite evidence for a growth–self-maintenance trade-off, my experimental results on blue tits found no support for the hypothesis that self-maintenance is also traded off against reproduction, another energetically demanding trait. There was no link between experimentally altered reproductive expenditure and OS, nor was there a direct correlation between reproductive effort and OS (Chapter 5). However, there are various factors that likely influence whether oxidative costs are observed, including environmental conditions and whether such costs are transient. This emphasises the need for longitudinal studies following the same individuals over multiple years and across a wide range of habitats that differ in quality. This would allow investigation into how key life events interact; it might be that raised OS levels from rapid early growth have the potential to constrain reproduction, or that high parental OS levels constrain offspring growth. Any oxidative costs resulting from these life-history trade-offs have the potential to impact future fitness. Redox integration of certain biomarkers might prove to be a useful tool in making predictions about fitness, as I found in Chapter 2, as well as in establishing how the redox system responds, as a whole, to changes in growth and reproduction. Finally, if the tissues measured can tolerate a given level of OS, then the level of oxidative damage might be irrelevant and not impact future fitness at all.
Abstract:
Wydział Nauk Geograficznych i Geologicznych
Abstract:
Contemporary African agricultural policy embodies the African Green Revolution's drive towards modernisation and commercialisation. Agroecologists have criticised this movement on ecological, social and political grounds. Northern Ghanaian fertiliser credit schemes provide a good example through which these critiques can be examined in a context where agricultural policy reflects the African Green Revolution's ideals. This study aimed to determine the relationship of such credit schemes to farmers' use of organic amendments, to elucidate other factors related to organic amendment use, and to comment on the relevance of this modernisation policy and its relationship to agroecology. A first research phase employed semi-structured key informant interviews. Qualitative data from these informed the construction of a semi-structured questionnaire that was used in a survey of 205 farmers. Multistage sampling purposively identified five villages and, within them, selected farmers who had joined government- and donor-funded fertiliser credit schemes. Their use of organic and inorganic amendments was compared to that of peers who had not taken part in such schemes. Quantitative data were used in binomial logistic regression and in inferential and descriptive statistics. Qualitative data were content-analysed. Credit group membership was associated with higher fertiliser application and yield, but had little influence on the extent of commercialisation. Farmers who applied organic amendments were 40% less likely to belong to a fertiliser credit scheme than not, indicating substitution between organic and inorganic fertilisers. Organic amendments were 40% more likely to be applied to compound farms than to outfields and six times more likely to be applied by household heads than by other household members. However, household heads also preferentially joined credit groups. This was part of an agroecological soil fertility management strategy.
Household heads appreciated the soil moisture retention properties of organic amendments, and applied them to compound farms to reduce risk to their household food supply in a semi-arid environment. They simultaneously accessed fertiliser to enhance this household provisioning strategy. They appreciated the increased yields this achieved, yet complained that the repayment terms of credit schemes were unfair, fertiliser did not enhance yields in dry conditions and fertilisers were supplied late. Farmers’ use of credited fertiliser alongside their existing agroecological strategy is helpful to the extent that it raises yields, yet is problematic in that it conflicts with risk-reduction strategies based on organics. There is some potential for modernised and agroecological management paradigms to coexist. For fertiliser credit to play a role in this, schemes must use fairer repayment terms and involve a focus on simultaneous use of organic amendments.
Abstract:
Eutrophication is a natural process of nutrient accumulation in water bodies that has been accelerated by human activities, mainly those related to agriculture, industry and the inadequate disposal of domestic sewage. The enrichment of water bodies with nutrients, mainly nitrogen and phosphorus, and the consequent proliferation of algae and cyanobacteria can compromise water quality for public supply, for fish farming and for other uses. The problem becomes more critical where water is naturally scarce, as in the semi-arid region of the Brazilian northeast. Given this problem, this work aimed to evaluate the trophic state of six reservoirs of the Seridó River basin in Rio Grande do Norte, and also to estimate the phosphorus loading capacity of the reservoirs and the associated risk probabilities based on the limits established by resolution Conama 357/05. The results demonstrate that the six reservoirs are eutrophic, with concentrations of total phosphorus and chlorophyll a in the water above 50 and 12 μg l-1, respectively. The results show that there is spatial homogeneity in the trophic state of the reservoirs, but a significant interannual variation as a function of the increase in nutrient concentrations and the decrease in water transparency with the reduction of the volume of water stored in the reservoirs. The results of the stochastic risk simulation show that the reservoirs could receive from 72 to 216 kg of P annually, assuming a 10% risk of increasing the annual mean concentrations of total phosphorus in the water of these reservoirs by more than 30 μg l-1. This load could be raised to up to 360 kg of P per year if the managers accept a 10% risk of increasing the annual mean concentrations of total phosphorus in the waters of these reservoirs by more than 50 μg l-1.
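The stochastic risk analysis described above can be illustrated with a minimal Monte Carlo sketch; the load-to-concentration response and its variability below are illustrative assumptions, not the calibrated values used in the thesis:

```python
import random

def exceedance_risk(annual_load_kg, threshold_ug_l, n=20000, seed=42):
    """Estimate the probability that the annual mean total-P concentration
    exceeds a threshold for a given annual phosphorus load.
    Hypothetical linear load-response model with lognormal hydrological
    variability (illustrative parameters only)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        response = 0.14 * rng.lognormvariate(0.0, 0.5)  # ug/l per kg/yr
        if annual_load_kg * response > threshold_ug_l:
            hits += 1
    return hits / n

# A manager accepting a 10% risk would choose the largest annual load
# whose estimated exceedance probability stays at or below 0.10.
```

Sweeping `annual_load_kg` upward until the estimated risk crosses 0.10 mirrors the approach of pairing an acceptable load with an accepted probability of exceeding the Conama 357/05 limit.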
Abstract:
Modern data centers host hundreds of thousands of servers to achieve economies of scale. Such a huge number of servers creates challenges for the data center network (DCN) to provide proportionally large bandwidth. In addition, the deployment of virtual machines (VMs) in data centers raises the requirements for efficient resource allocation and fine-grained resource sharing. Further, the large number of servers and switches in the data center consume significant amounts of energy. Even though servers are becoming more energy-efficient through various energy-saving techniques, the DCN still accounts for 20% to 50% of the energy consumed by the entire data center. The objective of this dissertation is to enhance DCN performance as well as its energy efficiency by conducting optimizations on both the host and network sides. First, as the DCN demands huge bisection bandwidth to interconnect all the servers, we propose a parallel packet switch (PPS) architecture that directly processes variable-length packets without segmentation-and-reassembly (SAR). The proposed PPS achieves large bandwidth by combining the switching capacities of multiple fabrics, and it further improves switch throughput by avoiding the padding bits of SAR. Second, since certain resource demands of VMs are bursty and stochastic in nature, to satisfy both deterministic and stochastic demands in VM placement we propose the Max-Min Multidimensional Stochastic Bin Packing (M3SBP) algorithm. M3SBP calculates an equivalent deterministic value for the stochastic demands, and maximizes the minimum resource utilization ratio of each server. Third, to provide necessary traffic isolation for VMs that share the same physical network adapter, we propose the Flow-level Bandwidth Provisioning (FBP) algorithm. By reducing the flow scheduling problem to multiple stages of packet queuing problems, FBP guarantees the provisioned bandwidth and delay performance for each flow.
Finally, while DCNs are typically provisioned with full bisection bandwidth, DCN traffic demonstrates fluctuating patterns; we therefore propose a joint host-network optimization scheme to enhance the energy efficiency of DCNs during off-peak traffic hours. The proposed scheme utilizes a unified representation method that converts the VM placement problem to a routing problem, and employs depth-first and best-fit search to find efficient paths for flows.
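The placement idea behind M3SBP can be sketched as follows; the effective-demand formula (mean plus a z-score safety margin) and the greedy max-min placement below are simplified illustrations of the approach, not the dissertation's exact algorithm:

```python
def equivalent_demand(mean, std, alpha=1.28):
    """Equivalent deterministic value for a stochastic demand: the mean
    plus a safety margin (alpha is an assumed z-score covering ~90% of
    cases, chosen here for illustration)."""
    return mean + alpha * std

def place_vm(servers, demand):
    """Greedy max-min sketch: place the VM on the feasible server that
    keeps the minimum per-dimension utilization ratio as high as possible.
    servers: list of {'cap': [...], 'used': [...]}
    demand: per-dimension equivalent deterministic demands."""
    best, best_score = None, -1.0
    for s in servers:
        new_used = [u + d for u, d in zip(s['used'], demand)]
        if any(u > c for u, c in zip(new_used, s['cap'])):
            continue  # would overflow a resource dimension
        score = min(u / c for u, c in zip(new_used, s['cap']))
        if score > best_score:
            best, best_score = s, score
    if best is not None:
        best['used'] = [u + d for u, d in zip(best['used'], demand)]
    return best
```

Converting the stochastic demand to a single deterministic number up front lets a conventional multidimensional bin-packing heuristic handle bursty demands without tracking distributions during placement.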
Abstract:
Value-stream mapping (VSM) is a helpful tool to identify waste and improvement areas. It has emerged as a preferred way to support and implement the lean approach. While lean principles are well-established and have broad applicability in manufacturing, their extension to information technology is still limited. Based on a case study approach, this paper presents the implementation of VSM in an IT firm as a lean IT improvement initiative. It involves mapping the current activities of the firm and identifying opportunities for improvement. After several interviews with employees who are currently involved in the process, a current state map is prepared to describe the existing problem areas, and a future state map is prepared to show the proposed improvement action plans. The achievements of VSM implementation are reductions in lead time, cycle time and resources. Our finding indicates that, with the new process change, total lead time can be reduced from 20 days to 3 days, an 85% reduction in overall lead time for the database provisioning process.
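The relative lead-time reduction implied by the before/after figures can be checked directly:

```python
# Relative reduction from the reported before/after lead times.
baseline_days, improved_days = 20, 3
reduction = (baseline_days - improved_days) / baseline_days
print(f"{reduction:.0%}")  # prints 85%
```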
Abstract:
This paper analyses the determinants of the intermediation margin in the Colombian financial system between 1989 and 2003. Using a dynamic estimation of the effects generated by specific variables of activity, taxes and market structure, it tracks the financial intermediation margin over a period marked by both liberalisation and crisis.
Abstract:
Much research on coffee has emphasised the economic and historical dimensions in its analysis. The economic structure, prices, the organisation of firms around cost reduction, and political and cultural processes have, in short, been the articulating threads of these analyses. The present work, however, emphasises the analysis of organisations, drawing on Bourdieu's business anthropology, in order to study coffee in Neiva: how the different intermediary organisations interact to establish, guarantee and perpetuate their position in the coffee market.
Abstract:
Procambarus clarkii is currently recorded from 16 European territories. On top of being a vector of crayfish plague, which is responsible for the large-scale disappearance of native crayfish species, it causes severe impacts on diverse aquatic ecosystems, due to its rapid life cycle, dispersal capacities, burrowing activities and high population densities. The species has even been recently discovered in caves. This invasive crayfish is a polytrophic keystone species that can exert multiple pressures on ecosystems. Most studies deal with the decline of macrophytes and predation on several species (amphibians, molluscs, and macroinvertebrates), highlighting how this biodiversity loss leads to unbalanced food chains. At a management level, the species is considered (a) a devastating digger of water drainage systems in southern and central Europe, (b) an agricultural pest in Mediterranean territories, consuming, for example, young rice plants, and (c) a threat to the restoration of water bodies in north-western Europe. Indeed, among the high-risk species, P. clarkii consistently attained the highest risk rating. Its negative impacts on ecosystem services were evaluated. These may include the loss of provisioning services, such as reductions in valued edible native species, and of regulatory and supporting services, inducing wide changes in ecological communities and increased costs to agriculture and water management. Finally, cultural services may be lost. The species fulfils the criteria of Article 4(3) of Regulation (EU) No 1143/2014 of the European Parliament (species widely spread in Europe and impossible to eradicate in a cost-effective manner) and has been included in the “Union List”. In particular, awareness of the ornamental trade through the internet must be reinforced within the European Community, and import and trade regulations should be imposed to reduce the availability of this high-risk species.
Abstract:
This thesis deals with optimization techniques and modeling of vehicular networks. Using models based on integer linear programming (ILP) and on heuristics, it was possible to study the performance of 5G networks for vehicular communications. Thanks to the Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) paradigms, it was possible to study the performance of different classes of service, such as the Ultra-Reliable Low-Latency Communications (URLLC) class and the enhanced Mobile BroadBand (eMBB) class, and how the functional split can have positive effects on network resource management. Two different protection techniques have been studied: Shared Path Protection (SPP) and Dedicated Path Protection (DPP). With these different protections, it is possible to achieve different network reliability requirements, according to the needs of the end user. In addition, thanks to a simulator developed in Python, it was possible to study the dynamic allocation of resources in a 5G metro network. Through different provisioning algorithms and different dynamic resource management techniques, useful results have been obtained for understanding the needs of the vehicular networks that will exploit 5G. Finally, two models are shown for reconfiguring backup resources when shared-resource protection is used.
Abstract:
The application of modern ICT technologies is radically changing many fields, pushing toward more open and dynamic value chains and fostering the cooperation and integration of many connected partners, sensors, and devices. As a valuable example, the emerging Smart Tourism field derives from the application of ICT to tourism so as to create richer and more integrated experiences, making them more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of those sources exacerbate the complexity of developing integrating solutions, with consequent high effort and costs for the partners seeking them. Taking a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims at facilitating the blending of data and services. At its core, APERTO relies on APERTO FaaS, a serverless platform allowing fast prototyping of the business logic, lowering the barrier to entry and the development costs for newcomers, (zero) fine-grained scaling of the resources servicing end users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communications between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments.
In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions, enabling smarter management of complex tasks and improved multiplexing capabilities; ii) the creation of end-to-end differentiated QoS slices minimizing interference among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; iv) a decentralized approach for the verification of access rights to resources.
Abstract:
The main objective of my thesis work is to exploit Kubeflow, the Google-native, open-source platform, and specifically Kubeflow Pipelines, to execute a scalable Federated Learning (FL) ML process in a simplified, 5G-like test architecture hosting a Kubernetes cluster, applying the widely adopted FedAvg algorithm and its optimization FedProx, empowered by the ML platform's abilities to ease the development and production cycle of this specific FL process. FL algorithms are increasingly promising and adopted both in cloud application development and in 5G communication enhancement, using data coming from the monitoring of the underlying telco infrastructure and executing training and data aggregation at edge nodes to optimize the algorithm's global model (which could be used, for example, for resource provisioning to reach an agreed QoS for the underlying network slice). After a study of the available papers and scientific articles related to FL, and with the help of the CTTC, which suggested that I study and use Kubeflow to support the algorithm, we found that this approach to the whole FL deployment cycle was not documented and might be interesting to investigate in more depth. This study may help prove the efficiency of the Kubeflow platform itself for the development of new FL algorithms that will support new applications, and especially test the performance of the FedAvg algorithm in a simulated client-to-cloud communication using the MNIST dataset as an FL benchmark.
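The aggregation step at the heart of FedAvg (averaging client updates weighted by local dataset size) can be sketched in a few lines; flat parameter lists stand in here for full model weights:

```python
def fedavg(client_params, client_sizes):
    """FedAvg aggregation: size-weighted average of the clients'
    parameter vectors, producing the new global model."""
    total = sum(client_sizes)
    dim = len(client_params[0])
    return [
        sum(p[i] * n for p, n in zip(client_params, client_sizes)) / total
        for i in range(dim)
    ]

# Two clients; the second holds three times as much data, so its
# parameters dominate the weighted average.
global_model = fedavg([[1.0, 2.0], [3.0, 4.0]], [1, 3])  # [2.5, 3.5]
```

FedProx keeps this same server-side aggregation but adds a proximal term to each client's local objective, which stabilises training when client data or compute is heterogeneous.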
Abstract:
This thesis analyses the performance of dynamic slice provisioning in a 5G metro network with low latency and reliability guarantees. It compares, in terms of performance, two versions of a simulator developed in Python based on different models: the exhaustive search model and the Shortest Path First Fit (SPFF) model. It further presents the differences between dedicated path protection and shared path protection. This analysis is carried out through several simulations under different network conditions, varying network resources and observing network performance while comparing the two models mentioned above. A reconfiguration procedure was implemented on backup resources in the Shortest Path First Fit model in order to improve its performance with respect to the exhaustive search, which is more optimised. Subsequently, several triggering events for the reconfiguration were implemented, and a comparison is made between these different triggering events in terms of blocking probability, bandwidth per link, capacity at each node, primary and backup bandwidth per slice, and backup capacity per slice.
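As an illustration of the SPFF model's two steps, the following is a minimal sketch (hop-count shortest path plus first-fit slot assignment; the simulator's actual cost model and resource representation are more elaborate):

```python
from collections import deque

def shortest_path(adj, src, dst):
    """BFS shortest path in hops on an undirected graph."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    if dst not in prev:
        return None  # no route: the request is blocked
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1]

def first_fit_slot(path, link_slots, n_slots):
    """First fit: occupy the lowest-index slot that is free on every
    link of the path; return None if the request is blocked."""
    links = [frozenset(e) for e in zip(path, path[1:])]
    for s in range(n_slots):
        if all(s not in link_slots.get(l, set()) for l in links):
            for l in links:
                link_slots.setdefault(l, set()).add(s)
            return s
    return None
```

Counting how often `first_fit_slot` returns `None` over a stream of requests yields the blocking probability, one of the comparison metrics used above.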