952 results for Capacity Management
Abstract:
Transient power dissipation profiles in handheld electronic devices alternate between high and low power states depending on usage. Capacitive thermal management based on phase change materials potentially offers fan-less thermal management for such transient profiles. However, capacitive management becomes feasible only if there is a significant enhancement in the enthalpy change per unit volume of the phase change material, since existing bulk materials such as paraffin fall short of requirements. In this thesis I propose novel nanostructured thin-film materials that can potentially exhibit significantly enhanced volumetric enthalpy change. Using the fundamental thermodynamics of phase transitions, calculations of the enhancement resulting from superheating in such thin-film systems are conducted. Furthermore, the design of a microfabricated calorimeter to measure such enhancements is explained in detail. This work advances the state of the art of phase change materials for capacitive cooling of handheld devices.
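The sizing argument behind this abstract can be sketched with a simple energy balance (not from the thesis; the power levels, burst duration, and bulk-paraffin properties below are assumed round numbers): the PCM volume needed to buffer a high-power burst is the excess heat divided by the volumetric enthalpy of fusion.

```python
# Rough sizing of a phase-change heat sink for a transient power burst.
# All numbers are illustrative assumptions, not values from the thesis.

excess_power_w = 5.0        # power above steady-state cooling capacity (W)
burst_duration_s = 120.0    # length of the high-power state (s)

# Bulk paraffin: latent heat ~200 kJ/kg, density ~800 kg/m^3
latent_heat_j_per_kg = 200e3
density_kg_per_m3 = 800.0
h_vol = latent_heat_j_per_kg * density_kg_per_m3  # volumetric enthalpy (J/m^3)

energy_j = excess_power_w * burst_duration_s      # heat to absorb (J)
volume_m3 = energy_j / h_vol
volume_cm3 = volume_m3 * 1e6
print(f"PCM volume required: {volume_cm3:.2f} cm^3")
```

Since the required volume scales inversely with `h_vol`, a several-fold enhancement in volumetric enthalpy shrinks the PCM footprint proportionally, which is why the enhancement the thesis targets matters for handheld form factors.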
Abstract:
A number of fish species once native only to Lakes Victoria and Kyoga have declined considerably over the years, and in some cases disappeared, due to overexploitation, the introduction of exotic species (especially the Nile Perch), and environmental degradation resulting from human activities. Some of the species have been observed to survive in satellite lakes in the Victoria and Kyoga lake basins. The Nabugabo satellite lakes contain the endemic cichlid fish species Oreochromis esculentus and two haplochromine species previously found only in Lake Nabugabo. There is therefore a need to conserve these species by ensuring sustainable use and management of the resources. The study revealed that the Nabugabo lakes provide a range of socio-economic benefits accruing from fishing, farming, logging, resort beach development and the watering of animals. However, although these activities affect the lakes' ecosystems, the participation of resource users in management is limited because of the weak local management institutions operating on the lakes, hence the need to strengthen them through capacity building. It is recommended that Government work jointly with the beach committees and the fishing community in a participatory way to eliminate the use of destructive fishing practices and to control other environmentally degrading activities.
Abstract:
This study measures, in a collaborative context, a set of interconnected relational mechanisms and their impact on organizational performance. Grounded in the Relational View (Dyer and Singh, 1998), the work integrates mechanisms of relational capabilities and competitive capabilities into a concept termed "interorganizational capabilities". Interorganizational capabilities comprise "trust" together with the mechanisms "combining complementary resources", "investing in relation-specific assets", "sharing information" and "joint problem solving". The measurement takes place in the Spanish food industry (Industria Alimentaria, IA) in a networked Supply Chain Management context. Besides improving knowledge of interorganizational relationships, the work also provides a theoretical review of the agri-food sector in general and the IA in particular. The results confirm, with exceptions, the theoretical construct of "interorganizational capabilities in a networked Supply Chain Management context". Other findings include high levels of trust in relationships between partners, and an industry that, despite the current competitive and complex environment, performs competitively in terms of flexibility, responsiveness, quality and efficiency. The research also reveals that firms in the industry have implemented the capability to manage knowledge. In this context, the results suggest a sound integration of the IA supply chain, confirming the effective application of Supply Chain Management. A further conclusion is the past and present dependence of the Spanish food industry on the distribution sector, intensified in part by the expansion of private-label brands (MDDs).
Abstract:
Beef businesses in northern Australia are facing increased pressure to be productive and profitable amid challenges such as climate variability and poor financial performance over the past decade. Declining terms of trade, limited recent gains in on-farm productivity, and low profit margins under current management systems and climatic conditions leave little capacity for businesses to absorb climate change-induced losses. In order to generate a whole-of-business focus towards management change, the Climate Clever Beef project in the Maranoa-Balonne region of Queensland trialled the use of business analysis with beef producers to improve financial literacy, provide a greater understanding of current business performance and initiate changes to current management practices. Demonstration properties were engaged and a systematic approach was used to assess current business performance, evaluate the impacts of management changes on the business, and trial practices and promote successful outcomes to the wider industry. Efforts concentrated on improving financial literacy skills, understanding the business's key performance indicators and modifying practices to improve both productivity and profitability. To best achieve the desired outcomes, several extension models were employed: the ‘group facilitation/empowerment model’, the ‘individual consultant/mentor model’ and the ‘technology development model’. Providing producers with a whole-of-business approach, and using business analysis in conjunction with on-farm trials and various extension methods, proved to be a successful way to encourage producers in the region to adopt new practices in the areas of greatest impact. The areas targeted for development within businesses generally led to improvements in animal performance and grazing land management, further improving the prospects for climate resilience.
Abstract:
The report of the proceedings of the New Delhi workshop on the SSF Guidelines (Voluntary Guidelines for Securing Sustainable Small-scale Fisheries in the Context of Food Security and Poverty Eradication). The workshop brought together 95 participants from 13 states representing civil society organizations, governments, FAO, and fishworker organizations from both the marine and inland fisheries sectors. This report will be useful for fishworker organizations, researchers, policy makers, members of civil society and anyone interested in small-scale fisheries, tenure rights, social development, livelihoods, post-harvest and trade, and disasters and climate change.
Inbound logistics, the last mile and intermodal high capacity transport – the case of Jula in Sweden
Abstract:
Some of the biggest challenges for intermodal transport competitiveness are the extra handling costs and pre- and post-haulage costs. This paper investigates the use of Intermodal High Capacity Transport (IHCT) for the intermodal transport chain in general and for pre- and post-haulage in particular. The aim is not only to measure the cost reductions from using larger vehicles but to understand how better management of inbound flows through increased integration of logistics processes can increase the efficiency of the last mile. The paper analyses the haulage of two 40-foot containers simultaneously as part of an intermodal transport chain. Data were collected from a demonstration project in Sweden, where permission was obtained to use longer vehicles on an approved route to and from the nearest intermodal terminal. Results indicate substantial cost savings from using longer vehicles for pre- and post-haulage. In addition, the business model whereby the shipper purchased their own chassis and obtained after-hours access to the terminal for collecting pre-loaded chassis brought additional cost and planning benefits. The total cost saving was significant and potentially eliminates the cost deficit associated with the last mile.
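The economics described here can be sketched with a toy cost comparison (every figure below is hypothetical, not from the demonstration project): carrying two containers on one longer vehicle replaces two standard round trips, trading a higher per-kilometre rate for one set of fixed trip costs.

```python
# Illustrative last-mile cost comparison: two 40-foot containers on one longer
# IHCT vehicle vs. two separate standard truck trips. All figures hypothetical.

distance_km = 15.0            # terminal-to-shipper distance (one way)
cost_per_km_standard = 2.0    # cost/km, standard semi-trailer
cost_per_km_long = 2.6        # cost/km, longer IHCT vehicle (higher, but < 2x)
fixed_cost_per_trip = 40.0    # driver/handling overhead per trip

two_trips = 2 * (fixed_cost_per_trip + cost_per_km_standard * 2 * distance_km)
one_trip = fixed_cost_per_trip + cost_per_km_long * 2 * distance_km

saving_pct = 100 * (two_trips - one_trip) / two_trips
print(f"two standard trips: {two_trips:.0f}, one IHCT trip: {one_trip:.0f}")
print(f"saving: {saving_pct:.0f}%")
```

The saving comes from halving the fixed per-trip cost and the per-kilometre premium of the longer vehicle being well under a factor of two, which mirrors the paper's argument that IHCT can close the last-mile cost gap.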
Abstract:
The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase the system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To deal with these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity requiring enhanced Decision Support Tools (DST) to aid humans in the decision making processes. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility to model input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. 
The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study representing a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (a Demand and Capacity Balancing tool and an Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
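The coefficient-to-Sobol relationship the abstract mentions can be shown in a minimal sketch. This is not the thesis's aPCE (which builds orthogonal polynomials from data); it uses the standard Legendre basis for uniform inputs, and the two "uncertain inputs" and the bilinear toy model are assumptions standing in for a real trajectory predictor.

```python
import numpy as np

# Minimal PCE/Sobol sketch for a toy "trajectory predictor" with two uncertain
# inputs (say, a mass error and a wind error), each uniform on [-1, 1].

def legendre_orthonormal(x, d):
    """Orthonormal Legendre polynomials on U(-1, 1), degree d in {0, 1, 2}."""
    return [np.ones_like(x), np.sqrt(3) * x, np.sqrt(5) / 2 * (3 * x**2 - 1)][d]

def model(x1, x2):
    return x1 + 0.5 * x2          # toy prediction-error model, not an aircraft model

rng = np.random.default_rng(0)
n = 20000
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = model(x1, x2)

# Tensor-product basis up to degree 2 in each variable, fitted by least squares.
multi_index = [(i, j) for i in range(3) for j in range(3)]
A = np.column_stack([legendre_orthonormal(x1, i) * legendre_orthonormal(x2, j)
                     for i, j in multi_index])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# With an orthonormal basis, each squared coefficient (mean term excluded) is a
# variance contribution; grouping them by the variables involved gives Sobol indices.
var_total = sum(c**2 for (i, j), c in zip(multi_index, coef) if (i, j) != (0, 0))
S1 = sum(c**2 for (i, j), c in zip(multi_index, coef) if i > 0 and j == 0) / var_total
S2 = sum(c**2 for (i, j), c in zip(multi_index, coef) if j > 0 and i == 0) / var_total
print(f"first-order Sobol indices: S1={S1:.2f}, S2={S2:.2f}")
```

Because the toy model is linear with weights 1 and 0.5, input 1 should account for about 80% of the output variance, illustrating how the indices rank inputs by their influence on prediction uncertainty.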
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and by extension users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of both the likelihood of a negative outcome and the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and mitigation. They can be deconstructed into their atomic units, and responsibility for their resolution delegated appropriately. We continue to describe how the manifestation of risks typically spans an entire organisational environment, and how taking risk as the focus of our analysis safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements. Doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
Abstract:
This dissertation investigates customer behavior modeling in service outsourcing and revenue management in the service sector (i.e., airline and hotel industries). In particular, it focuses on a common theme of improving firms’ strategic decisions through the understanding of customer preferences. Decisions concerning degrees of outsourcing, such as firms’ capacity choices, are important to performance outcomes. These choices are especially important in high-customer-contact services (e.g., airline industry) because of the characteristics of services: simultaneity of consumption and production, and intangibility and perishability of the offering. Essay 1 estimates how outsourcing affects customer choices and market share in the airline industry, and consequently the revenue implications from outsourcing. However, outsourcing decisions are typically endogenous. A firm may choose whether to outsource or not based on what a firm expects to be the best outcome. Essay 2 contributes to the literature by proposing a structural model which could capture a firm’s profit-maximizing decision-making behavior in a market. This makes possible the prediction of consequences (i.e., performance outcomes) of future strategic moves. Another emerging area in service operations management is revenue management. Choice-based revenue systems incorporate discrete choice models into traditional revenue management algorithms. To successfully implement a choice-based revenue system, it is necessary to estimate customer preferences as a valid input to optimization algorithms. The third essay investigates how to estimate customer preferences when part of the market is consistently unobserved. This issue is especially prominent in choice-based revenue management systems. Normally a firm only has its own observed purchases, while those customers who purchase from competitors or do not make purchases are unobserved. 
Most current estimation procedures depend on unrealistic assumptions about customer arrivals. This study proposes a new estimation methodology which does not require any prior knowledge of the customer arrival process and allows for arbitrary demand distributions. Compared with previous methods, this model performs better when the true demand is highly variable.
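The censoring problem the essay addresses can be illustrated with a toy multinomial-logit market (the utilities and arrival count below are assumptions, not estimates from the dissertation): a firm observes only its own sales, while customers who buy from a competitor or walk away are invisible.

```python
import math

# Sketch of unobserved demand in choice-based revenue management.
# Hypothetical MNL market: each arriving customer chooses our fare,
# a competitor's fare, or no purchase; only our own sales are observed.

utilities = {"our_fare": 1.0, "competitor": 0.8, "no_purchase": 1.5}  # assumed

denom = sum(math.exp(u) for u in utilities.values())
probs = {k: math.exp(u) / denom for k, u in utilities.items()}

arrivals = 1000                           # true arrivals, unknown to the firm
observed_sales = arrivals * probs["our_fare"]

# Treating observed sales as total demand understates the market:
print(f"choice probabilities: { {k: round(p, 3) for k, p in probs.items()} }")
print(f"observed sales: {observed_sales:.0f} of {arrivals} arrivals")
```

Recovering the choice parameters and the arrival process jointly from only `observed_sales` is exactly the estimation problem the third essay tackles; naive methods must assume an arrival distribution, which the proposed methodology avoids.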
Abstract:
Given that landfills are depletable and replaceable resources, the right approach when dealing with landfill management is to design an optimal sequence of landfills rather than designing every single landfill separately. In this paper we use Optimal Control models, with mixed elements of both continuous- and discrete-time problems, to determine an optimal sequence of landfills with regard to their capacities and lifetimes. The resulting optimization problems involve splitting a planning horizon into several subintervals whose lengths have to be decided. In each subinterval some costs, the amount of which depends on the value of the decision variables, have to be borne. The results obtained may be applied to other economic problems such as private and public investment, consumption decisions on durable goods, etc.
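A stripped-down version of this sequencing problem can be sketched numerically (a sketch under assumed parameters, not the paper's model): identical landfills are built at times 0, T, 2T, ..., and the lifetime T, and hence the capacity w·T, is chosen to minimise the discounted cost of the infinite sequence.

```python
import math

# Toy landfill-sequencing problem: identical landfills rebuilt every T years;
# choose the lifetime T (capacity = w*T) minimising total discounted cost.
# All parameter values are illustrative assumptions.

K = 100.0    # fixed construction cost per landfill
c = 2.0      # cost per unit of capacity
w = 1.0      # waste arrival rate (units of capacity per year)
r = 0.05     # discount rate

def present_value(T):
    per_landfill = K + c * w * T                  # cost borne at each rebuild
    return per_landfill / (1 - math.exp(-r * T))  # geometric discounted sum

# Grid search over candidate lifetimes (the paper derives optimality
# conditions analytically; a grid suffices for illustration).
Ts = [t / 10 for t in range(10, 1001)]
T_star = min(Ts, key=present_value)
print(f"optimal lifetime T* ~ {T_star:.1f} years, capacity ~ {w * T_star:.1f}, "
      f"PV ~ {present_value(T_star):.1f}")
```

The trade-off is the one the paper describes: shorter lifetimes mean paying the fixed cost K more often, while longer lifetimes mean paying for large capacity up front rather than deferring (discounted) costs.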
Abstract:
Persistent daily congestion has been increasing in recent years, particularly along major corridors during peak periods in the mornings and evenings. On certain segments, these roadways are often at or near capacity. A conventional predefined control strategy, however, cannot accommodate demand that changes over time, making it necessary to implement the dynamic lane management strategies discussed in this thesis: hard shoulder running, reversible HOV lanes, dynamic tolls and variable speed limits. A mesoscopic agent-based DTA model is used to simulate the different strategies and scenarios. In the analyses, all strategies mitigate congestion in terms of average speed and average density. The largest improvements come from hard shoulder running and reversible HOV lanes, while the other two provide more stable traffic. In terms of average speed and travel time, hard shoulder running is the most effective strategy for relieving traffic pressure on I-270.
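The speed-density relationship underlying these congestion measures can be sketched with the classic Greenshields model (a textbook first-order picture, not the thesis's DTA model; all parameter values are assumed): speed falls linearly with density, and adding a hard-shoulder lane lowers per-lane density.

```python
# Greenshields' linear speed-density model as a first-order picture of how
# hard shoulder running shifts the operating point. Parameters are illustrative.

v_free = 100.0   # free-flow speed (km/h)
k_jam = 120.0    # jam density (veh/km/lane)

def speed(k):
    return v_free * (1 - k / k_jam)

def flow(k):
    return k * speed(k)          # fundamental relation q = k * v

# Under this model, capacity occurs at half the jam density.
k_crit = k_jam / 2
print(f"critical density: {k_crit:.0f} veh/km, capacity: {flow(k_crit):.0f} veh/h")

# Opening the hard shoulder spreads the same traffic over one more lane:
k_before, lanes_before, lanes_after = 90.0, 3, 4
k_after = k_before * lanes_before / lanes_after
print(f"speed before: {speed(k_before):.0f} km/h, after: {speed(k_after):.0f} km/h")
```

This is of course far cruder than a mesoscopic agent-based DTA simulation, but it captures why the density reduction from an extra lane translates directly into higher average speeds on a near-capacity segment.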
Abstract:
The concept of social carrying capacity, though open to debate and critique, is a valuable tool that enhances the management of recreational use in protected natural areas. In this study, conducted in the Sierra de las Nieves natural park (Spain), we first categorised the hikers making use of the park and then, from the profiles obtained, analysed their perception of crowding on the trails. This assessment was subsequently used to gauge levels of user satisfaction and thus to determine the psychosocial carrying capacity of the park. The results obtained can be extrapolated to most Spanish natural parks in Mediterranean mountain areas, given their comparable visitor numbers and the prevalence of recreational hiking. The results suggest that management efforts should be directed toward relocating trails outside the core areas, so that user preferences may be satisfied while less impact is made on the areas of highest environmental value.
Abstract:
By employing interpretive policy analysis, this thesis aims to assess, measure, and explain policy capacity for government and non-government organizations involved in reclaiming Alberta's oil sands. Using this type of analysis to assess policy capacity is a novel approach to understanding reclamation policy, and this research therefore provides a unique contribution to the literature on reclamation policy. The oil sands region in northeast Alberta, Canada is an area of interest for several reasons, primarily the vast reserves of bitumen and the environmental cost associated with developing this resource. An increase in global oil demand has created an incentive for industry to seek out and develop new reserves. Alberta's oil sands are one of the largest remaining reserves in the world, and there is significant interest in increasing production in this region. Furthermore, tensions in several oil-exporting nations in the Middle East remain unresolved, and this has garnered additional support for a supply-side solution to North American oil demand, a solution that relies upon the development of reserves in both the United States and Canada. These compounding factors have contributed to increased development in the oil sands of northeastern Alberta: a rapid expansion of oil sands operations is ongoing and is the source of significant disturbance across the region. This disturbance, and the promises of reclamation, is a source of contentious debate amongst stakeholders and continues to be highly visible in the media. If oil sands operations are to retain their social license to operate, it is critical that reclamation efforts be effective. One concern expressed by non-governmental organizations (NGOs) is the current monitoring and enforcement of regulatory programs in the oil sands: Alberta's NGOs have suggested that the data made available to them originate from industrial sources and are generally unchecked by government.
In an effort to discern the overall status of reclamation in the oil sands, this study explores several factors essential to policy capacity: work environment, training, employee attitudes, perceived capacity, policy tools, evidence-based work, and networking. Data were collected through key informant interviews with senior policy professionals in government and non-government agencies in Alberta. The agencies of interest in this research are: the Canadian Association of Petroleum Producers (CAPP); Alberta Environment and Sustainable Resource Development (AESRD); the Alberta Energy Regulator (AER); the Cumulative Environmental Management Association (CEMA); the Alberta Environment Monitoring, Evaluation, and Reporting Agency (AEMERA); and the Wood Buffalo Environmental Association (WBEA). The aim of this research is to explain how and why reclamation policy is conducted in Alberta's oil sands, illuminating government capacity, NGO capacity, and the interaction of these two agency typologies. In addition to answering the research questions, another goal of this project is to show that interpretive analysis of policy capacity can be used to measure and predict policy effectiveness. The oil sands of Alberta are the focus of this project; however, future projects could apply the approach to any government policy scenario utilizing evidence-based approaches.
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they actually experience; administrators, in turn, will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment.
Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Network and Support Vector Machine, to accurately model the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these modeling tools. Third, we presented an approach to optimal VM sizing by employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
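The revenue-driven allocation idea can be sketched as a greedy marginal-revenue rule (a sketch only: the concave revenue curves below are assumed stand-ins for the learned ANN/SVM performance models, and the VM names and weights are hypothetical): each unit of a shared resource goes to the VM whose SLA revenue gains most from it.

```python
# Sketch of revenue-driven resource allocation: assign each unit of a shared
# resource to the VM whose SLA revenue gains most from it. The concave revenue
# curves stand in for learned performance models; all values are illustrative.

def revenue(vm, units):
    # Diminishing returns per VM; weights are hypothetical SLA values.
    weights = {"vm_a": 10.0, "vm_b": 6.0, "vm_c": 3.0}
    return weights[vm] * (1 - 0.8 ** units)

def allocate(total_units, vms):
    alloc = {vm: 0 for vm in vms}
    for _ in range(total_units):
        # Give the next unit to the VM with the highest marginal revenue.
        best = max(vms, key=lambda v: revenue(v, alloc[v] + 1) - revenue(v, alloc[v]))
        alloc[best] += 1
    return alloc

vms = ["vm_a", "vm_b", "vm_c"]
alloc = allocate(10, vms)
total = sum(revenue(v, u) for v, u in alloc.items())
print(f"allocation: {alloc}, total revenue: {total:.2f}")
```

With concave per-VM revenue curves this greedy rule maximizes total revenue, and it naturally starves low-value VMs once higher-value VMs still have worthwhile marginal gains, which is the behavior a revenue-maximizing data center wants.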