910 results for digital rights management
Abstract:
What municipal recycling rate is socially optimal? One credible answer would consider the recycling rate that minimizes the overall social costs of managing municipal waste. Such social costs comprise all budgetary costs and revenues associated with operating municipal waste and recycling programs, all costs to recycling households of preparing and storing recyclable materials for collection, all external disposal costs associated with waste disposed of at landfills or incinerators, and all external benefits associated with the provision of recycled materials that foster environmentally efficient production processes. This paper discusses how to estimate these four components of social cost and, from them, the optimal recycling rate. (C) 2013 Elsevier B.V. All rights reserved.
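Stated as a worked equation (the functional notation here is added for clarity and is not from the abstract; the four terms are exactly the components listed above), the socially optimal recycling rate r* is the rate that minimizes total social cost:

\[
\mathrm{SC}(r) = C^{\text{net}}_{\text{municipal}}(r) + C_{\text{household}}(r) + C^{\text{ext}}_{\text{disposal}}(r) - B^{\text{ext}}_{\text{recycling}}(r),
\qquad r^{*} = \arg\min_{r}\, \mathrm{SC}(r),
\]

where the external recycling benefit enters with a negative sign because it offsets the three cost terms.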
Abstract:
This article introduces a new construct to the field of management called Psychological Sense of Community (PSOC). This is important because management scholars are calling for the creation of communities in organizations in an environment that lacks appropriate construct development. The aims of this article are threefold: (a) develop a working definition of PSOC via a review of the extant literature on PSOC from other disciplines with the goal of translating it into the domain of management, (b) synthesize findings from parallel literatures on the outcomes of PSOC with an eye toward exploring the relevance of such outcomes in management contexts, and (c) assess the value of PSOC as it relates to its uniqueness in relation to other prominent management constructs and its scope of applicability in a variety of management inquiry areas.
Abstract:
The primary objective of this thesis is to demonstrate the pernicious impact that moral hierarchies have on our perception and subsequent treatment of non-human animals. Moral hierarchies in general are characterized by a dynamic in which one group is considered to be fundamentally superior to a lesser group. This thesis focuses specifically on the moral hierarchies that arise when humans are assumed to be superior to non-human animals in virtue of their advanced mental capabilities. The operative hypothesis of this thesis is that moral hierarchies thwart the provision of justice to non-human animals in that they function as a justification for otherwise impermissible actions. When humans are assumed to be fundamentally superior to non-human animals, it becomes morally permissible for humans to kill non-human animals and utilize them as mere instrumentalities. This thesis is driven primarily by an in-depth analysis of the approaches to animal rights provided by Peter Singer, Tom Regan, and Gary Francione. Each of these thinkers claims to overcome anthropocentrism and to provide an approach that precludes the establishment of a moral hierarchy. One of the major findings of this thesis, however, is that Singer and Regan offer approaches that remain highly anthropocentric despite their claims to the contrary. The anthropocentrism persists in these approaches because each thinker gives preference to humans: Regan and Singer have different conceptions of the criteria required to afford a being moral worth, but both privilege beings that have the cognitive ability to form desires regarding the future. As a result, a moral hierarchy emerges in which humans are regarded as fundamentally superior. Francione, however, provides an approach that does not foster a moral hierarchy. Francione creates such an approach by applying the principle of equal consideration of interests in a consistent manner. Moreover, Francione argues that mere sentience is both a necessary and a sufficient condition for being eligible for, and subsequently receiving, moral consideration. The upshot of this thesis is that the moral treatment of animals is not compatible with the presence of a moral hierarchy. As a result, this thesis demonstrates that future approaches to animal rights must avoid the establishment of moral hierarchies. The research and analysis within this thesis demonstrate that this is not possible, however, unless all theories of justice that are to accommodate animals abandon the notion that cognition matters morally.
Abstract:
This paper estimates the average social cost of municipal waste management as a function of the recycling rate. Social costs include all municipal costs and revenues, costs to recycling households of preparing materials (estimated with an original method), external disposal costs, and external recycling benefits. Results suggest average social costs are minimized at recycling rates well below observed and mandated levels in Japan. Cost-minimizing municipalities are estimated to recycle less than the optimal rate. These results are robust to changes in the components of social costs, indicating that Japan and perhaps other developed countries may be setting inefficiently high recycling goals. (C) 2014 Elsevier Inc. All rights reserved.
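A minimal numerical sketch of the cost-minimization step described above (the quadratic cost curve and its coefficients are invented placeholders, not the paper's estimates): once an average-social-cost function of the recycling rate has been estimated, the optimal rate can be recovered by bounded minimization.

from scipy.optimize import minimize_scalar

def avg_social_cost(r):
    # Hypothetical average social cost (currency units per tonne) as a
    # function of the recycling rate r in [0, 1]; placeholder coefficients.
    return 100.0 - 60.0 * r + 300.0 * r ** 2

res = minimize_scalar(avg_social_cost, bounds=(0.0, 1.0), method="bounded")
print(f"cost-minimizing recycling rate: {res.x:.2f}, cost per tonne: {res.fun:.1f}")

With these placeholder coefficients the minimum falls at a rate of 0.10; the paper's actual estimates would replace the assumed curve.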
Abstract:
Recent legislative and regulatory developments have focused attention on older adults' capacity for involvement in health care decision-making. The Omnibus Budget Reconciliation Act of 1987 (OBRA 87) focused attention on the rights of nursing home residents to be involved in health care decision-making to the fullest extent possible. This article uses data from the 1987 National Medical Expenditure Survey (NMES) to examine rates of incapacity for health care decision-making among nursing home residents. Elements of the Oklahoma statute were used to operationalize decision-making incapacity: disability or disorder, difficulty in decision-making or communicating decisions, and functional disability. Fifty-three percent of nursing home residents had a combination of either physical or mental impairment and an impairment in either self-care or money management. The discussion focuses on the policy and practice implications of significant rates of incapacity among nursing home residents.
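A minimal sketch of the combination used for the 53% estimate above (field names and the example record are hypothetical, not NMES variable names): a resident is flagged when a physical or mental impairment co-occurs with an impairment in either self-care or money management.

def decision_making_incapacity(resident):
    # Two-part combination from the abstract: (physical or mental impairment)
    # AND (impairment in self-care or money management).
    impairment = resident["physical_impairment"] or resident["mental_impairment"]
    functional = resident["self_care_impairment"] or resident["money_management_impairment"]
    return impairment and functional

example = {"physical_impairment": False, "mental_impairment": True,
           "self_care_impairment": False, "money_management_impairment": True}
print(decision_making_incapacity(example))  # True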
Abstract:
This paper will explore re-framing historic atrocity and its relationship to Holocaust and Genocide education. The origins of genocide studies and its links to Holocaust studies will be traced to discuss the impact of new scholarship and framings on genocide education in the classroom.
Abstract:
When patients enter our emergency room with suspected multiple injuries, Statscan provides full-body anterior and lateral images for initial diagnosis and then zooms in on specific smaller areas for a more detailed evaluation. In order to examine the possible role of Statscan in the management of multiply injured patients, we implemented a modified ATLS((R)) algorithm in which X-rays of the C-spine, chest, and pelvis were replaced by a single total-body a.p./lat. radiograph. Between 15 October 2006 and 1 February 2007, 143 trauma patients (mean ISS 15+/-14 (3-75)) were included. We compared the time in the resuscitation room to that of 650 patients (mean ISS 14+/-14 (3-75)) treated between 1 January 2002 and 1 January 2004 according to the conventional ATLS protocol. The total-body scanning time was 3.5 min (3-6 min) compared to 25.7 min (8-48 min) for conventional X-rays. The total ER time was unchanged: 28.7 min (13-58 min) compared to 29.1 min (15-65 min) using conventional plain radiography. In 116/143 patients additional CT scans were necessary. In 98/116 patients full-body trauma CT scans were performed. In 18/116 patients selective CT scans were ordered based on Statscan findings. In 43/143 patients additional conventional X-rays had to be performed, mainly due to inadequate a.p. views of fractured bones. All radiographs were transmitted over the hospital network (Picture Archiving and Communication System, PACS) for immediate simultaneous viewing at different locations. The rapid availability of images for interpretation because of their digital nature and the reduced need for repeat exposures due to faulty radiography are also felt to be strengths.
Abstract:
In 2005, Wetland Studies and Solutions, Inc. (WSSI) installed an extensive Low Impact Development (LID) stormwater management system on their new office site in Gainesville, Virginia. The 4-acre site is serviced by a network of LID components: permeable pavements (two proprietary and one gravel type), a bioretention cell / rain garden, a green roof, a vegetated swale, rainwater harvesting with drip irrigation, and slow-release underground detention. The site consists of heavy clay soils, and the LID components are mostly integrated by a series of underdrain pipes. A comprehensive monitoring system has been designed and installed to measure hydrologic performance throughout the underdrained LID network. The monitoring system measures flows into and out of each LID component independently while concurrently monitoring rainfall events. A sensitivity analysis and laboratory calibration have been performed on the flow measurement system. Field data have been evaluated to determine the hydrologic performance of the LID features. Finally, hydrologic models amenable to compact, underdrained LID sites have been reviewed and recommended for future modeling and design.
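As a minimal sketch of the per-component accounting such a monitoring network supports (the flow series, time step, and component below are fabricated for illustration), paired inflow and outflow records can be integrated to estimate the volume a LID feature retains during a storm.

# Hypothetical 5-minute flow records (litres per second) for one bioretention cell.
DT_SECONDS = 300
inflow_lps = [0.0, 2.4, 5.1, 4.0, 1.8, 0.6, 0.0]
outflow_lps = [0.0, 0.2, 1.1, 1.6, 1.3, 0.8, 0.3]

inflow_volume_l = sum(q * DT_SECONDS for q in inflow_lps)
outflow_volume_l = sum(q * DT_SECONDS for q in outflow_lps)
retained_l = inflow_volume_l - outflow_volume_l

print(f"inflow {inflow_volume_l:.0f} L, outflow {outflow_volume_l:.0f} L, "
      f"retained/infiltrated {retained_l:.0f} L "
      f"({100 * retained_l / inflow_volume_l:.0f}% volume reduction)")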
Abstract:
Metals price risk management is a key issue in metal markets because of uncertainty in commodity prices, exchange rates, and interest rates, and the resulting price risk for both producers and consumers. It is therefore a concern for all participants in metal markets, including producers, consumers, merchants, banks, investment funds, speculators, and traders. Managing price risk provides stable income for both producers and consumers and thus increases the chance that a firm will invest in attractive projects. The purpose of this research is to evaluate risk management strategies in the copper market. The main tools of price risk management are hedging and derivatives such as futures contracts, swaps, and options. Hedging is a transaction designed to reduce or eliminate price risk. Derivatives are financial instruments whose returns are derived from other financial instruments, and they are commonly used for managing financial risks. Although derivatives have existed in some form for centuries, their growth has accelerated rapidly during the last 20 years, and they are now widely used by financial institutions, corporations, professional investors, and individuals. This project focuses on the over-the-counter (OTC) market and its products, such as exotic options, particularly Asian options. The first part of the project describes basic derivatives and risk management strategies, along with basic concepts of spot and futures (forward) markets, the benefits and costs of risk management, and the risks and rewards of positions in derivative markets. The second part considers the valuation of commodity derivatives: the option pricing model DerivaGem is applied to Asian call and put options on London Metal Exchange (LME) copper in order to understand how Asian options are valued and to compare theoretical option values with observed market values. Because predicting future trends in copper prices is essential to managing market price risk successfully, the third part discusses econometric commodity models. Based on this literature review, the fourth part reports the construction and testing of an econometric model designed to forecast the monthly average price of copper on the LME. More specifically, it shows how LME copper prices can be explained by a simultaneous-equation structural model (estimated by two-stage least squares) connecting supply and demand variables. The simultaneous econometric model for the copper industry is

\[
\begin{cases}
Q_t^D = e^{-5.0485}\, P_{t-1}^{-0.1868}\, GDP_t^{1.7151}\, e^{0.0158\, IP_t}\\
Q_t^S = e^{-3.0785}\, P_{t-1}^{0.5960}\, T_t^{0.1408}\, P_{OIL(t)}^{-0.1559}\, USDI_t^{1.2432}\, LIBOR_{t-6}^{-0.0561}\\
Q_t^D = Q_t^S
\end{cases}
\]

with the implied reduced-form price equation

\[
P_{t-1}^{CU} = e^{-2.5165}\, GDP_t^{2.1910}\, e^{0.0202\, IP_t}\, T_t^{-0.1799}\, P_{OIL(t)}^{0.1991}\, USDI_t^{-1.5881}\, LIBOR_{t-6}^{0.0717},
\]

where Q_t^D and Q_t^S are world demand for and supply of copper at time t, respectively. P_{t-1} is the lagged price of copper, which is the focus of the analysis in this part. GDP_t is world gross domestic product at time t, representing aggregate economic activity. Industrial production is also considered, so global industrial production growth, denoted IP_t, is included in the model. T_t is a time variable, a useful proxy for technological change. The price of oil at time t, denoted P_OIL(t), is a proxy for the cost of energy in producing copper. USDI_t is the U.S. dollar index at time t, an important variable for explaining copper supply and copper prices. Finally, LIBOR_{t-6} is the 6-month-lagged one-year London Interbank Offered Rate. Although the model could be applied to other base metals industries, omitted exogenous variables, such as the price of a substitute or a combined substitute-price variable, have not been considered in this study. Based on this econometric model and a Monte Carlo simulation analysis, the probabilities that the monthly average copper prices in 2006 and 2007 will exceed specific option strike prices are determined. The final part evaluates risk management strategies, including option strategies, metal swaps, and simple options, in relation to the simulation results. Basic option strategies such as bull spreads, bear spreads, and butterfly spreads, created using both call and put options for 2006 and 2007, are evaluated. Each risk management strategy for 2006 and 2007 is then analyzed based on the available data and the price prediction model. Applications stemming from this project include valuing Asian options, developing a copper price prediction model, forecasting and planning, and decision making for price risk management in the copper market.
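As an illustration of how the reduced-form price equation above can feed the Monte Carlo exceedance analysis, the following sketch draws random scenarios for the explanatory variables and counts how often the implied copper price exceeds a strike; the coefficient values come from the equation above, but the input distributions, units, and strike level are hypothetical placeholders rather than values from the thesis.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical distributions for the explanatory variables (illustrative only).
gdp = rng.normal(1.00, 0.03, n)               # world GDP index
ip = rng.normal(3.0, 1.0, n)                  # industrial production growth, %
t_var = np.full(n, 40.0)                      # time-trend proxy
p_oil = rng.lognormal(np.log(60.0), 0.20, n)  # oil price
usdi = rng.normal(1.00, 0.05, n)              # U.S. dollar index (normalised)
libor = rng.normal(5.0, 0.5, n)               # 6-month-lagged 1-year LIBOR, %

# Reduced-form copper price equation reconstructed from the abstract.
price = (np.exp(-2.5165)
         * gdp ** 2.1910
         * np.exp(0.0202 * ip)
         * t_var ** -0.1799
         * p_oil ** 0.1991
         * usdi ** -1.5881
         * libor ** 0.0717)

strike = 1.05 * np.median(price)  # hypothetical strike, 5% above the simulated median
prob_above = (price > strike).mean()
print(f"estimated P(price > strike): {prob_above:.3f}")

The same exceedance probabilities, computed against the actual strike prices of the 2006 and 2007 options, are what the thesis uses to evaluate the spread and swap strategies.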
Abstract:
Northern hardwood management was assessed throughout the state of Michigan using data collected on recently harvested stands in 2010 and 2011. Methods of forensic estimation of diameter at breast height were compared and an ideal, localized equation form was selected for use in reconstructing pre-harvest stand structures. Comparisons showed differences in predictive ability among available equation forms which led to substantial financial differences when used to estimate the value of removed timber. Management on all stands was then compared among state, private, and corporate landowners. Comparisons of harvest intensities against a liberal interpretation of a well-established management guideline showed that approximately one third of harvests were conducted in a manner which may imply that the guideline was followed. One third showed higher levels of removals than recommended, and one third of harvests were less intensive than recommended. Multiple management guidelines and postulated objectives were then synthesized into a novel system of harvest taxonomy, against which all harvests were compared. This further comparison showed approximately the same proportions of harvests, while distinguishing sanitation cuts and the future productive potential of harvests cut more intensely than suggested by guidelines. Stand structures are commonly represented using diameter distributions. Parametric and nonparametric techniques for describing diameter distributions were employed on pre-harvest and post-harvest data. A common polynomial regression procedure was found to be highly sensitive to the method of histogram construction which provides the data points for the regression. The discriminative ability of kernel density estimation was substantially different from that of the polynomial regression technique.
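A brief sketch of the two distribution-description techniques compared above (the simulated diameters, bin count, and polynomial degree are invented for illustration, not the Michigan stand data): fitting a polynomial to histogram bin counts versus a kernel density estimate on the raw diameters.

import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Hypothetical diameters at breast height (cm) for one stand.
dbh = rng.gamma(shape=4.0, scale=8.0, size=300)

# Histogram-based approach: polynomial regression on bin counts.
# The fitted curve depends on how the histogram is constructed (number of bins).
counts, edges = np.histogram(dbh, bins=12)
centers = (edges[:-1] + edges[1:]) / 2
poly = np.polynomial.Polynomial.fit(centers, counts, deg=3)

# Nonparametric approach: kernel density estimation on the raw data.
kde = gaussian_kde(dbh)

grid = np.linspace(dbh.min(), dbh.max(), 5)
print("diameter (cm):         ", np.round(grid, 1))
print("polynomial (trees/bin):", np.round(poly(grid), 1))
print("KDE density:           ", np.round(kde(grid), 4))

Rerunning the histogram step with a different bin count changes the polynomial fit noticeably, which is the sensitivity the study reports; the KDE instead depends on the bandwidth choice.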
Abstract:
In distribution system operations, dispatchers at the control center closely monitor system operating limits to ensure system reliability and adequacy. This reliability is partly due to the provision of remotely controllable tie and sectionalizing switches. While the stochastic nature of wind generation can impact the level of wind energy penetration in the network, an estimate of the output from wind on an hourly basis can be extremely useful. Under any operating conditions, the switching actions require human intervention and can be extremely stressful. To date, methods that handle a set of switching combinations with the uncertainty of distributed wind generation among the decision variables have been nonexistent. This thesis proposes a three-fold online management framework: (1) prediction of wind speed, (2) estimation of wind generation capacity, and (3) enumeration of feasible switching combinations. The proposed methodology is evaluated on a 29-node test system with 8 remotely controllable switches and two wind farms of 18 MW and 9 MW nameplate capacity, respectively, for generating the sequence of system reconfiguration states during normal and emergency conditions.
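A hedged sketch of step (2), converting a wind-speed estimate into available generation for one of the farms (only the 18 MW nameplate figure comes from the abstract; the cut-in, rated, and cut-out speeds and the cubic ramp are generic power-curve assumptions, not the thesis's model):

def wind_farm_output_mw(wind_speed, nameplate_mw,
                        cut_in=3.0, rated=12.0, cut_out=25.0):
    # Piecewise power-curve approximation: zero below cut-in or above cut-out,
    # cubic ramp between cut-in and rated speed, nameplate output at or above rated.
    if wind_speed < cut_in or wind_speed > cut_out:
        return 0.0
    if wind_speed >= rated:
        return nameplate_mw
    frac = (wind_speed ** 3 - cut_in ** 3) / (rated ** 3 - cut_in ** 3)
    return nameplate_mw * frac

# Hourly wind-speed forecast (m/s) -> estimated output of the 18 MW farm.
forecast = [2.5, 6.0, 9.5, 13.0, 26.0]
print([round(wind_farm_output_mw(v, 18.0), 1) for v in forecast])

The resulting hourly capacity estimates would then be considered, together with the switch states, when enumerating feasible reconfiguration options in step (3).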
Abstract:
The challenges posed by global climate change are motivating the investigation of strategies that can reduce the life cycle greenhouse gas (GHG) emissions of products and processes. While new construction materials and technologies have received significant attention, there has been limited emphasis on understanding how construction processes can be best managed to reduce GHG emissions. Unexpected disruptive events tend to adversely impact construction costs and delay project completion. They also tend to increase project GHG emissions. The objective of this paper is to investigate ways in which project GHG emissions can be reduced by appropriate management of disruptive events. First, an empirical analysis of construction data from a specific highway construction project is used to illustrate the impact of unexpected schedule delays in increasing project GHG emissions. Next, a simulation-based methodology is described to assess the effectiveness of alternative project management strategies in reducing GHG emissions. The contribution of this paper is that it explicitly considers project emissions, in addition to cost and project duration, in developing project management strategies. Practical application of the method discussed in this paper will help construction firms reduce their project emissions through strategic project management, and without significant investment in new technology. In effect, this paper lays the foundation for best practices in construction management that will optimize project cost and duration, while minimizing GHG emissions.
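A minimal sketch of the kind of simulation such a methodology involves (the baseline duration, disruption probability, delay range, and daily emission rate below are invented placeholders, not figures from the paper): sampling disruptive delays and translating the extra working days into added emissions.

import random

random.seed(1)

BASE_DURATION_DAYS = 120       # planned schedule (hypothetical)
DAILY_EMISSIONS_T_CO2E = 4.0   # equipment and site overhead per working day (hypothetical)

def simulate_project(p_disruption=0.05, delay_days=(2, 10)):
    # One project realisation: each planned day may trigger a disruption
    # that adds extra days during which overhead emissions continue.
    duration = BASE_DURATION_DAYS
    for _ in range(BASE_DURATION_DAYS):
        if random.random() < p_disruption:
            duration += random.randint(*delay_days)
    return duration, duration * DAILY_EMISSIONS_T_CO2E

runs = [simulate_project() for _ in range(10_000)]
mean_duration = sum(d for d, _ in runs) / len(runs)
mean_emissions = sum(e for _, e in runs) / len(runs)
print(f"mean duration: {mean_duration:.0f} days, mean emissions: {mean_emissions:.0f} t CO2e "
      f"(no-disruption baseline: {BASE_DURATION_DAYS * DAILY_EMISSIONS_T_CO2E:.0f} t CO2e)")

Comparing such runs under alternative management strategies (for example, a lower disruption probability or shorter recovery times) indicates how much of the emissions increase schedule management can avoid.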
Abstract:
Information management is a key aspect of successful construction projects. Having inaccurate measurements and conflicting data can lead to costly mistakes, and vague quantities can ruin estimates and schedules. Building information modeling (BIM) augments a 3D model with a wide variety of information, which reduces many sources of error and can detect conflicts before they occur. Because new technology is often more complex, it can be difficult to integrate it effectively with existing business practices. In this paper, we answer two questions: How can BIM add value to construction projects? And what lessons can be learned from other companies that use BIM or similar technology? Previous research focused on the technology as if it were simply a tool, observing problems that occurred while integrating new technology into existing practices. Our research instead looks at the flow of information through a company and its network, seeing all the actors as part of an ecosystem. Building upon this idea, we propose the metaphor of an information supply chain to illustrate how BIM can add value to a construction project. The paper concludes with two case studies. The first illustrates a failure in the flow of information that could have been prevented by using BIM. The second profiles a leading design firm that has used BIM products for many years and shows the real benefits of using this technology.