967 results for Kleisli Category
Abstract:
Category hierarchy is an abstraction mechanism for efficiently managing large-scale resources. In an open environment, a category hierarchy inevitably becomes inappropriate for managing resources that constantly change in unpredictable patterns, and an inappropriate category hierarchy misleads the management of resources. The increasing dynamicity and scale of online resources heighten the need to maintain category hierarchies automatically. Previous studies of category hierarchy mainly focus either on the generation of a category hierarchy or on the classification of resources under a pre-defined category hierarchy; the automatic maintenance of category hierarchy has been neglected. Making abstractions among categories and measuring the similarity between categories are the two basic operations for generating a category hierarchy. Humans are good at making abstractions but limited in their ability to calculate similarities between large-scale resources; computing models are good at calculating similarities between large-scale resources but limited in their ability to make abstractions. To exploit both human insight and computing power, this paper proposes a two-phase approach to automatically maintaining a category hierarchy at two scales by detecting internal pattern changes in categories. The global phase clusters resources to generate a reference category hierarchy and measures similarity between categories to detect inappropriate categories in the initial category hierarchy; the accuracy of the clustering approach in generating the reference hierarchy determines the rationality of the global maintenance. The local phase detects topical changes and then adjusts inappropriate categories with three local operations. The global phase can quickly target inappropriate categories top-down and carry out cross-branch adjustment, which also accelerates the local-phase adjustments.
The local phase detects and adjusts local-range inappropriate categories that are not adjusted in the global phase. By combining the two complementary phases, the approach significantly improves the topical cohesion and accuracy of the category hierarchy. A new measure is proposed for evaluating category hierarchies that considers not only the balance of the hierarchical structure but also the accuracy of classification. Experiments show that the proposed approach is feasible and effective in adjusting inappropriate category hierarchies. The proposed approach can be used to maintain the category hierarchy for managing various resources in dynamic application environments. It also provides a way to specialize a current online category hierarchy so that resources are organized into more specific categories.
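The "internal pattern change" test that drives this kind of maintenance can be sketched briefly. The following is not the paper's algorithm; the term-frequency vectors, the centroid-cosine cohesion measure, and the 0.8 threshold are all illustrative assumptions:

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two term-frequency vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def centroid(vectors):
    # Component-wise mean of a list of equal-length vectors.
    n = len(vectors)
    return [sum(xs) / n for xs in zip(*vectors)]

def cohesion(vectors):
    # Mean similarity of each member to the category centroid.
    c = centroid(vectors)
    return sum(cosine(v, c) for v in vectors) / len(vectors)

def flag_inappropriate(categories, threshold=0.8):
    # categories: {name: list of member resource vectors}.
    # A category whose internal cohesion drops below the threshold
    # becomes a candidate for adjustment.
    return [name for name, vs in categories.items() if cohesion(vs) < threshold]
```

A category mixing two unrelated topics (near-orthogonal vectors) scores low cohesion and is flagged, while a topically tight category is left alone.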
Abstract:
Default invariance is the idea that default does not change at any scale of law and finance. Default is a conserved quantity in a universe where fundamental principles of law and finance operate. It exists at the micro-level as part of the fundamental structure of every financial transaction, and at the macro-level as a fixed critical point within the relatively stable phases of the law and finance cycle. A key point is that default is equivalent, at the micro-level, to maximizing uncertainty and, at the macro-level, to the phase transition where unbearable fluctuations occur in all forms of risk transformation, including maturity, liquidity and credit. As such, default invariance is the glue that links the micro and macro structures of law and finance. In this essay, we apply naïve category theory (NCT), a type of mapping logic, to these types of phenomena. The purpose of using NCT is to introduce a rigorous (but simple) mathematical methodology to law and finance discourse and to show that these types of structural considerations are of prime practical importance and significance to law and finance practitioners. These mappings imply a number of novel areas of investigation. From the micro-structure, three macro-approximations are implied. These approximations form the core analytical framework which we use to examine the phenomena and hypothesize rules governing law and finance. Our observations from these approximations are grouped into five findings. While the entirety of the five findings can be encapsulated by the three approximations, since the intended audience of this paper is the non-specialist in law, finance and category theory, for ease of access we illustrate the use of the mappings with relatively common concepts drawn from law and finance, focusing especially on financial contracts, derivatives, Shadow Banking, credit rating agencies and credit crises.
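The essay itself contains no code, but its "mapping logic" can be loosely illustrated with a Kleisli-style composition in which None plays the role of default: once any step of a chain of transactions defaults, the whole composite defaults. The functions and numbers below are invented for illustration and are not drawn from the essay:

```python
from typing import Callable, Optional, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def kleisli(f: Callable[[A], Optional[B]],
            g: Callable[[B], Optional[C]]) -> Callable[[A], Optional[C]]:
    # Compose two partial mappings; None ("default") propagates unchanged.
    def composed(x: A) -> Optional[C]:
        y = f(x)
        return None if y is None else g(y)
    return composed

# Toy risk transformations: either step may "default" by returning None.
def maturity_transform(x: float) -> Optional[float]:
    return x * 1.05 if x > 0 else None

def liquidity_transform(x: float) -> Optional[float]:
    return x - 1.0 if x >= 1.0 else None

pipeline = kleisli(maturity_transform, liquidity_transform)
```

Composition is what makes default "invariant" in this toy reading: the default value is preserved by every mapping in the chain, at whatever scale the chain is built.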
Abstract:
The paper problematises the category of noncitizenship. It traces the category's trajectory in accounts of inclusive citizenship and argues that noncitizenship is difficult to theorise as a distinct theoretical category outside of citizenship. To support this argument, the paper distinguishes between pluralist, political and democratic variants of accounts of inclusive citizenship and shows how they all end up reducing noncitizenship to a journey towards citizenship. To overcome this limit, the paper develops the idea of subversive politicisation and suggests that injustices and inequalities can be challenged without falling back on the vocabulary of citizenship.
Abstract:
The literature shows that category management is an important concept and tool for retailers and suppliers, but that there is a trend towards a more shopper-centric category management approach, linked to the shopper-marketing approach. However, knowledge on this issue is scarce in some retailing sectors, such as convenience stores. The present study is focused on convenience stores, with the main purpose of finding out to what extent non-major food retailers successfully adopt shopper-centric category management. The study is relevant in order to evaluate whether a more shopper-centric approach is adequate for smaller companies/stores. To accomplish that goal, an exploratory qualitative study was conducted among convenience store retailers and suppliers. Six semi-structured face-to-face interviews were conducted with Commercial Directors and Trade Marketing Managers. These data were complemented with thirteen interviews with shopper marketing experts. The data were analyzed using the thematic content analysis technique, identifying themes, categories, subcategories, units of meaning and relations. The results revealed that convenience store retailers use some of the principles and techniques of the shopper-marketing and shopper-centric category management approaches, although in a non-standardized and non-formal process. Their suppliers (the manufacturers) do so in a more formal and structured manner, probably as a result of previous interaction with major supermarket chains. Both direct and indirect evidence of a shopper-centric approach was found, although it was slight, discreet and not formal.
Abstract:
Neo-Dandy was a practice-led research project that explored histories of a quintessential men's and womenswear garment from across the ages — the formal white dress shirt. The aim was to generate a body of radically new men's shirts that, whilst incorporating characteristics normally associated with womenswear, would remain acceptable to male wearers. A detailed study identified a broad spectrum of historical design approaches, ranging from the orthodox man's shirt to the many variations of the women's blouse. Within this spectrum a threshold was discovered where the men's shirt morphed into the woman's blouse — a 'design moment' that appeared to typify the dandy figure (a fashion character who subversively confronts dress norms of their day). The research analysed thousands of archive catwalk images from leading contemporary menswear designers, and of these, only a small number tampered appreciably with the men's white dress shirt — suggesting a new realm of possibility for fashion design innovation. This led to the creation of a new body of work labelled 'Neo-Dandy'. Sixty 'concept shirts' were produced, with differing styles and varying degrees of detailing, that fitted the brief of being acceptable to male wearers, eminently 'wearable' and on a threshold position between menswear and womenswear. These designs were each tested, documented, and assessed in their capacity to evolve the Neo-Dandy aesthetic. Based on these outcomes, a list of key design principles for achieving this aesthetic was identified to assist designers in further evolving this style. The creative work achieved substantial public acclaim with the 'Neo Dandy Collection' winning a prestigious Design Institute of Australia Award (Lifestyle category) and being one of four finalists in the overall field for design excellence.
It was subsequently curated into three major Brisbane exhibitions — the ARC Biennial, at Artisan Gallery and the industry leader, the Mercedes Benz Fashion Festival. The collection was also exhibited at the Queensland Art Gallery.
Abstract:
New mobile digital communication technologies present opportunities for advertisers to capitalize on the evolving relationships of consumers with their mobile devices and their desire to access enhanced information services while mobile (m-services). Consumers already use mobile devices (cell phones, personal mobile digital assistants) for traditional phone calls and message handling (e.g., Kalakota and Robinson, 2002; Sullivan Mort and Drennan, 2002). The combination of rapidly developing mobile digital technology and high uptake rates of mobile devices presents enormous potential for delivery of m-services through these devices (Bitner, Brown, and Meuter, 2000). M-services encompass a wide variety of types, including the ability to trade stock, to book theater and movie tickets while accessing seating plans online, to send and receive text and pictures, and to receive personalized direct advertising such as alerts for shopping bargains. Marketing communications, and specifically advertising, may be delivered as an m-service and termed m-services advertising, forming part of the broader category of m-services. However, advertising research has not yet addressed the area of m-services and needs to do so to be able to take advantage of the advanced interactivity (Yadav and Varadarajan, 2005) of mobile communication devices. Such advertising research is likely to help develop open attitudes and responses to new business models, as has been advocated for other new technology such as advanced television (Tauder, 2005). In this article, we model the factors influencing the use of m-services, in the context of consumers' existing relationships with mobile devices. First, we address the value propositions underpinning consumer involvement with mobile devices. Next, we canvass the types of involvement relevant to this consumption domain and argue that involvement, together with the personal attributes of innovativeness and self-efficacy, will influence use of m-services.
Finally, implications for advertising delivered as an m-service are discussed, the potential for m-services advertising as part of m-commerce is canvassed, and directions for future research are identified.
Abstract:
We spent a fair amount of time thinking and debating where to draw the line between what is and what is not single-screen-based interactive media. This really is a tricky category. I would like to use this opportunity to raise certain issues about this very new category introduced this year to ifva. First of all, what do we mean by "interactive" media? If we try to describe it conceptually or philosophically, almost every artifact (not only those intended as a piece of art) can be perceived as "interactive" media as soon as one sees or recognises it and begins interacting with it physically and/or mentally. What about limiting this to computer-related media? This certainly narrows the scope, but it is becoming increasingly difficult to find art and design considered innovative without the use of a computer. The term "single-screen" certainly makes it more specific, but as we saw from the range of works submitted to this category, people come up with various interpretations of it. Some simply submitted work that could be viewed on a computer screen and allowed little user participation, while others provided varying degrees of user/audience participation. What does "single-screen-based interactive media" mean?
Abstract:
Principal Topic: It is well known that most new ventures suffer from a significant lack of resources, which increases the risk of failure (Shepherd, Douglas and Shanley, 2000) and makes it difficult to attract stakeholders and financing for the venture (Bhide & Stevenson, 1999). The Resource-Based View (RBV) (Barney, 1991; Wernerfelt, 1984) is a dominant theoretical base increasingly drawn on within strategic management. While theoretical contributions applying RBV in the domain of entrepreneurship can arguably be traced back to Penrose (1959), there has been renewed attention recently (e.g. Alvarez & Busenitz, 2001; Alvarez & Barney, 2004). This said, empirical work is in its infancy, in part perhaps because of a lack of well-developed measuring instruments for testing ideas derived from RBV. The purpose of this study is to develop measurement scales that can assist such empirical investigations. In doing so we try to overcome three deficiencies in the empirical measures currently used to apply RBV to the entrepreneurship arena. First, measures need to be developed for resource characteristics and configurations associated with the competitive advantages typically found in entrepreneurial firms. These include such things as alertness and industry knowledge (Kirzner, 1973), flexibility (Ebben & Johnson, 2005), strong networks (Lee et al., 2001) and, within knowledge-intensive contexts, unique technical expertise (Wiklund and Shepherd, 2003). Second, the RBV has the important limitations of being relatively static and modelled on large, established firms. In that context, traditional RBV focuses on competitive advantages. However, newly established firms often face disadvantages, especially those associated with the liabilities of newness (Aldrich & Auster, 1986). It is therefore important in entrepreneurial contexts to expand to an investigation of responses to competitive disadvantage through an RBV lens.
Conversely, recent research has suggested that resource constraints actually have a positive effect on firm growth and performance under some circumstances (e.g., George, 2005; Katila & Shane, 2005; Mishina et al., 2004; Mosakowski, 2002; cf. also Baker & Nelson, 2005). Third, current empirical applications of RBV measure the levels or amounts of particular resources available to a firm. They infer that these resources deliver competitive advantage by establishing a relationship between these resource levels and performance (e.g. via regression on profitability). However, there is the opportunity to directly measure the characteristics of resource configurations that deliver competitive advantage, such as Barney's well-known VRIO (Valuable, Rare, Inimitable and Organized) framework (Barney, 1997). Key Propositions and Methods: The aim of our study is to develop and test scales for measuring resource advantages (and disadvantages) and inimitability for entrepreneurial firms. The study proceeds in three stages. The first stage developed our initial scales based on earlier literature; where possible, we adapted scales from previous work. The first block of scales related to the level of resource advantages and disadvantages. Respondents were asked the degree to which each resource category represented an advantage or disadvantage relative to other businesses in their industry on a 5-point response scale: Major Disadvantage, Slight Disadvantage, No Advantage or Disadvantage, Slight Advantage and Major Advantage. Items were developed as follows. Network capabilities (3 items) were adapted from Madsen, Alsos, Borch, Ljunggren and Brastad (2006). Knowledge resources for marketing expertise/customer service (3 items) and technical expertise (3 items) were adapted from Wiklund and Shepherd (2003). Flexibility (2 items) and costs (4 items) were adapted from JIBS B97. New scales were developed for industry knowledge/alertness (3 items) and product/service advantages.
The second block asked the respondent to nominate the most important resource advantage (and disadvantage) of the firm. For the advantage, they were then asked four questions to determine how easy it would be for other firms to imitate and/or substitute this resource, on a 5-point Likert scale. For the disadvantage, they were asked corresponding questions related to overcoming this disadvantage. The second stage involved two pre-tests of the instrument to refine the scales. The first was an online convenience sample of 38 respondents. The second pre-test was a telephone interview with a random sample of 31 Nascent firms and 47 Young firms (< 3 years in operation) generated using a PSED method of randomly calling households (Gartner et al., 2004). Several items were dropped or reworded based on the pre-tests. The third stage (currently in progress) is part of Wave 1 of CAUSEE (Nascent Firms) and FEDP (Young Firms), a PSED-type study being conducted in Australia. The scales will be tested and analysed with random samples of approximately 700 Nascent and Young firms respectively. In addition, a judgement sample of approximately 100 high-potential businesses in each category will be included. Findings and Implications: The main study (stage 3; data collection currently in progress) will allow comparison of the level of resource advantage/disadvantage across various sub-groups of the population. Of particular interest will be a comparison of the high-potential firms with the random sample. Based on the smaller pre-tests (N=38 and N=78), the factor structure of the items confirmed the distinctiveness of the constructs. The reliabilities are within an acceptable range: Cronbach alphas ranged from 0.701 to 0.927. The study will provide an opportunity for researchers to better operationalize RBV theory in studies within the domain of entrepreneurship.
This is a fundamental requirement for the ability to test hypotheses derived from RBV in systematic, large scale research studies.
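The reported reliabilities (Cronbach alphas of 0.701 to 0.927) can be computed from raw item scores with a short stdlib-only routine. The function below is the standard textbook formula, not code from the study, and the example data are invented:

```python
from statistics import pvariance

def cronbach_alpha(items):
    # items: one list per scale item, each holding that item's score for
    # every respondent (all lists the same length).
    k = len(items)
    item_variance_sum = sum(pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_variance_sum / pvariance(totals))
```

Perfectly correlated items give an alpha of exactly 1.0; values above roughly 0.7, as in the pre-tests, are conventionally treated as acceptable.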
Abstract:
Objective: This paper explores the effects of perceived stage of cancer (PSOC) on carers' anxiety and depression during the patients' final year. Methods: A consecutive sample of patients and carers (N=98) was surveyed at regular intervals regarding PSOC, and anxiety and depression using the Hospital Anxiety and Depression Scale. Means were compared by gender using the Mann-Whitney U-test. The chi-square test was used to analyse categorical data. Agreement between carers' and patients' PSOC was estimated using kappa statistics. Correlations between carers' PSOC and their anxiety and depression were calculated using Spearman's rank correlation. Results: Over time, an increasing proportion of carers reported that the cancer was advanced, culminating at 43% near death. Agreement regarding PSOC was fair (kappa=0.29-0.34) until near death (kappa=0.21). Carers' anxiety increased over the year; depression increased in the final 6 months. Females were more anxious (p=0.049, 6 months; p=0.009, 3 months) than males, and more depressed until 1 month before death. The proportion of carers reporting moderate-severe anxiety almost doubled over the year to 27%, with more females in this category at 6 months (p=0.05). The proportion of carers with moderate-severe depression increased from 6% to 15% over the year. Increased PSOC was weakly correlated with increased anxiety and depression. Conclusions: Carers' anxiety exceeded depression in severity during advanced cancer. Females generally experienced greater anxiety and depression. Carers were more realistic than patients regarding the ultimate outcome, which was reflected in their declining mental health, particularly near the end.
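The agreement statistic used above (Cohen's kappa) is straightforward to reproduce. This stdlib sketch implements the standard chance-corrected formula, with invented ratings rather than the study's data:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    # Chance-corrected agreement between two paired categorical ratings,
    # e.g. carer vs patient perceived stage of cancer.
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa in the 0.21-0.40 band, as reported, is conventionally read as "fair" agreement; 1.0 is perfect agreement.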
Abstract:
This paper presents a phenomenographic analysis of the conceptions of teaching and learning held by a sample of 16 secondary school teachers in two Australian schools. It provides descriptions of four categories, derived from pooled data, of the ways in which these teachers thought about teaching and about learning, their teaching strategies, and their focus on student or content. The categories for teaching and learning are described with each teacher allocated to the category most typical of their conceptions of teaching and of learning. The lack of congruence, in some cases, between the conceptions of teaching and of learning held by these teachers is discussed.
Abstract:
Key topics: Since the birth of the Open Source movement in the mid-80s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow use of the software in exchange for a fee, open source licenses grant users more rights, such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (de Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activity: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and of IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature so far identifies and depicts only two generic types of business models for open source software publishers: the bundling business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are: (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context. To do so, this paper draws upon an exploratory case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not only considered as a way of generating income (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses the business models from the point of view of these two components.
Abstract:
Reliable budget/cost estimates for road maintenance and rehabilitation are subject to uncertainties and variability in road asset condition and in the characteristics of road users. The CRC CI research project 2003-029-C ‘Maintenance Cost Prediction for Road’ developed a method for assessing variation and reliability in budget/cost estimates for road maintenance and rehabilitation. The method is based on probability-based reliability theory and statistical methods. The next stage of the current project is to apply the developed method to predict maintenance/rehabilitation budgets/costs of large networks for strategic investment. The first task is to assess the variability of road data. This report presents initial results of the analysis of the variability of road data. A case study for dry, non-reactive soil is presented to demonstrate the concept of analysing the variability of road data for large road networks. In assessing this variability, large road networks were grouped into categories with common characteristics according to soil and climatic conditions, pavement conditions, pavement types, surface types and annual average daily traffic. The probability distributions, statistical means and standard deviations of asset conditions and annual average daily traffic for each category were quantified. The probability distributions and statistical information obtained in this analysis will be used to assess the variation and reliability in budget/cost estimates at a later stage. Conventionally, the mean values of the asset data for each category are used as input values for investment analysis, and the variability of the asset data within each category is not taken into account. This analysis demonstrates that the method can be applied in practice, taking the variability of road data into account when analysing large road networks for maintenance/rehabilitation investment.
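The per-category summarisation described above (grouping road segments, then quantifying means and standard deviations) can be sketched with the standard library alone; the category keys and readings below are invented examples, not project data:

```python
from collections import defaultdict
from statistics import mean, stdev

def summarise_by_category(records):
    # records: (category, reading) pairs, where the category key encodes
    # soil/climate, pavement type, surface type, traffic band, etc.
    groups = defaultdict(list)
    for category, reading in records:
        groups[category].append(reading)
    return {
        category: {
            "n": len(readings),
            "mean": mean(readings),
            "std": stdev(readings) if len(readings) > 1 else 0.0,
        }
        for category, readings in groups.items()
    }
```

Feeding these per-category distributions into the investment analysis, instead of a single network-wide mean, is precisely the step the report argues for.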
Abstract:
Objective-To establish the demographic, health status and insurance determinants of pre-hospital ambulance non-usage for patients with emergency medical needs. Methods-Triage category, date of birth, sex, marital status, country of origin, method and time of arrival, ambulance insurance status, diagnosis, and disposal were collected for all patients who presented over a four month period (n=10 229) to the emergency department of a major provincial hospital. Data for patients with urgent (n=678) or critical care needs (n=332) who did not use pre-hospital care were analysed using Poisson regression. Results-Only a small percentage (6.6%) of the total sample were triaged as having urgent medical needs or critical care needs (3.2%). Predictors of usage for those with urgent care needs included age greater than 65 years (prevalence ratio (PR)=0.54; 95% confidence interval (CI)=0.35 to 0.83), being admitted to intensive care or transferred to another hospital (PR=0.62; 95% CI=0.44 to 0.89) or ward (PR=0.72; 95% CI=0.56 to 0.93) and ambulance insurance status (PR=0.67; 95% CI=0.52 to 0.86). Sex, marital status, time of day and country of origin were not predictive of usage or non-usage. Predictors of usage for those with critical care needs included age 65 years or greater (PR=0.45; 95% CI=0.25 to 0.81) and a diagnosis of trauma (PR=0.49; 95% CI=0.26 to 0.92). A non-English speaking background was predictive of non-usage (PR=1.98; 95% CI=1.06 to 3.70). Sex, marital status, time of day, triage and ambulance insurance status were not predictive of non-usage. Conclusions-Socioeconomic and medical factors variously influence ambulance usage depending on the severity or urgency of the medical condition. Ambulance insurance status was less of an influence as severity of condition increased, suggesting that, at a critical level of urgency, patients without insurance are willing to pay for a pre-hospital ambulance service.
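The study derives its prevalence ratios from Poisson regression; a crude (unadjusted) prevalence ratio with a log-normal 95% CI can nevertheless be computed directly from a two-by-two table, as sketched below. The formula is the standard approximation and the counts are invented:

```python
from math import exp, log, sqrt

def prevalence_ratio(exposed_cases, exposed_total,
                     unexposed_cases, unexposed_total):
    # Crude prevalence ratio with a 95% CI from the log-normal approximation.
    pr = (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)
    se = sqrt(1 / exposed_cases - 1 / exposed_total
              + 1 / unexposed_cases - 1 / unexposed_total)
    return pr, exp(log(pr) - 1.96 * se), exp(log(pr) + 1.96 * se)
```

A PR below 1 with a CI excluding 1, as reported for age greater than 65 years, indicates lower usage in that group; the regression-adjusted estimates in the paper additionally control for the other covariates.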
Abstract:
Recent years have seen intense scrutiny focused on the reported ethical breaches of enterprises across the globe. At the forefront of the accompanying criticism are the actions of giant American firms such as WorldCom, Arthur Andersen, and Enron. However, such deviations from acceptable standards of conduct have not been confined to the American market. Australia endured its era of “corporate excess” in the 1980s [Milton-Smith, 1997]. As a result, a spate of ethics-based research was undertaken in the early 1990s. More recently, China has been identified as a major venue for behavior deemed to be unacceptable, even unsafe. Issues such as counterfeit fashion items, software, and automobile parts have been a concern for several years [Gonzalez, 2007]. Perhaps more disconcerting are the recent recalls of children’s products, many of which were produced for leading toy companies such as Mattel and Fisher-Price, because of the use of dangerous lead-based paint. As one might anticipate, news reports and consumer protection agencies have been quick to condemn any action that falls within the “controversial” category. Indeed, many segments of society characterize such actions as unethical behavior. One result of this increased level of concern is the higher level of attention given to ethics in higher education programs. Even accreditation bodies such as AACSB have virtually mandated the integration of ethics into the curriculum. As a consequence, academicians have ramped up their ethics-based research agendas.