826 results for "knowing in consulting"
Abstract:
This thesis examines the ways that libraries have employed computers to assist with housekeeping operations. It considers the relevance of such applications to company libraries in the construction industry, and describes more specifically the development of an integrated cataloguing and loan system. A review of the main features in the development of computerised ordering, cataloguing and circulation control systems shows that fully integrated packages are beginning to be completed, and that some libraries are introducing second generation programs. Cataloguing is the most common activity to be computerised, both at national and company level. Results from a sample of libraries in the construction industry suggest that the only computerised housekeeping system is at Taylor Woodrow. Most of the firms have access to an in-house computer, and some of the libraries, particularly those in firms of consulting engineers, might benefit from computerisation, but there are differing attitudes amongst the librarians towards the computer. A detailed study of the library at Taylor Woodrow resulted in a feasibility report covering all the areas of its activities. One of the main suggestions was the possible use of a computerised loans and cataloguing system. An integrated system to cover these two areas was programmed in Fortran and implemented. This new system provides certain benefits and saves staff time, but at the cost of time on the computer. Some improvements could be made by reprogramming, but it provides a general system for small technical libraries. A general equation comparing costs for manual and computerised operations is progressively simplified to a form where the annual saving from the computerised system is expressed in terms of staff and computer costs and the size of the library. This equation gives any library an indication of the savings or extra cost which would result from using the computerised system.
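The abstract gives only the shape of that cost-comparison equation, not the equation itself. As a minimal sketch (every parameter name below is hypothetical, not taken from the thesis), the annual saving can be expressed as staff time released minus computer time purchased, with both terms scaling with library throughput:

```python
def annual_saving(items_per_year, manual_staff_hours, computer_staff_hours,
                  staff_rate, computer_hours, computer_rate):
    """Hypothetical form of the manual-vs-computerised comparison:
    saving = staff hours released * staff rate
             - computer hours used * computer rate,
    both terms growing with library size (items_per_year)."""
    staff_saving = items_per_year * (manual_staff_hours - computer_staff_hours) * staff_rate
    computer_cost = items_per_year * computer_hours * computer_rate
    return staff_saving - computer_cost

# E.g. 1,000 loans/catalogue entries a year, 0.5 vs 0.2 staff-hours per item:
saving = annual_saving(1000, 0.5, 0.2, 10.0, 0.01, 50.0)
```

A positive result indicates the computerised system pays for its machine time; a negative one indicates the library is too small for the saving in staff time to cover the computer cost.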
Abstract:
We classify the strategies by which management consultancies can create and sustain the institutional capital that makes it possible for them to extract competitive resources from their institutional context. Using examples from the German consulting industry, we show how localized competitive actions can both enhance individual firms’ positions and strengthen the collective institutional capital of the consulting industry, thus legitimizing consulting services in broader sectors of society and facilitating access to requisite resources. Our findings counter the image of institutional entrepreneurship as individualistic, “heroic” action. We demonstrate how distributed, embedded actors can collectively shape the institutional context from within to enhance their institutional capital.
Abstract:
With few exceptions (e.g. Fincham & Clark, 2002; Lounsbury, 2002, 2007; Montgomery & Oliver, 2007), we know little about how emerging professions, such as management consulting, professionalize and establish their services as a taken-for-granted element of social life. This is surprising given that professionals have long been recognized as “institutional agents” (DiMaggio & Powell, 1983; Scott, 2008) (see Chapter 17) and professionalization projects have been closely associated with institutionalization (DiMaggio, 1991). Therefore, in this chapter we take a closer look at a specific type of entrepreneurship in PSFs; drawing on the concept of “institutional entrepreneurship” (DiMaggio, 1988; Garud, Hardy, & Maguire, 2007; Hardy & Maguire, 2008) we describe some generic strategies by which proto-professions can enhance their “institutional capital” (Oliver, 1997), that is, their capacity to extract institutionally contingent resources such as legitimacy, reputation, or client relationships from their environment.
Abstract:
In this paper a Hierarchical Analytical Network Process (HANP) model is demonstrated for evaluating alternative technologies for generating electricity from MSW in India. The technological alternatives and evaluation criteria for the HANP study are characterised by reviewing the literature and consulting experts in the field of waste management. Technologies reviewed in the context of India include landfill, anaerobic digestion, incineration, pelletisation and gasification. To investigate the sensitivity of the result, we examine variations in expert opinions and carry out an Analytical Hierarchy Process (AHP) analysis for comparison. We find that anaerobic digestion is the preferred technology for generating electricity from MSW in India. Gasification is indicated as the preferred technology in an AHP model due to the exclusion of criteria dependencies and in an HANP analysis when placing a high priority on net output and retention time. We conclude that HANP successfully provides a structured framework for recommending which technologies to pursue in India, and the adoption of such tools is critical at a time when key investments in infrastructure are being made. Therefore the presented methodology is thought to have a wider potential for investors, policy makers, researchers and plant developers in India and elsewhere.
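The paper's HANP model is not reproduced in the abstract, but the AHP step it is compared against can be sketched. The following uses the generic row-geometric-mean approximation of the AHP priority vector, with a made-up three-criterion comparison matrix (not the authors' data):

```python
import math

def ahp_priorities(pairwise):
    """Approximate the AHP priority vector by the row geometric mean of a
    reciprocal pairwise-comparison matrix, normalised to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical comparison of three criteria (e.g. net output, cost, retention time):
# entry [i][j] says how much more important criterion i is than criterion j.
matrix = [[1.0, 3.0, 5.0],
          [1 / 3, 1.0, 2.0],
          [1 / 5, 1 / 2, 1.0]]
weights = ahp_priorities(matrix)
```

The geometric-mean method is a standard closed-form stand-in for the principal-eigenvector calculation; HANP additionally models dependencies among criteria, which this sketch deliberately omits.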
Abstract:
Enterprise Resource Planning (ERP) projects are strategic and capital intensive, so failure may be costly and even cause bankruptcy of companies. Previous studies have proposed ways of improving implementation, but they are mostly generic and follow standardized project management practices as specified in various standards (e.g. the “project management body of knowledge” of the Project Management Institute). Because ERP is interdisciplinary (involving change management, project management and information technology management), it warrants a customized approach to managing risks throughout the life cycle of implementation and operation. Through a practical case study, this paper demonstrates a qualitative, user-friendly approach to ERP project risk management. First, through a literature review it identifies various risk factors in ERP implementation. Second, the risk management practices of a UK-based multinational consulting company in one of its clients are evaluated. The risk factors from the case study organization and the literature are then compared and discussed.
Abstract:
The value of knowing about data availability and system accessibility is analyzed through theoretical models of Information Economics. When a user places an inquiry for information, it is important for the user to learn that the system is inaccessible or the data unavailable, rather than to receive no response at all. In reality, various outcomes can be provided by the system: nothing is displayed to the user (e.g., a traffic light that does not operate, a browser that keeps browsing, a telephone that does not answer); random noise is displayed (e.g., a traffic light that displays random signals, a browser that provides disorderly results, an automatic voice message that does not clarify the situation); or a special signal indicates that the system is not operating (e.g., a blinking amber light indicating that the traffic light is down, a browser responding that the site is unavailable, a voice message regretting to tell that the service is not available). This article develops a model to assess the value of the information for the user in such situations by employing the information structure model prevailing in Information Economics. Examples related to data accessibility in centralized and in distributed systems are provided for illustration.
Abstract:
BACKGROUND: Suicide prevention can be improved by knowing which variables physicians take into account when considering hospitalization or discharge of patients who have attempted suicide. AIMS: To test whether suicide risk is an adequate explanatory variable for predicting admission to a psychiatric unit after a suicide attempt. METHODS: Analyses of 840 clinical records of patients who had attempted suicide (66.3% women) at four public general hospitals in Madrid (Spain). RESULTS: 180 (21.4%) patients were admitted to psychiatric units. Logistic regression analyses showed that explanatory variables predicting admission were: male gender; previous psychiatric hospitalization; psychiatric disorder; not having a substance-related disorder; use of a lethal method; delay until discovery of more than one hour; previous attempts; suicidal ideation; high suicidal planning; and lack of verbalization of adequate criticism of the attempt. CONCLUSIONS: Suicide risk appears to be an adequate explanatory variable for predicting the decision to admit a patient to a psychiatric ward after a suicide attempt, although the introduction of other variables improves the model. These results provide additional information regarding factors involved in everyday medical practice in emergency settings.
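The study's logistic regression model is reported only through its significant predictors. A hedged sketch of how such a model scores an individual case follows; the coefficients and intercept are invented for illustration and are not the fitted values from the 840-record sample:

```python
import math

def admission_probability(features, weights, intercept):
    """Logistic model: P(admit) = 1 / (1 + exp(-(intercept + sum(w_i * x_i)))).
    `features` are 0/1 indicators for predictors such as those in the abstract."""
    z = intercept + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Invented coefficients for three of the reported predictors:
# male gender, previous psychiatric hospitalization, use of a lethal method.
weights = [0.6, 1.1, 1.4]
p_high = admission_probability([1, 1, 1], weights, intercept=-2.0)
p_low = admission_probability([0, 0, 0], weights, intercept=-2.0)
```

Positive coefficients raise the predicted admission probability, which is how risk factors like high suicidal planning would enter the reported model.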
Abstract:
I describe and discuss a series of court cases which focus on decoding the meaning of slang terms. Examples include sexual slang used in a description by a child and an Internet Relay Chat containing a conspiracy to murder. I consider the task presented by these cases for the forensic linguist and the roles the linguist may assume in determining the meaning of slang terms for the courts. These roles are identified as linguist as naïve interpreter, lexicographer, case researcher and cultural mediator. Each of these roles is suggestive of different strategies that might be used, from consulting formal slang dictionaries and less formal Internet sources, to collecting case-specific corpora and examining all the extraneous material in a particular case. Each strategy is evaluated both in terms of the strength of evidence provided and its applicability to the forensic context.
Abstract:
The spectral quality of radiation in the understory of two neotropical rainforests, Barro Colorado Island in Panama and La Selva in Costa Rica, is profoundly affected by the density of the canopy. Understory light conditions in both forests bear similar spectral characteristics. In both forests, the greatest changes in spectral quality occur at low flux densities, as in the transition from extreme shade to small light flecks. Change in spectral quality, as assessed by the red:far-red (R:FR) ratio, the ratio of radiant energy 400-700:300-1100 nm, and the ratio of quantum flux density 400-700:300-1100 nm, is strongly correlated with a drop in the percentage of solar radiation as measurable by a quantum radiometer. Thus, by knowing the percentage of photosynthetic photon flux density (PPFD) in relation to full sunlight, it is possible to estimate the spectral quality in the forest at a particular time and microsite.
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem-solving process by means of a knowledge-based (KB) system. A number of prototype KB systems have been proposed; however, they have many shortcomings. Few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. There has been no empirical study that experimentally tested the effectiveness of any of these KB tools. The problem-solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project a consulting system for conceptual database design that addresses the above shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation, system restrictiveness and decisional guidance, were used and compared in this project. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative and suggestive guidance throughout the design process. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge implementation strategy, restrictive or guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved the student subjects solving a task without using the system (pre-treatment task) and another task using one of the three systems (experimental task).
The experimental task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results are: (1) the knowledge-based approach to database design support led to more accurate solutions than the control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best overall performance; and (4) the subjects perceived the Restrictive system as easier to use than the Guidance system.
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as “histogram binning” inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with such a dilemma for many years, resorting either to hardware approaches that are rather costly with inherent calibration and noise effects, or to software techniques based on filtering the binning effect but without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for its optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of such an inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed.
This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seem intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
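The dissertation's actual derivations and the patented mapping are not given in the abstract. The following sketch only illustrates the general idea it describes: a precomputed lookup table maps each linear channel to a fractional log-scale position, and each count is split across the two adjacent output bins rather than dumped into a single integer bin. All names, parameters and the linear splitting rule are my assumptions:

```python
import math

def build_log_lut(n_linear, n_log, max_value):
    """Hypothetical lookup table: for each linear channel, the fractional
    position on an n_log-bin logarithmic axis, stored as (lower_bin, frac)."""
    lut = []
    for ch in range(n_linear):
        value = max(ch, 1) * max_value / n_linear
        pos = (n_log - 1) * math.log(value) / math.log(max_value)
        pos = min(max(pos, 0.0), n_log - 1)
        lo = int(pos)
        lut.append((lo, pos - lo))
    return lut

def accumulate(events, lut, n_log):
    """Accumulate events (linear channel indices) into a log-scale histogram,
    splitting each count across two adjacent bins so no mass is lost."""
    hist = [0.0] * n_log
    for ch in events:
        lo, frac = lut[ch]
        hist[lo] += 1.0 - frac
        if lo + 1 < n_log:
            hist[lo + 1] += frac
    return hist
```

Splitting the count fractionally keeps the total number of events exact, which is the kind of statistics-preserving property the abstract claims for the real method.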
Abstract:
Recent advances in electronic and computer technologies have led to the widespread deployment of wireless sensor networks (WSNs). WSNs have a wide range of applications, including military sensing and tracking, environment monitoring, and smart environments. Many WSNs carry mission-critical tasks, such as military applications, so security issues in WSNs remain at the foreground of research. Compared with other wireless networks, such as ad hoc and cellular networks, security in WSNs is more complicated due to the constrained capabilities of sensor nodes and the properties of the deployment, such as large scale and hostile environments. Security issues mainly come from attacks. In general, attacks in WSNs can be classified as external or internal. In an external attack, the attacking node is not an authorized participant of the sensor network; cryptography and other security methods can prevent some external attacks. However, node compromise, the major and unique problem that leads to internal attacks, eliminates all these prevention efforts. Knowing the probability of node compromise helps systems detect and defend against it. Although some approaches can detect and defend against node compromise, few of them can estimate its probability. Hence, we develop basic uniform, basic gradient, intelligent uniform and intelligent gradient models for node compromise distribution, using probability theory, in order to adapt to different application environments. These models allow systems to estimate the probability of node compromise; applying them in system security designs can improve system security and decrease overheads in nearly every security area.
Moreover, based on these models, we design a novel secure routing algorithm to defend against the routing security issue posed by nodes that have already been compromised but have not yet been detected by the node compromise detection mechanism. The routing paths in our algorithm detour around nodes that have been detected as compromised or that have a high probability of being compromised. Simulation results show that our algorithm is effective in protecting routing paths from node compromise, whether detected or not.
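The routing algorithm itself is not spelled out in the abstract. A minimal sketch of the detouring idea follows; the graph layout, the probability table, and the choice to weight each relay by -log(1 - p) (so that a shortest path maximises the probability that no relay is compromised) are my assumptions, not the authors' design, and each p must be strictly below 1:

```python
import heapq
import math

def safest_path(adj, compromise_prob, src, dst, detected=frozenset()):
    """Dijkstra over node costs -log(1 - p): skip nodes already flagged as
    compromised, and prefer relays with low compromise probability."""
    def cost(n):
        return -math.log(1.0 - compromise_prob.get(n, 0.0))

    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue  # stale queue entry
        for v in adj.get(u, []):
            if v in detected:
                continue  # detour around known-compromised nodes
            nd = d + cost(v)
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in prev and dst != src:
        return None  # no route avoiding detected nodes
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1]
```

With this weighting, a path through a relay that is 50% likely to be compromised costs more than one through a relay at 10%, so the route detours exactly as the abstract describes; flagging a node in `detected` removes it from consideration entirely.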
Abstract:
The purpose of this project was to evaluate the use of remote sensing 1) to detect and map Everglades wetland plant communities at different scales; and 2) to compare map products delineated and resampled at various scales with the intent to quantify and describe the quantitative and qualitative differences between such products. We evaluated data provided by Digital Globe’s WorldView 2 (WV2) sensor with a spatial resolution of 2m and data from Landsat’s Thematic and Enhanced Thematic Mapper (TM and ETM+) sensors with a spatial resolution of 30m. We were also interested in the comparability and scalability of products derived from these data sources. The adequacy of each data set to map wetland plant communities was evaluated utilizing two metrics: 1) model-based accuracy estimates of the classification procedures; and 2) design-based post-classification accuracy estimates of derived maps.
Abstract:
Health consumers worldwide obtain nutrition information from various sources; however, the sources Trinidadians and Tobagonians accessed were unclear. This cross-sectional, descriptive study ascertained from which sources Trinidadians and Tobagonians obtained nutrition information. Participants (n = 845) were surveyed with questions regarding demographics and nutrition information sources. Nearly 100% agreed nutrition information was important. Persons 18-64 years old mainly accessed print media (p < 0.01) and those ≥ 65 years old predominantly accessed non-print media. Significantly more tertiary-educated people ≥ 35 years old retrieved information from print media (p = 0.001), health care professionals (p = 0.001), food labels (p = 0.006), and non-print media (p = 0.03) when compared to those < 35 years with similar education. Tertiary-educated people (67%) selected the Internet when compared to those without tertiary education (33%) (p < 0.001). Knowing which nutrition information sources are accessed, dietitians will be able to provide consistent, accurate, age-specific nutrition information and promote healthy eating among Trinidadians and Tobagonians.