128 results for Spreadsheets
Abstract:
Most widely used computer software packages, such as word processors, spreadsheets and web browsers, incorporate comprehensive help systems, partly because the software is meant for those with little technical knowledge. This paper identifies four systematic philosophies or approaches to help system delivery: the documentation approach, based on written documents, either paper-based or online; the training approach, offered either before the user starts working with the software or on the job; intelligent help, that is, online context-sensitive help or help relying on software agents; and finally an approach based on minimalism, defined as providing help only when and where it is needed.
Abstract:
From a sociological perspective, this work aims to identify the professional profile of the teaching supervisor working in the Secretaria de Estado da Educação de São Paulo, examining whether their work is oriented towards pedagogical or administrative matters, and what the political commitment and educational conceptions of these professionals are. Drawing on Tragtenberg, Silva Jr and Huert, among other authors who deal specifically with teaching supervision, a historical analysis was made of the legislation, of spreadsheets and of data collected through statements and interviews with tenured teaching supervisors from three Diretorias de Ensino of Greater São Paulo. The research findings point to the predominance of a bureaucratic, inspection-oriented profile in the teaching supervisor's work, and to a role as executor of public policies rather than participant in their formulation. The teaching supervisor represents a political power that favours a given order, even though there are attempts to overcome unilateral representativity within teaching supervision.
Abstract:
This thesis presents a comparison of integrated biomass to electricity systems on the basis of their efficiency, capital cost and electricity production cost. Four systems are evaluated: combustion to raise steam for a steam cycle; atmospheric gasification to produce fuel gas for a dual fuel diesel engine; pressurised gasification to produce fuel gas for a gas turbine combined cycle; and fast pyrolysis to produce pyrolysis liquid for a dual fuel diesel engine. The feedstock in all cases is wood in chipped form. This is the first time that all three thermochemical conversion technologies have been compared in a single, consistent evaluation. The systems have been modelled from the transportation of the wood chips through pretreatment, thermochemical conversion and electricity generation. Equipment requirements during pretreatment are comprehensively modelled and include reception, storage, drying and comminution. The de-coupling of the fast pyrolysis system is examined, where the fast pyrolysis and engine stages are carried out at separate locations. Relationships are also included to allow learning effects to be studied. The modelling is achieved through the use of multiple spreadsheets, where each spreadsheet models part of the system in isolation and the spreadsheets are combined to give the cost and performance of a whole system. The use of the models has shown that on current costs the combustion system remains the most cost-effective generating route, despite its low efficiency. The novel systems only produce lower cost electricity if learning effects are included, implying that some sort of subsidy will be required during the early development of the gasification and fast pyrolysis systems to make them competitive with the established combustion approach.
The use of decoupling in fast pyrolysis systems is a useful way of reducing system costs if electricity is required at several sites, because a single pyrolysis site can be used to supply all the generators, offering economies of scale at the conversion step. Overall, costs are much higher than conventional electricity generating costs for fossil fuels, due mainly to the small scales used. Biomass to electricity opportunities remain restricted to niche markets where electricity prices are high or feed costs are very low. It is highly recommended that further work examine possibilities for combined heat and power, which is suitable for small scale systems and could increase revenues and thereby reduce electricity prices.
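The thesis's whole-system modelling approach, where isolated sub-models (one per spreadsheet) are combined into an overall electricity cost, can be sketched in outline. All function names and figures below are hypothetical placeholders, not values from the thesis:

```python
# Hypothetical sketch: each function stands in for one "spreadsheet" sub-model,
# and electricity_cost() combines them into a whole-system figure.

def pretreatment_cost(feed_t_per_year, cost_per_t=12.0):
    """Reception, storage, drying and comminution: annual cost (currency units)."""
    return feed_t_per_year * cost_per_t

def net_generation_mwh(feed_t_per_year, efficiency=0.30, mwh_per_t=5.0):
    """Thermochemical conversion and generation: net electricity out (MWh/year)."""
    return feed_t_per_year * mwh_per_t * efficiency

def electricity_cost(feed_t_per_year, feed_cost_per_t=40.0, capital_annuity=500_000.0):
    """Cost per MWh for the whole chain, combining the sub-models above."""
    total = (feed_t_per_year * feed_cost_per_t
             + pretreatment_cost(feed_t_per_year)
             + capital_annuity)
    return total / net_generation_mwh(feed_t_per_year)

print(round(electricity_cost(20_000), 2))  # cost per MWh at 20,000 t/year
```

Because each sub-model sits behind its own function boundary, a learning-curve factor or a de-coupled pyrolysis stage can be swapped in without disturbing the rest of the chain, mirroring the spreadsheet structure the abstract describes.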
Abstract:
Despite the considerable potential of advanced manufacturing technologies (AMT) for improving the economic performance of many firms, a growing body of literature highlights many instances where realising this potential has proven to be a more difficult task than initially envisaged. Focussing upon the implementation of new manufacturing technologies in several small to medium-sized enterprises (SMEs), the research examines the proposition that many of these problems can be attributed in part to inadequate consideration of the integrated nature of such technologies, where the effects of their implementation are not localised, but are felt throughout a business. The criteria for the economic evaluation of such technologies are seen as needing to reflect this, and the research develops an innovative methodology employing micro-computer based spreadsheets to demonstrate how a series of financial models can be used to quantify the effects of new investments upon overall company performance. Case studies include: the development of a prototype machine-based absorption costing system to assist in the evaluation of CNC machine tool purchases in a press making company; the economics and strategy of introducing a flexible manufacturing system for the production of ballscrews; and analysing the progressive introduction of computer based printing presses in a packaging and general print company. Complementary insights are also provided from discussions with the management of several other companies which have experienced technological change. The research was conducted as a collaborative CASE project in the Interdisciplinary Higher Degrees Scheme and was jointly funded by the SERC and Gaydon Technology Limited, and later assisted by PE-Inbucon. The findings of the research show that the introduction of new manufacturing technologies usually requires a fundamental rethink of the existing practices of a business.
In particular, its implementation is seen as ideally needing to take place as part of a longer term business and manufacturing strategy, but that short term commercial pressures and limited resources often mean that firms experience difficulty in realising this. The use of a spreadsheet based methodology is shown to be of considerable assistance in evaluating new investments, and is seen as being the limit of sophistication that a smaller business is willing to employ. Several points for effective modelling practice are also given, together with an outline of the context in which a modelling approach is most applicable.
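The core of such a spreadsheet-based financial model is discounted cash flow appraisal of an investment's incremental effects. A minimal sketch, with hypothetical figures rather than any of the case-study data:

```python
# Hypothetical sketch of a spreadsheet-style appraisal: discount the
# incremental cash flows a new machine generates and report the NPV.

def npv(rate, cashflows):
    """Net present value; cashflows[0] falls at time 0 (the purchase)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: machine purchase; years 1-5: incremental net savings (invented).
flows = [-100_000, 30_000, 30_000, 30_000, 30_000, 30_000]
print(round(npv(0.10, flows), 2))  # a positive NPV supports the investment
```

In a spreadsheet the same calculation spans a row of cells; expressing it as a function makes the company-wide sensitivity analyses the research describes (varying rate, savings, or lifetime) a matter of changing arguments.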
Abstract:
This thesis records the findings of a retrospective study of decompression illness (DCI) in the UK compressed air tunnelling industry since the mid-1980s. The thesis describes how the study arose, its scope and objectives, along with an overview of tunnelling and shaft-sinking. The development of compressed air working techniques is reviewed, along with a description of decompression practice and DCI, and an outline of relevant legislation and guidance. The acquisition and manipulation of data to form a number of databases and spreadsheets on which the analysis was performed is discussed. That analysis examined measures of DCI incidence and quantified that incidence using these measures. Also considered is the variation in tolerance and susceptibility to DCI in the workforce, and the phenomenon of acclimatisation. An examination of the extent to which men worked on multiple contracts and the variation in their susceptibility to DCI on these contracts is included. Options are then considered for reducing the incidence of DCI. The first was to retain air-only decompression through the application of restrictions on exposure. The second related to the use of oxygen decompression. Finally, the adequacy of the existing Regulations and Guidance is considered and recommendations made for possible changes to them, arising from the study. The main conclusions are that a number of measures of DCI incidence were identified, some more appropriate than others, and that the incidence of DCI when so measured was high, disproportionately so in shift workers. No reasonably practicable restrictions on exposure were identified which would have allowed the retention of air-only decompression. Oxygen decompression looked promising but had yet to be used sufficiently extensively to generate enough data for analysis. Recommendations included one that an alternative technique for monitoring the effectiveness of decompression should be developed.
The thesis ends with recommendations for further research.
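The abstract does not specify which measures of DCI incidence the study compared; purely as a hedged illustration, two common ways of quantifying incidence from exposure records might look like this (all data invented):

```python
# All data invented: (worker_id, exposures, dci_cases) per worker.
records = [("A", 120, 3), ("B", 80, 0), ("C", 200, 9)]

exposures = sum(e for _, e, _ in records)
cases = sum(c for _, _, c in records)

incidence_per_exposure = cases / exposures                 # cases per man-exposure
workers_affected = sum(1 for _, _, c in records if c > 0)
incidence_per_worker = workers_affected / len(records)     # fraction of workforce affected

print(round(incidence_per_exposure, 3), round(incidence_per_worker, 3))
```

The two measures can rank the same contracts differently, which is one reason a study of this kind would find some measures more appropriate than others.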
Abstract:
This thesis describes work done exploring the application of expert system techniques to the domain of designing durable concrete. The nature of concrete durability design is described and some problems from the domain are discussed. Some related work on expert systems in concrete durability is described. Various implementation languages, PROLOG and OPS5, are considered and rejected in favour of a shell, CRYSTAL3 (later CRYSTAL4). Criteria for useful expert system shells in the domain are discussed. CRYSTAL4 is evaluated in the light of these criteria. Modules in various sub-domains (mix design, sulphate attack, steel corrosion and alkali-aggregate reaction) are developed and organised under a BLACKBOARD system (called DEX). Extensions to the CRYSTAL4 modules are considered for different knowledge representations. These include LOTUS123 spreadsheets implementing models that incorporate some of the mathematical knowledge in the domain. Design databases are used to represent tabular design knowledge. Hypertext representations of the original building standards texts are proposed as a tool for providing a well structured and extensive justification/help facility. A standardised approach to module development is proposed, using hypertext development as a structured basis for expert systems development. Some areas of deficient domain knowledge are highlighted, particularly in the use of data from mathematical models and in gaps and inconsistencies in the original knowledge source Digests.
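The BLACKBOARD organisation described above, in which sub-domain modules read from and post conclusions to a shared data store, can be sketched minimally. The module names echo the sub-domains listed, but their rule logic is invented for illustration and is not taken from DEX:

```python
# Shared blackboard: facts posted by any module are visible to all others.
blackboard = {"cement_content": 350, "exposure": "sulphate"}

def mix_design(bb):
    # Invented rule: flag the mix as acceptable above a cement threshold.
    if bb["cement_content"] >= 330:
        bb["mix_ok"] = True

def sulphate_attack(bb):
    # Invented rule: builds on a conclusion another module posted.
    if bb.get("exposure") == "sulphate" and bb.get("mix_ok"):
        bb["recommendation"] = "use sulphate-resisting cement"

# A simple controller fires each knowledge module in turn over the blackboard.
for module in (mix_design, sulphate_attack):
    module(blackboard)

print(blackboard["recommendation"])
```

The key property is that modules never call each other directly; they communicate only through the blackboard, which is what lets new sub-domain modules be added independently.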
Abstract:
The 2008 edition of the WKCI compares 145 regions across 19 knowledge economy benchmarks (full data for all indicators across each of the 19 benchmarks are contained in the accompanying Excel spreadsheets). This represents an increase of twenty regions compared to the last edition in 2005: nine from Europe, eight from North America, and three from Asia Pacific. These new regions were selected on the basis of a survey of a wide range of regions appearing to become more internationally competitive. This year's report also contains a special chapter on economic development in the three leading Chinese regions.
Abstract:
This paper addresses the dearth of research into material artifacts and how they are engaged in strategizing activities. Building on the strategy-as-practice perspective and the notion of epistemic objects, we develop a typology of strategy practices that shows how managers use material artifacts to strategize through a dual process of knowledge abstraction and substitution. Empirically, we study the practice of underwriting managers in reinsurance companies. Our findings first identify the artifacts – pictures, maps, data packs, spreadsheets and graphs – that these managers use to appraise reinsurance deals. Second, the analysis of each artifact's situated use led to the identification of five practices for doing strategy with artifacts: physicalizing, locating, enumerating, analyzing, and selecting. Last, we developed a typology that shows how practices vary in terms of their level of abstraction from the physical properties of the risk being reinsured and unfold through a process of substitution. Our conceptual framework extends existing work in the strategy-as-practice field that calls for research into the role of material artifacts.
Abstract:
The use of spreadsheets has become routine in all aspects of business with usage growing across a range of functional areas and a continuing trend towards end user spreadsheet development. However, several studies have raised concerns about the accuracy of spreadsheet models in general, and of end user developed applications in particular, raising the risk element for users. High error rates have been discovered, even though the users/developers were confident that their spreadsheets were correct. The lack of an easy to use, context-sensitive validation methodology has been highlighted as a significant contributor to the problems of accuracy. This paper describes experiences in using a practical, contingency factor-based methodology for validation of spreadsheet-based DSS. Because the end user is often both the system developer and a stakeholder, the contingency factor-based validation methodology may need to be used in more than one way. The methodology can also be extended to encompass other DSS.
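The paper's contingency factor-based methodology is not reproduced in this abstract, but one concrete example of the kind of automated accuracy check spreadsheet validation applies is cross-footing: confirming that row totals and column totals of a table agree. A sketch with hypothetical data:

```python
# Hypothetical spreadsheet region as a list of rows.
table = [
    [10, 20, 30],
    [ 5, 15, 25],
]

# Cross-footing: the grand total via rows must equal the grand total via columns.
row_total = sum(sum(row) for row in table)
col_total = sum(sum(col) for col in zip(*table))

print(row_total == col_total)  # a mismatch would flag a formula or range error
```

Checks like this catch the classic end-user error of a SUM range that silently omits a newly inserted row, one of the ways high error rates arise despite developer confidence.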
Abstract:
Report published in the Proceedings of the National Conference on "Education in the Information Society", Plovdiv, May, 2013
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May, 2014
Abstract:
Spreadsheets detailing plans to develop the Medical Library's journal collection.
Abstract:
Coordination of business processes is the management of dependencies, where dependencies constrain how tasks are performed. It has traditionally been done in an intuitive fashion, without paying much attention to the coordination load. Coordination load is defined as the ratio between the time spent on coordination activities and the total task time. Previous efforts to understand and analyze coordination have resulted in mostly qualitative approaches to categorizing and recommending coordination strategies. This research seeks to answer two questions: (1) How can we analyze process coordination problems to improve overall performance? (2) What guidance can we provide to reduce the coordination load of the process and consequently improve the organization's performance? Thus, this effort developed a quantitative measure for the coordination load of business processes and a methodology to apply such a measure. This effort used a management simulation game to provide a controlled laboratory environment, enabling the manipulation of the task factors variability, analyzability, and interdependence to measure their impact on coordination load. The hypothesis was that the more variable, non-analyzable, and interdependent a process, the higher the coordination load, and that a higher coordination load would have a negative impact on performance. Coordination load was measured via the surrogate coordination time, and performance via profit. A 2² × 3¹ full factorial design, with two replicates, was run to observe the impact on the variables coordination time and profit. Properly validated spreadsheets and questionnaires were used as data collection instruments for each scenario. The experimental results indicate that lower task analyzability (p = 0.036) and higher task interdependence (p = 0.000) lead to higher coordination load, and higher levels of task variability (p = 0.049) lead to lower performance.
However, contrary to the hypotheses postulated by this work, coordination load did not prove to be a strong predictor of performance (correlation of -0.086). These findings from the laboratory experiment, together with other lessons learned, were incorporated into the development of a quantitative measure, a survey tool to gather data for the variables in the measure, and a methodology to quantify the coordination load of production business processes. The practicality of the methodology is demonstrated with an example.
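The coordination-load measure defined above, and the 2² × 3¹ experimental layout, can be sketched directly. The numbers and the assignment of levels to factors are illustrative, not the study's data:

```python
from itertools import product

def coordination_load(coordination_time, total_task_time):
    """Ratio of time spent on coordination activities to total task time."""
    return coordination_time / total_task_time

# Illustrative process: 90 minutes of coordination within a 360-minute task.
print(coordination_load(90, 360))  # 0.25

# A 2^2 x 3^1 layout: two 2-level factors crossed with one 3-level factor.
# Which factor carried three levels is an assumption made for this sketch.
cells = list(product(["low", "high"],           # variability
                     ["low", "high"],           # interdependence
                     ["low", "mid", "high"]))   # analyzability
print(len(cells))  # 12 treatment combinations (x2 replicates = 24 runs)
```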
Abstract:
The city of Natal has significant daylight availability, although its use is not systematically explored in school architecture. In this context, this research aims to establish procedures for analysing daylight performance in school design in Natal-RN. The method of analysis comprises the Visible Sky Factor (VSF), simulation, and analysis of the results. The annual variation of daylight behaviour requires the adoption of dynamic simulation as the data procedure. The classrooms were modelled in SketchUp and simulated in the Daysim program, and the results were assessed by means of Microsoft Excel spreadsheets. The classroom dimensions are 7.20 m × 7.20 m, with window-to-wall ratios (WWR) of 20%, 40% and 50%, and with different shading devices: standard horizontal overhang, sloped overhang, standard horizontal overhang with side view protection, standard horizontal overhang with a dropped edge, standard horizontal overhang with three horizontal louvers, double standard horizontal overhang, and double standard horizontal overhang with three horizontal louvers, plus the use of a light shelf in half the models with WWR of 40% and 50%. The data were organized in spreadsheets with two UDI intervals: between 300 lux and 2000 lux, and between 300 lux and 3000 lux. The simulation was performed with the 2009 weather file for the city of Natal-RN. The graphical outputs are illuminance curves, isolines of UDI between 300 lux and 2000 lux, and tables with the incidence of glare and of UDI between 300 lux and 3000 lux. The best UDI300-2000lux performance was found in Phase 1 for models with WWR of 20%, and in Phase 2 for models with WWR of 40% and 50% with light shelf. The best UDI300-3000lux performance was found in Phase 1 for models with WWR of 20% and 40% with light shelf, and in Phase 2 for models with WWR of 40% and 50% with light shelf.
The outputs show that daylight quality depends mainly on the efficacy of the shading system in avoiding glare, which determines daylight discomfort. The bioclimatic recommendation of large openings with partial shading (an opening with direct sunlight) resulted in illuminance levels higher than the acceptable upper threshold. Increasing the shaded percentage (from 73% to 91%) in medium-sized openings (WWR 40% and 50%) reduced or eliminated glare without compromising the daylight zone depth (7.20 m). The passive zone was determined for classrooms with satisfactory daylight performance, and a daylight-zone-depth rule of thumb was calculated as the ratio between daylight zone depth and window height for the different opening sizes. The ratio ranged from 1.54 to 2.57 for WWR of 20%, 40% and 50%, respectively. Glare in the passive area was reduced or eliminated with a light shelf or with awning window shading.
Abstract:
The heavy fraction of crude oil can be used for numerous purposes, e.g. to obtain lubricating oils. In this context, many researchers have been studying alternatives for separating crude oil components, among which molecular distillation may be mentioned. Molecular distillation is a forced evaporation technique distinct from other conventional processes in the literature. It can be classified as a special case of distillation under high vacuum, with pressures reaching extremely low values of the order of 0.1 Pascal. The evaporation and condensation surfaces must be separated by a distance of the order of magnitude of the mean free path of the evaporated molecules; that is, evaporated molecules easily reach the condenser because they find a route without obstacles, which is desirable. Thus, the main contribution of this work is the simulation of falling-film molecular distillation for crude oil mixtures. The crude oil was characterized using UniSim® Design R430 and Aspen HYSYS® V8.5. The results of this characterization were used in Microsoft® Excel® spreadsheets to calculate the physicochemical (thermodynamic and transport) properties of the residue of an oil sample. Based on these estimated properties and on boundary conditions suggested by the literature, the equations for the temperature and concentration profiles were solved by the implicit finite difference method, using the Visual Basic® for Applications (VBA) programming language in Excel®. The resulting temperature profile was consistent with that reported in the literature, with a slight deviation in its initial values because the oil studied is lighter than that of the literature. The concentration profiles show that the concentration of the more volatile components decreases and that of the less volatile components increases along the evaporator length.
Consistent with the transport phenomena present in the process, the velocity profile rises to a peak and then decreases, and the film thickness decreases, both as functions of the evaporator length. It is concluded that the simulation code in Visual Basic® (VBA) is a final product of this work that can be applied to the molecular distillation of petroleum and other similar mixtures.
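The implicit finite difference method mentioned above can be illustrated with a minimal example: one backward-Euler step of 1D diffusion, solved with the Thomas (tridiagonal) algorithm. The grid, diffusion number and boundary temperatures are illustrative and are not the thesis's falling-film equations:

```python
# Minimal implicit finite-difference step for 1D diffusion (illustrative only).

def thomas(a, b, c, d):
    """Solve a tridiagonal system; a = sub-, b = main, c = super-diagonal."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

lam = 1.0                      # alpha * dt / dx**2 (diffusion number)
t_left, t_right = 1.0, 0.0     # fixed boundary temperatures
n = 3                          # interior grid points, all starting at 0
a = [0.0] + [-lam] * (n - 1)   # sub-diagonal
b = [1 + 2 * lam] * n          # main diagonal
c = [-lam] * (n - 1) + [0.0]   # super-diagonal
d = [0.0] * n                  # old temperatures form the right-hand side
d[0] += lam * t_left           # fold the boundary values into the RHS
d[-1] += lam * t_right
t_new = thomas(a, b, c, d)
print([round(t, 4) for t in t_new])  # → [0.381, 0.1429, 0.0476]
```

Being implicit, the step is stable for any `lam`, which is why the method suits stiff profile equations; the tridiagonal solve costs only O(n) per step.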