914 results for Using Lean tools
Abstract:
This work is an exploratory study of the Internet as a communicational environment, investigating both the possible influence of its interaction/communication tools on sexual and risk behavior and the development of compulsive behavior in the use of these tools to search for sexual partners. The methodology adopted, in addition to bibliographic research, is that of exploratory research, a survey and analysis of quantitative data, and can be considered as belonging to the traditional empirical paradigm, since data collection was based on responses to semi-structured questionnaires administered to a group of 428 university students in Computing and Informatics courses at a private higher education institution in the city of São Paulo, SP, Brazil. The study complies with Resolution 196/96 of the Conselho Nacional de Saúde (CNS) and used free and informed consent forms (TCLE). The results indicate that the sexual practices, the exposure to STDs and the HIV virus and, particularly, the tendency to develop Internet Addiction Disorder are irrefutably distinct. Participants who reported seeking real sexual partners on the Internet are direct in their objectives: when they find such a partner they consummate the sexual act, in impersonal settings such as motels, and often in a risky manner with regard to prevention and safety in contact with the other person. It is also noteworthy that the compulsion is not recognized by the group and that, for this group, the search for partners through digital media is not related to negative aspects of their quality of life, which calls for further study and deeper discussion of the interaction among communication, sex and the Internet.
Abstract:
The work described was carried out as part of a collaborative Alvey software engineering project (project number SE057). The project collaborators were the Inter-Disciplinary Higher Degrees Scheme of the University of Aston in Birmingham, BIS Applied Systems Ltd. (BIS) and the British Steel Corporation. The aim of the project was to investigate the potential application of knowledge-based systems (KBSs) to the design of commercial data processing (DP) systems. The work was primarily concerned with BIS's Structured Systems Design (SSD) methodology for DP systems development and how users of this methodology could be supported using KBS tools. The problems encountered by users of SSD are discussed and potential forms of computer-based support for inexpert designers are identified. An architecture for a support environment for SSD, the Intellipse system, is proposed based on the integration of KBS and non-KBS tools for individual design tasks within SSD. The Intellipse system has two modes of operation, Advisor and Designer. The design, implementation and user evaluation of Advisor are discussed. The results of a Designer feasibility study, the aim of which was to analyse major design tasks in SSD to assess their suitability for KBS support, are reported. The potential role of KBS tools in the domain of database design is discussed. The project involved extensive knowledge engineering sessions with expert DP systems designers, and some practical lessons in relation to KBS development are derived from this experience. The nature of the expertise possessed by expert designers is discussed. The need for operational KBSs to be built to the same standards as other commercial and industrial software is identified. A comparison between current KBS and conventional DP systems development is made, and on the basis of this analysis a structured development method for KBSs, the POLITE model, is proposed. Some initial results of applying this method to KBS development are discussed. Several areas for further research and development are identified.
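As illustrative background only (the abstract does not describe Intellipse's internals): the kind of rule-based support an Advisor mode can give an inexpert designer amounts to matching design rules against observed facts about the project. Everything in the sketch below (rule conditions, advice text, fact names) is invented.

```python
# Minimal sketch of rule-based design advice: each rule fires when all
# of its condition facts are present. Rules and facts are illustrative
# inventions, not taken from the Intellipse system itself.

RULES = [
    # (set of required facts, advice produced)
    ({"many_entities", "no_data_model"},
     "Produce a logical data model before detailed file design."),
    ({"online_transactions", "high_volume"},
     "Consider indexed access paths for the main transaction files."),
]

def advise(facts):
    """Return every piece of advice whose conditions are all satisfied."""
    return [advice for conditions, advice in RULES if conditions <= facts]

if __name__ == "__main__":
    observed = {"many_entities", "no_data_model", "online_transactions"}
    for line in advise(observed):
        print("-", line)
```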
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising the companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not the method of developing an estimating model or tool, but the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
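For concreteness, two of the 'classic' metrics named above can be restated as formulas: McCabe's cyclomatic complexity is M = D + 1 for a single-entry, single-exit routine with D binary decision points, and Halstead's volume is V = (N1 + N2) * log2(n1 + n2). The sketch below evaluates both on invented token counts; the thesis's re-defined counts for Prolog are not reproduced here.

```python
import math

# Illustrative recomputation of two "classic" metrics on invented counts.

def cyclomatic_complexity(decision_points: int) -> int:
    # McCabe: M = D + 1 for a single-entry, single-exit routine,
    # where D is the number of binary decision points.
    return decision_points + 1

def halstead_volume(n1, n2, N1, N2):
    # n1/n2: distinct operators/operands; N1/N2: total occurrences.
    vocabulary = n1 + n2
    length = N1 + N2
    return length * math.log2(vocabulary)

print(cyclomatic_complexity(decision_points=4))   # -> 5
print(round(halstead_volume(10, 7, 42, 31), 1))   # volume in bits
```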
Abstract:
Completing projects faster than the normal duration is always a challenge to the management of any project, as it often demands many paradigm shifts. Opportunities of globalization and competition from private sectors and multinationals force the management of public sector organizations in the Indian petroleum sector to adopt aggressive strategies to maintain their profitability, and constructing infrastructure for handling petroleum products is one of them. Moreover, these projects are required to be completed faster than normal schedules to remain competitive, to get a faster return on investment, and to give a longer project life. However, using conventional tools and techniques of project management, it is impossible to handle the problem of reducing the project duration below the normal period. This study proposes the use of concurrent engineering in managing projects to radically reduce project duration: the phases of the project are accomplished concurrently instead of in series. The complexities that arise in managing such projects are tackled by restructuring the project organization, improving management commitment, strengthening project-planning activities, ensuring project quality, managing project risk objectively and integrating project activities through management information systems. These measures not only ensure completion of projects on a fast track, but also improve project effectiveness in terms of quality, cost effectiveness, team building, etc., and in turn improve the overall productivity of the project organization.
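The scheduling arithmetic behind this fast-track approach is easy to illustrate. In the sketch below, the phase durations and the overlap fraction are invented numbers; each phase is allowed to start when a fixed fraction of its predecessor is complete, which compresses the schedule relative to a purely sequential plan.

```python
# Toy illustration of schedule compression through overlapping phases.
# Durations (months) and the overlap fraction are invented numbers.

phases = [("design", 6.0), ("procurement", 8.0), ("construction", 12.0)]
overlap = 0.4  # each phase starts when 60% of its predecessor is complete

sequential = sum(duration for _, duration in phases)

start, finish, prev = 0.0, 0.0, 0.0
for _, duration in phases:
    start += (1 - overlap) * prev   # fast-track start against predecessor
    finish = start + duration
    prev = duration

print(f"sequential: {sequential:.1f} months, fast-track: {finish:.1f} months")
# -> sequential: 26.0 months, fast-track: 20.4 months
```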
Abstract:
The basic concepts for constructing many-valued intelligent systems, which are adequate to the primary problems of human activity and use hybrid tools with many-valued coding, are considered. Many-valued intelligent systems are created that are two-valued in implementation but simulate neuron processes of spatial summation, which differ in their level of activity, in the inertial and threshold properties of neuron membranes, and in the modulation of the repetition frequency of transmitted messages. All of the enumerated properties and functions are in fact essentially not only discrete in time but also many-valued.
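As generic background rather than the authors' construction: many-valued coding can be illustrated with Łukasiewicz-style operations on truth values drawn from [0, 1], of which classical two-valued logic is the special case restricted to {0, 1}.

```python
# Minimal many-valued logic sketch (generic illustration only):
# truth values live in [0, 1] instead of {0, 1}.

def mv_not(a):     return 1.0 - a
def mv_and(a, b):  return min(a, b)
def mv_or(a, b):   return max(a, b)

# Three-valued example: 0 = false, 0.5 = unknown, 1 = true.
print(mv_and(1.0, 0.5))   # 0.5
print(mv_or(0.0, 0.5))    # 0.5
print(mv_not(0.5))        # 0.5
```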
Abstract:
In the digital age, the internet and ICT devices have changed our daily life and routines; we can hardly live without these services and devices anywhere (at work, at home, on holiday, etc.). This can be seen in the tourism sector: digital content has become a key tool in 21st-century tourism, able to adapt the traditional tourist-guide methodology to applications running on novel digital devices. Tourists belong to a new generation, an "ICT generation" that uses innovative tools and new info-media to communicate. A possible direction for tourism development is therefore to use modern ICT systems and devices. Besides participating in classical tours guided by travel guides, individual tourists now have the opportunity to enjoy high-quality ICT-based guided walks built on the knowledge of travel guides. The main idea of the GUIDE@HAND service is to reuse existing tourism content and to create new content for advanced mobile devices, in order to give a contemporary answer to traditional systems of tourism information by developing new tourism services based on digital content for innovative mobile applications. The service is based on a new concept of enhancing territorial heritage and values through knowledge, innovation, languages and multilingual solutions that go along with new tourists' "sensitiveness".
Abstract:
We have witnessed rising interest, in both the academic and the managerial field, in the marketing application of Web 2.0 techniques, yet the effective impact and change it brings remain unclarified. The importance of Web 2.0 is constantly on the rise: consumers are joining social networks and using social tools in ever greater numbers, which gives companies a new and effective tool for marketing communication and other marketing-related activities. In our research, we first aim to clarify the definition and boundaries of Web 2.0. Then, through a literature review, we collect some of the most important areas of marketing affected by this seemingly technological change. We also give a brief overview of the challenges and risks firms face in this new environment.
Abstract:
Software architecture is the abstract design of a software system. It plays a key role as a bridge between requirements and implementation, and is a blueprint for development. The architecture represents a set of early design decisions that are crucial to a system; mistakes in those decisions are very costly if they remain undetected until the system is implemented and deployed. This is where formal specification and analysis fit in. Formal specification ensures that an architecture design is represented in a rigorous and unambiguous way, and a formally specified model allows the use of different analysis techniques for verifying the correctness of those crucial design decisions. This dissertation presents a framework, called SAM, for the formal specification and analysis of software architectures. In terms of specification, formalisms and mechanisms were identified and chosen to specify software architecture based on different analysis needs, and formalisms for specifying properties were also explored, especially for non-functional properties. In terms of analysis, the dissertation explores both the verification of functional properties and the evaluation of non-functional properties of a software architecture. For the verification of functional properties, methodologies are presented for applying existing model checking techniques to a SAM model. For the evaluation of non-functional properties, the dissertation first shows how to incorporate stochastic information into a SAM model, and then explains how to translate the model to existing tools and conduct the analysis using those tools. To alleviate the analysis work, a tool is also provided to automatically translate a SAM model for model checking. All the techniques and methods described in the dissertation are illustrated by examples or case studies, which also serve the purpose of advocating the use of formal methods in practice.
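As an illustration of the analysis side only: model checking verifies a safety property by exhaustively exploring the reachable state space of a formal model. The toy transition system below is invented and is not the SAM notation; the breadth-first search either confirms the property or reports a reachable violating state as a counterexample (which this particular toy model contains).

```python
# Explicit-state reachability check of a safety property on a toy model
# of a two-component connector. States are (sender, receiver) pairs.

from collections import deque

def successors(state):
    sender, receiver = state
    if sender == "idle":
        yield ("sending", receiver)
    if sender == "sending" and receiver == "waiting":
        yield ("idle", "busy")
    if receiver == "busy":
        yield (sender, "waiting")

def violates(state):
    # Safety property: sender and receiver never both drive the channel.
    return state == ("sending", "busy")

def check(initial):
    """Breadth-first search of the reachable state space."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if violates(state):
            return False, state          # property fails: counterexample
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None                    # property holds everywhere

print(check(("idle", "waiting")))        # -> (False, ('sending', 'busy'))
```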
Abstract:
The IRS is using various tools to attack the status of so-called "outside consultants" being used by hospitality firms. This article provides some planning tips so that the hospitality firm can minimize its chances of having workers reclassified as employees.
Abstract:
Today, modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools, and hardware acceleration can bring significant performance improvement for highly mathematical calculations or repeated functions. The performance of an SoC system can then be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop, loop rounds, etc. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform, and two types of hardware acceleration methods, central bus design and co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
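Contribution (1), profiling to locate the hotspot, can be sketched as follows. The workload here is a stand-in FIR filter rather than the H.264 CODEC core profiled in the study; the point is that ranking functions by cumulative time exposes the hotspot that would become the candidate for an FPGA accelerator.

```python
# Sketch of the profiling step used to locate hotspot functions before
# hardware acceleration. The workload is an invented stand-in.

import cProfile
import pstats

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fir_filter(signal, taps):
    # Deliberately repeated inner-loop math: a typical acceleration target.
    return [dot(signal[i:i + len(taps)], taps)
            for i in range(len(signal) - len(taps) + 1)]

profiler = cProfile.Profile()
profiler.enable()
fir_filter(list(range(20000)), [0.25, 0.5, 0.25])
profiler.disable()

# Rank functions by cumulative time; the top entries are the hotspots
# that would be candidates for an FPGA accelerator.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```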
Abstract:
Computing devices have become ubiquitous in our technologically advanced world, serving as vehicles for software applications that provide users with a wide array of functions. Among these applications is electronic learning software, which is increasingly being used to educate and evaluate individuals ranging from grade-school students to career professionals. This study will evaluate the design and implementation of user interfaces in such software; specifically, it will explore how these interfaces can be developed to facilitate the use of electronic learning software by children. To do this, research will be performed in the area of human-computer interaction, focusing on cognitive psychology, user interface design, and software development. This information will be analyzed in order to design a user interface that provides an optimal user experience for children, and a group of children will test this interface, as well as existing applications, in order to measure its usability. The objective of this study is to design a user interface that makes electronic learning software more usable for children, facilitating their learning process and increasing their academic performance. The study will be conducted using the Adobe Creative Suite to design the user interface and an Integrated Development Environment to implement functionality; these are digital tools available on computing devices such as desktop computers, laptops, and smartphones. By using these tools, I hope to create a user interface for electronic learning software that promotes usability while maintaining functionality. This study will address the increasing complexity of computing software seen today, an issue that has arisen due to the progressive implementation of new functionality and that is having a detrimental effect on the usability of electronic learning software, increasing the learning curve for targeted users such as children. As we make electronic learning software an integral part of educational programs in our schools, it is important to address this issue in order to guarantee children a successful learning experience.
Abstract:
The detection and diagnosis of faults, i.e., finding out how, where and why failures occur, has been an important area of study since machines began to replace human labor. However, no technique studied to date solves the problem definitively. Differences among dynamic systems, whether linear or nonlinear, time-variant or time-invariant, with physical or analytical redundancy, hamper research aimed at a unique solution. In this work, a technique for fault detection and diagnosis (FDD) in dynamic systems is presented that uses state observers in conjunction with other tools to create a hybrid FDD scheme. A modified state observer is used to generate a residual that allows both the detection and the diagnosis of faults. A bank of fault signatures is created using statistical tools, and finally an approach based on the mean squared error (MSE) assists in studying the behavior of fault diagnosis even in the presence of noise. This methodology is then applied to an educational coupled-tank plant and to a plant with industrial instrumentation in order to validate the system.
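The core of the scheme, residual generation with a state observer, can be sketched in a few lines. The plant matrices, observer gain, and injected sensor fault below are invented for illustration; the residual stays near zero while the model matches the plant and jumps when the fault appears.

```python
# Minimal sketch of residual generation with a Luenberger-type observer,
# the basic FDD mechanism described above. All numbers are invented.

import numpy as np

A = np.array([[0.9, 0.1], [0.0, 0.8]])   # discrete-time plant dynamics
C = np.array([[1.0, 0.0]])               # only the first state is measured
L = np.array([[0.5], [0.2]])             # observer gain (stabilizing here)

x = np.array([[1.0], [0.5]])             # true plant state
xh = np.zeros((2, 1))                    # observer's state estimate

for k in range(60):
    fault = 0.3 if k >= 30 else 0.0      # additive sensor fault from k = 30
    y = C @ x + fault                    # measured output
    residual = y - C @ xh                # near zero in the fault-free case
    xh = A @ xh + L @ residual           # observer update
    x = A @ x                            # plant evolves autonomously
    if k in (29, 30, 45):
        print(k, round(residual[0, 0], 4))   # jump at k = 30 flags the fault
```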
Abstract:
Coding is a fundamental aspect of cerebral functioning. The transformation of sensory stimuli into neurophysiological responses has been a research theme in several areas of Neuroscience. One of the most common ways to measure the efficiency of a neural code is through Information Theory measures, such as mutual information. Using these tools, recent studies show that in the auditory cortex both local field potentials (LFPs) and action-potential spiking times code information about sound stimuli. However, there are no studies applying Information Theory tools to investigate the efficiency of codes that use postsynaptic potentials (PSPs), alone or in association with LFP analysis. These signals are related in the sense that LFPs are partly created by the joint action of several PSPs. The present dissertation reports information measures between PSP and LFP responses obtained in the primary auditory cortex of anaesthetized rats under auditory stimuli of distinct frequencies. Our results show that PSP responses hold information about sound stimuli at levels comparable to, and even greater than, LFP responses. We have also found that PSPs and LFPs code sound information independently, since the joint analysis of these signals showed neither synergy nor redundancy.
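For illustration, a plug-in estimate of mutual information between a discretized neural response and a stimulus label can look like the sketch below. The data are synthetic, and the actual estimators, binning, and bias corrections used in the dissertation are not specified in the abstract.

```python
# Plug-in mutual-information estimate (in bits) from paired discrete
# samples, applied to synthetic stimulus/response data.

import numpy as np

def mutual_information(x, y):
    """MI estimate from empirical joint and marginal frequencies."""
    mi = 0.0
    n = len(x)
    for a in np.unique(x):
        for b in np.unique(y):
            p_xy = np.mean((x == a) & (y == b))
            p_x, p_y = np.mean(x == a), np.mean(y == b)
            if p_xy > 0:
                mi += p_xy * np.log2(p_xy / (p_x * p_y))
    return mi

rng = np.random.default_rng(0)
stimulus = rng.integers(0, 2, 1000)          # two stimulus classes
response = np.where(rng.random(1000) < 0.8,  # responses track the stimulus
                    stimulus, 1 - stimulus)  # ...80% of the time
print(round(mutual_information(response, stimulus), 3))  # about 0.28 bits
```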