922 results for Location-aware process modeling


Relevance:

100.00%

Publisher:

Abstract:

Due to the increasing acceptance of BPM, BPM tools are now extensively used in organizations. Core to BPM are process modeling languages, of which BPMN is the one receiving the most attention these days. Once a business process is described in BPMN, a process simulation approach can be used to find its optimized form. In this context, the simulation of business processes, such as those defined in BPMN, appears as an obvious way of improving processes. This paper analyzes the business process modeling and simulation areas, identifying the elements that must be present in the BPMN language for processes described in BPMN to be simulated. During this analysis, a set of existing BPM tools that support BPMN are compared regarding their limitations in terms of simulation support.
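The simulation elements this kind of analysis looks for in BPMN (arrival times, task durations, resource capacity) can be illustrated with a minimal single-resource queue. The sketch below is generic Python, not taken from any of the surveyed tools, and the process data is invented.

```python
# Minimal discrete-event sketch of one simulation element BPMN alone does
# not capture: a task backed by a single resource, so tokens must queue.
# Arrival times and the task duration below are invented for illustration.

def simulate_waits(arrivals, task_duration):
    """Return the waiting time of each process instance (token)."""
    resource_free_at = 0.0
    waits = []
    for t in sorted(arrivals):
        start = max(t, resource_free_at)   # wait if the resource is busy
        waits.append(start - t)
        resource_free_at = start + task_duration
    return waits

# Three orders arriving 1 time unit apart at a task that takes 2 units:
print(simulate_waits([0.0, 1.0, 2.0], 2.0))  # [0.0, 1.0, 2.0]
```

Even this toy shows why plain BPMN is not enough for simulation: the waiting times come entirely from data (durations, arrivals, capacity) that the notation itself does not record.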

Relevance:

100.00%

Publisher:

Abstract:

This manual captures the experience of practitioners in the Iowa Department of Transportation's (Iowa DOT's) Office of Location and Environment (OLE). It also documents the need for coordinated project development efforts during the highway project planning, or location study, phase and engineering design. The location study phase establishes:

* The definition of, and need for, the highway improvement project
* The range of alternatives and many key attributes of the project's design
* The recommended alternative, its impacts, and the agreed-to conditions for project approval

The location study process involves developing engineering alternatives, collecting engineering and environmental data, and completing design refinements to accomplish functional designs. The items above also embody the basic content required for projects compliant with the National Environmental Policy Act (NEPA) of 1969, which directs federal agencies to use a systematic, interdisciplinary approach during the planning process whenever proposed actions (or "projects") have the potential for environmental impacts. In doing so, NEPA requires coordination with stakeholders, review, comment, and public disclosure. Are location studies and environmental studies more about the process or the documents? If properly conducted, they concern both: unbiased and reasonable processes with quality and timely documents. In essence, every project is a story that needs to be told. Engineering and environmental regulations and guidance, as documented in this manual, will help project staff and managers become better storytellers.

Relevance:

100.00%

Publisher:

Abstract:

Technological development and the accelerating pace of life drive both the need and the possibilities for implementing personal guidance systems. Short-range wireless communication technologies make it possible to implement various location-based services, such as guidance systems, at reasonable cost. Among the systems on the market, it is hard to find one suited to real-time indoor guidance, and most of them rely on WLAN technology, which is not widely supported in portable terminals such as mobile phones. This thesis highlights the problems of real-time systems implemented with Bluetooth technology and presents one possible solution. Operation is hampered mainly by the long connection setup time, which consists of the long and hard-to-predict time spent discovering the devices in the network and the time spent establishing a connection to the selected target. In the implemented Bluetooth guidance system, discovering the guided devices when switching access points could be omitted, because the information required to establish a connection is delivered to the access points over a fixed Ethernet network. Usage experience with the resulting guidance system shows that designing a guidance network is a challenging task, but once the network is up and running, the performance of the system improves considerably. The demonstration implementation restricts the usable hardware to Linux-based systems, although to ensure wider adoption the system should be made portable to, for example, the Symbian platform.
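The saving from skipping device discovery can be put in rough numbers. The 10.24 s figure below is the standard full Bluetooth inquiry duration; the paging time is a loose assumption, so treat this only as a back-of-the-envelope sketch.

```python
# Rough model of handover time with and without device discovery.
# INQUIRY_S is the standard full Bluetooth inquiry duration; PAGE_S is an
# assumed average time to connect to an already-known Bluetooth address.
INQUIRY_S = 10.24
PAGE_S = 1.28

def handover_time(address_known: bool) -> float:
    """Seconds to reach the target device after switching access points."""
    return PAGE_S if address_known else INQUIRY_S + PAGE_S

saved = handover_time(address_known=False) - handover_time(address_known=True)
print(f"skipping discovery saves about {saved:.2f} s per handover")
```

Distributing device addresses over the fixed Ethernet network removes the inquiry term entirely, which is why the thesis reports such a large performance improvement once the network is set up.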

Relevance:

100.00%

Publisher:

Abstract:

As the use of various mobile networks becomes more common, new application areas emerge, such as location-aware applications. Because of mobility, the usage situations of applications vary. Information about these situations can be collected and put to use. Context information means additional information related to an application's usage situation or its user. Developing location- and context-aware applications requires many systems that support software development. Because of the loose definition of context information, there are not yet clear operating models for developing context-aware applications. Systems that assist the development of context-aware applications have been created, especially in research, but they have not yet spread into wider use. The use of location information, by contrast, is quite standardized, but location is seen as only one part of context information. In this Master's thesis, systems supporting location- and context-aware application development were implemented, enabling the use of location and context information in applications. A SOAP service interface was implemented for exploiting location information obtained from a WLAN network. To exploit context information, mediator components were implemented in the MUPE application environment for location, weather, and exercise-bike training data as well as RFID observations. These components were used to create context-aware applications both in the code camp courses of the Department of Communications Engineering and in a context-aware game application. The work resulted in working applications and in mediator components for creating applications. As a conclusion, without components supporting context-aware application development, this type of development would be considerably more demanding. Supporting components ease application development, but they also easily limit the development possibilities.
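The mediator components can be pictured as a publish/subscribe broker sitting between context sources (location, weather, RFID) and applications. The class below is an illustrative toy, not the actual MUPE API; all names are invented.

```python
class ContextBroker:
    """Toy mediator between context sources and applications.
    Illustrative only - the real MUPE components are not reproduced here."""

    def __init__(self):
        self._subscribers = {}            # context type -> list of callbacks

    def subscribe(self, ctx_type, callback):
        """An application registers its interest in one type of context."""
        self._subscribers.setdefault(ctx_type, []).append(callback)

    def publish(self, ctx_type, value):
        """A context source pushes a new observation to all subscribers."""
        for cb in self._subscribers.get(ctx_type, []):
            cb(value)

broker = ContextBroker()
seen = []
broker.subscribe("location", seen.append)        # application side
broker.publish("location", {"ap": "WLAN-AP-7"})  # source side (invented data)
print(seen)
```

The point of such a mediator, as the thesis concludes, is that applications never talk to the sensors directly, which makes development easier but also constrains it to whatever context types the components expose.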

Relevance:

100.00%

Publisher:

Abstract:

This study presents the design process of agricultural machines and implements by means of a reference model, formulated to explain the development activities of new products, to serve as a guideline for training human resources, and to assist in formalizing the process in small and medium-sized businesses (SMBs), i.e., those with up to 500 employees. The methodology included process modeling, carried out from case studies in SMBs, and the study of reference models in the literature. The modeling formalism used was based on the IDEF0 standard, which identifies the dimensions required for detailing the model: input information, activities, tasks, knowledge domains, mechanisms, controls, and information produced. These dimensions were organized in spreadsheets and graphs. As a result, a reference model with 27 activities and 71 tasks was obtained, distributed over four phases of the design process. The model was evaluated by the companies participating in the case studies and by experts, who concluded that it explains the actions needed to develop new products in SMBs.
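The IDEF0 dimensions listed above map naturally onto a small record type. The activity content below is invented for illustration; it is not one of the model's 27 actual activities.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """One reference-model activity with IDEF0-style dimensions."""
    name: str
    inputs: list = field(default_factory=list)      # input information
    controls: list = field(default_factory=list)    # norms and constraints
    mechanisms: list = field(default_factory=list)  # people and tools
    outputs: list = field(default_factory=list)     # information produced

act = Activity(
    name="Define target specifications",            # invented example
    inputs=["customer needs"],
    controls=["design standards"],
    mechanisms=["design team"],
    outputs=["target specifications"],
)
print(act.outputs)
```

Organizing activities as uniform records like this is essentially what the study's spreadsheets do, which is what makes the model usable as a training guideline.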

Relevance:

100.00%

Publisher:

Abstract:

Process management refers to improving the key functions of a company. The main functions of the case company - project management, procurement, finance, and human resources - use their own separate systems. The case company is in the process of changing its software: in the future, the different functions will use the same system. This software change alters some of the company's processes, among them the project cash flow forecasting process. Cash flow forecasting ensures the sufficiency of money and prepares for possible changes in the future, helping to ensure the company's viability. The purpose of the research is to describe a new project cash flow forecasting process. In addition, the aim is to analyze, through process measurement, the impacts of the process change on the project control department's workload and resources, and how those impacts are taken into account in the department's future operations. The research is based on process management. Processes, their descriptions, and the way process management uses this information are discussed in the theory part, which is based on literature and articles. Project cash flow and the benefits of forecasting are also discussed. After this, the as-is and to-be project cash flow forecasting processes are described using information obtained from the theoretical part as well as the know-how of the project control department's personnel. Written descriptions and cross-functional flowcharts are used for the descriptions. Process measurement is based on interviews with the personnel, mainly cost controllers and department managers. The process change and the integration of the two processes will free up work time for other things, for example, the analysis of costs. In addition, the quality of the cash flow information will improve compared to the as-is process.
When analyzing the department's other main processes, the department's roles and their responsibilities should also be reviewed and redesigned. This way, there will be an opportunity to achieve the best possible efficiency and cost savings.
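At its core, a project cash flow forecast of the kind the to-be process produces reduces to cumulative period-end balances. The figures in this sketch are invented, purely to show the arithmetic.

```python
def project_balances(opening_balance, inflows, outflows):
    """Period-end balances from forecast inflows and outflows (toy example).
    A negative balance would flag a period needing financing attention."""
    balances = []
    balance = opening_balance
    for cash_in, cash_out in zip(inflows, outflows):
        balance += cash_in - cash_out
        balances.append(balance)
    return balances

# Invented two-month project forecast:
print(project_balances(100.0, inflows=[50.0, 60.0], outflows=[70.0, 30.0]))
```

This is the "sufficiency of money" check in its simplest form; the thesis's contribution is in the surrounding process (who produces the numbers, how, and in which system), not in the arithmetic itself.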

Relevance:

100.00%

Publisher:

Abstract:

This study focuses on the onset of the southwest monsoon over Kerala. The India Meteorological Department (IMD) has been using a semi-objective method to define monsoon onset. The main objectives of the study are to understand the monsoon onset processes; to simulate monsoon onset in a GCM using as input the atmospheric conditions and sea surface temperature 10 days before the onset; to develop a method for medium-range prediction of the date of onset of the southwest monsoon over Kerala; and to examine the possibility of objectively defining the date of Monsoon Onset over Kerala (MOK). It gives a broad description of regional monsoon systems and monsoon onsets over Asia and Australia. The Asian monsoon includes two separate subsystems, the Indian monsoon and the East Asian monsoon. This study shows that the duration of the different phases of the onset process depends on the period of the ISO. Based on the study of the monsoon onset process, modeling studies can be done for a better understanding of ocean-atmosphere interaction, especially that associated with the warm pool in the Bay of Bengal and the Arabian Sea.

Relevance:

100.00%

Publisher:

Abstract:

Managing the great complexity of an enterprise system - due to the number of entities and the variety of decisions and processes to be controlled - is a very hard task, because it involves integrating the enterprise's operations and its information systems. Moreover, enterprises find themselves in a constant process of change, reacting to a dynamic and competitive environment in which their business processes are constantly altered. Transforming business processes into models allows them to be analyzed and redefined, and the use of computing tools makes it possible to minimize the cost and risks of an enterprise integration design. This article argues for the necessity of modeling processes in order to define enterprise business requirements more precisely, and for the adequate usage of modeling methodologies. Following these guidelines, the paper addresses process modeling in the domain of demand forecasting as a practical example. The demand forecasting domain was built based on a theoretical review. The resulting models, considered as a reference model, are transformed into information systems, with the aim of introducing a generic solution and serving as a starting point for better forecasting practice. The proposal is to promote the adequacy of the information system to the real needs of an enterprise, enabling it to obtain and track better results while minimizing design errors, time, money, and effort. The enterprise process models are obtained using the CIMOSA language, and the supporting information system is modeled with UML.
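As a concrete stand-in for the demand forecasting domain that the reference model formalizes, here is the simplest possible forecasting method (a moving average) in Python. The reference model itself is expressed in CIMOSA, not code, and the demand figures are invented.

```python
def moving_average_forecast(history, window=3):
    """Forecast next-period demand as the mean of the last `window` periods.
    A deliberately simple stand-in for the forecasting methods an
    information system derived from the reference model might implement."""
    if len(history) < window:
        raise ValueError("not enough demand history")
    return sum(history[-window:]) / window

# Invented monthly demand figures:
print(moving_average_forecast([120, 130, 125, 135]))  # (130+125+135)/3
```

The article's point is precisely that such a computation should not be bolted on ad hoc: the process model specifies where the forecast is produced, with which inputs, and who consumes it.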

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work is to introduce and demonstrate the technical feasibility of continuous flash fermentation for the production of butanol. The evaluation was carried out through mathematical modeling and computer simulation, which is a good approach at such a process development stage. The process consists of three interconnected units: the fermentor, the cell retention system (tangential microfiltration), and the vacuum flash vessel (responsible for the continuous recovery of butanol from the broth). The efficiency of this process was experimentally validated for ethanol fermentation, whose main results are also shown. With the proposed design, the concentration of butanol in the fermentor was lowered from 11.3 to 7.8 g/l, which represents a significant reduction in the inhibitory effect. As a result, the final concentration of butanol was 28.2 g/l for a broth with 140 g/l of glucose. Solvent productivity and yield were, respectively, 11.7 g/l.h and 33.5% for a sugar conversion of 95.6%. Positive aspects of the flash fermentation process are the solvent productivity, the use of a concentrated sugar solution, and the final butanol concentration. The last two features can be responsible for a meaningful reduction in distillation costs and result in environmental benefits due to the lower quantities of wastewater generated by the process. © 2008 Berkeley Electronic Press. All rights reserved.
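The reported figures can be cross-checked, assuming yield is defined on consumed sugar (the paper may define it differently, so this is only a plausibility check, not a reproduction of the model).

```python
feed_glucose = 140.0   # g/l glucose in the broth (from the abstract)
conversion = 0.956     # fraction of sugar converted (from the abstract)
solvent_yield = 0.335  # g solvents per g sugar consumed (assumed definition)

sugar_consumed = feed_glucose * conversion          # ~133.8 g/l
total_solvents = solvent_yield * sugar_consumed     # ~44.8 g/l total solvents
butanol = 28.2                                      # g/l, from the abstract

# Butanol as a fraction of total solvents:
print(round(total_solvents, 1), round(butanol / total_solvents, 2))
```

Under this assumed definition, butanol comes out at roughly 63% of the total solvents, which is consistent with typical ABE fermentation product splits, so the abstract's numbers hang together.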

Relevance:

100.00%

Publisher:

Abstract:

Over the last few years, Business Process Management (BPM) has achieved increasing popularity and dissemination. An analysis of the underlying assumptions of BPM shows that it pursues two apparently contradicting goals: on the one hand, it aims at formalising work practices into business process models; on the other hand, it intends to confer flexibility on the organization - i.e., to maintain its ability to respond to new and unforeseen situations. This paper analyses the relationship between formalisation and flexibility in business process modelling by means of an empirical case study of a BPM project in an aircraft maintenance company. A qualitative approach is adopted, based on Actor-Network Theory. The paper offers two major contributions: (a) it illustrates the sociotechnical complexity involved in BPM initiatives; (b) it points towards a multidimensional understanding of the relation between formalisation and flexibility in BPM projects.

Relevance:

100.00%

Publisher:

Abstract:

In electronic commerce, systems development is based on two fundamental types of models: business models and process models. A business model is concerned with value exchanges among business partners, while a process model focuses on operational and procedural aspects of business communication. Thus, a business model defines the what in an e-commerce system, while a process model defines the how. Business process design can be facilitated and improved by a method for systematically moving from a business model to a process model. Such a method would provide support for traceability, evaluation of design alternatives, and a seamless transition from analysis to realization. This work proposes a unified framework that can be used as a basis to analyze, interpret, and understand the different concepts associated with the different stages of e-commerce system development. In this thesis, we illustrate how UN/CEFACT's recommended metamodels for business and process design can be analyzed, extended, and then integrated into final solutions based on the proposed unified framework. As an application of the framework, we also demonstrate how process-modeling tasks can be facilitated in e-commerce system design. The proposed methodology, called BP3 (for Business Process Patterns Perspective), uses a question-answer interface to capture different business requirements from the designers. It is based on pre-defined process patterns, and the final solution is generated by applying the captured business requirements, by means of a set of production rules, to complete the inter-process communication among these patterns.
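The question-answer plus production-rules idea can be sketched as matching captured answers against rule conditions. The rules and pattern names below are invented for illustration; they are not BP3's actual patterns.

```python
# Hypothetical BP3-style selection: each production rule maps a condition
# on the designer's answers to a process pattern. All rules are invented.
RULES = [
    ({"payment": "before_delivery"}, "prepayment pattern"),
    ({"payment": "after_delivery"}, "invoicing pattern"),
    ({"shipping": "partial"}, "split-delivery pattern"),
]

def select_patterns(answers):
    """Return every pattern whose condition matches the captured answers."""
    return [pattern for cond, pattern in RULES
            if all(answers.get(k) == v for k, v in cond.items())]

print(select_patterns({"payment": "after_delivery", "shipping": "partial"}))
```

In the real methodology the selected patterns would then be wired together (the "inter-process communication" step); here the sketch stops at selection.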

Relevance:

100.00%

Publisher:

Abstract:

With a steady increase of regulatory requirements for business processes, automation support for compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we reflect on and discuss compliance-checking approaches in order to provide insight into their generalizability and evaluation. The results imply that current approaches mainly focus on specific modeling techniques and/or a restricted set of compliance rule types. Most approaches abstain from real-world evaluation, which raises the question of their practical applicability. Referring to the search results, we propose a roadmap for further research in model-based business process compliance checking.
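A compliance rule of the common "A must precede B" type can be checked over a linearized process model. Real approaches in this literature work on process graphs or temporal logic, so the Python check below is a strong simplification with invented task names.

```python
def precedes(model_sequence, first, second):
    """True if task `first` occurs before task `second` in a linearized
    process model. Both tasks must be present for the rule to hold."""
    if first not in model_sequence or second not in model_sequence:
        return False
    return model_sequence.index(first) < model_sequence.index(second)

# Invented process and rules:
process = ["receive order", "check credit", "ship goods", "send invoice"]
print(precedes(process, "check credit", "ship goods"))   # rule satisfied
print(precedes(process, "send invoice", "ship goods"))   # rule violated
```

The survey's generalizability question is visible even here: this check assumes a total order of tasks, which most real modeling techniques (with gateways and parallelism) do not provide.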

Relevance:

100.00%

Publisher:

Abstract:

Over the last decades, neuropsychological theories have tended to consider cognitive functions as the result of the brain working as a whole, rather than of individual local areas of its cortex. Studies based on neuroimaging techniques have multiplied in recent years, promoting an exponential growth of the body of knowledge about the relations between cognitive functions and brain structures [1]. However, such fast evolution makes it complicated to integrate these findings into verifiable theories and, even more, to translate them into cognitive rehabilitation. The aim of this research work is to develop a cognitive process-modeling tool. The purpose of this system is, in the first place, to represent multidimensional data from structural and functional connectivity, neuroimaging, lesion studies, and clinical interventions [2][3]. This will allow the identification of consolidated knowledge, hypotheses, experimental designs, new data from ongoing studies, and emerging results from clinical interventions. In the second place, we aim to use Artificial Intelligence to assist in decision making, allowing us to advance towards evidence-based and personalized treatments in cognitive rehabilitation. This work presents the knowledge base design of the knowledge representation tool. It is composed of two different taxonomies (structure and function) and a set of tags linking both taxonomies at different levels of structural and functional organization. The remainder of the abstract is organized as follows: Section 2 presents the web application used for gathering the information needed to generate the knowledge base, Section 3 describes the knowledge base structure, and Section 4 presents the conclusions reached.
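The two-taxonomy design with linking tags reduces to a simple data structure. All entries below are invented examples, not content of the actual knowledge base.

```python
# Structural and functional taxonomies as child -> parent links,
# joined by cross-taxonomy tags. Example data is invented.
structure = {"hippocampus": "temporal lobe", "temporal lobe": "cortex"}
function = {"episodic memory": "memory", "memory": "cognition"}
tags = [("hippocampus", "episodic memory")]   # structure <-> function links

def functions_of(region):
    """Functions tagged to a structural node."""
    return [f for s, f in tags if s == region]

print(functions_of("hippocampus"))
```

Because the tags link nodes rather than whole taxonomies, new findings can be attached at whatever level of structural or functional organization the evidence supports, which is the design goal the abstract describes.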

Relevance:

100.00%

Publisher:

Abstract:

Data centers are easily found in every sector of the worldwide economy. They consist of tens of thousands of servers, serving millions of users globally, 24-7. In recent years, e-Science applications such as e-Health or Smart Cities have experienced a significant development. The need to deal efficiently with the computational needs of next-generation applications, together with the increasing demand for higher resources in traditional applications, has facilitated the rapid proliferation and growth of data centers.
A drawback of this capacity growth has been the rapid increase in the energy consumption of these facilities. In 2010, data center electricity represented 1.3% of all the electricity used in the world. In the year 2012 alone, global data center power demand grew 63% to 38GW. A further rise of 17% to 43GW was estimated in 2013. Moreover, data centers are responsible for more than 2% of total carbon dioxide emissions. This PhD thesis addresses the energy challenge by proposing proactive and reactive thermal- and energy-aware optimization techniques that contribute to placing data centers on a more scalable curve. This work develops energy models and uses knowledge about the energy demand of the workload to be executed and the computational and cooling resources available at the data center to optimize energy consumption. Moreover, data centers are considered a crucial element within their application framework, optimizing not only the energy consumption of the facility but the global energy consumption of the application. The main contributors to the energy consumption in a data center are the computing power drawn by IT equipment and the cooling power needed to keep the servers within a certain temperature range that ensures safe operation. Because of the cubic relation of fan power with fan speed, solutions based on over-provisioning cold air to the server usually lead to inefficiencies. On the other hand, higher chip temperatures lead to higher leakage power because of the exponential dependence of leakage on temperature. Moreover, workload characteristics as well as allocation policies have an important impact on the leakage-cooling tradeoffs. The first key contribution of this work is the development of power and temperature models that accurately describe the leakage-cooling tradeoffs at the server level, and the proposal of strategies to minimize server energy via joint cooling and workload management from a multivariate perspective.
When scaling to the data center level, a similar behavior in terms of leakage-temperature tradeoffs can be observed. As room temperature rises, the efficiency of the data room cooling units improves. However, as room temperature increases, CPU temperature rises and so does leakage power. Moreover, the thermal dynamics of a data room exhibit unbalanced patterns due to both the workload allocation and the heterogeneity of the computing equipment. The second main contribution is the proposal of thermal- and heterogeneity-aware workload management techniques that jointly optimize the allocation of computation and cooling to servers. These strategies need to be backed up by flexible room-level models, able to work at runtime, that describe the system from a high-level perspective. Within the framework of next-generation applications, decisions taken at this scope can have a dramatic impact on the energy consumption of lower abstraction levels, i.e., the data center facility. It is important to consider the relationships between all the computational agents involved in the problem, so that they can cooperate to achieve the common goal of reducing energy in the overall system. The third main contribution is the energy optimization of the overall application by evaluating the energy costs of performing part of the processing in any of the different abstraction layers, from the node to the data center, via workload management and off-loading techniques. In summary, the work presented in this PhD thesis makes contributions on leakage- and cooling-aware server modeling and optimization, data center thermal modeling and heterogeneity-aware data center resource allocation, and develops mechanisms for the energy optimization of next-generation applications from a multi-layer perspective.
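The two effects named in the abstract (cubic fan power, roughly exponential leakage) can be written down directly. Every constant in this sketch is made up, since the thesis's fitted models are not reproduced here.

```python
import math

def fan_power(speed_rpm, k=2e-10):
    """Fan power grows with the cube of fan speed (k is an invented constant)."""
    return k * speed_rpm ** 3

def leakage_power(temp_c, p0=5.0, alpha=0.03, t_ref=40.0):
    """Leakage power grows roughly exponentially with chip temperature
    (p0, alpha, t_ref are invented constants, not fitted values)."""
    return p0 * math.exp(alpha * (temp_c - t_ref))

# Doubling fan speed multiplies fan power by ~8, which is why simply
# over-provisioning cold air is inefficient; but letting the chip run
# hotter raises leakage instead - the tradeoff the thesis optimizes.
print(fan_power(6000) / fan_power(3000))   # ~8x
print(leakage_power(60.0) > leakage_power(40.0))
```

Joint cooling and workload management then amounts to picking the fan speed and workload placement that minimize the sum of these (and the dynamic power) terms, rather than minimizing either one alone.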