924 results for Reasonable Length of Process


Relevance:

100.00%

Publisher:

Abstract:

Development engineers work with languages intended for software or hardware system design, while test engineers use languages suited to verification, analysis of system properties, and testing. Automatic interfaces between languages of these two kinds are necessary to avoid ambiguous interpretations of system model specifications and inconsistencies in the initial requirements for system development. This paper proposes an algorithm for the automatic translation of MSC (Message Sequence Chart) diagrams compliant with the MSC'2000 standard into Petri Nets. Each input MSC diagram is translated into a Petri Net (PN), and the resulting PNs are composed sequentially to synthesize the whole system as one final combined PN. The principle of this composition is defined through a basic element of the MSC language: conditions. During translation, a reference table is built to maintain consistent correspondence between the input system descriptions in MSC and in PN format; this table is needed to present the results of analysis and verification performed on the PN back to the development engineer as MSC diagrams. The proof of the algorithm's correctness is based on the process algebra ACP. The most significant feature of the algorithm is its handling of conditions. Future work will aim at an integrated, partially or fully automated technological process that allows a system to be designed, tested, and verified for its various properties within a single framework.
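
As an illustration only (not the algorithm proved correct in the paper), the Python sketch below shows the basic idea of mapping MSC message exchanges onto Petri Net places and transitions and composing the resulting nets at a shared condition; all names and the composition detail are hypothetical.

```python
# Minimal sketch: each MSC message becomes a PN transition with a place for
# the sender before emission and the receiver after consumption; nets for
# successive diagrams are joined at a place named after a shared condition.
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    places: set = field(default_factory=set)
    transitions: set = field(default_factory=set)
    arcs: set = field(default_factory=set)       # (source, target) pairs

def msc_to_pn(diagram_name, messages):
    """messages: list of (sender, receiver, label) tuples from one MSC."""
    net = PetriNet()
    for i, (snd, rcv, label) in enumerate(messages):
        t = f"{diagram_name}:{label}"
        p_in = f"{diagram_name}:{snd}_{i}"        # sender ready to emit
        p_out = f"{diagram_name}:{rcv}_{i + 1}"   # receiver after consumption
        net.transitions.add(t)
        net.places.update({p_in, p_out})
        net.arcs.update({(p_in, t), (t, p_out)})
    return net

def compose(net_a, net_b, condition):
    """Sequential composition: union the two nets and record the shared MSC
    condition as a fusion place. Wiring net_a's exit transitions and net_b's
    entry places to this place is elided in this toy version."""
    return PetriNet(net_a.places | net_b.places | {condition},
                    net_a.transitions | net_b.transitions,
                    net_a.arcs | net_b.arcs)

pn1 = msc_to_pn("Setup", [("A", "B", "connect"), ("B", "A", "ack")])
pn2 = msc_to_pn("Data", [("A", "B", "send")])
system = compose(pn1, pn2, condition="Connected")
print(len(system.places), "places,", len(system.transitions), "transitions")
```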

Relevance:

100.00%

Publisher:

Abstract:

This paper considers the problem of managing business processes with changeable structure and offers a situational approach to its solution. The approach is based on a situational model of business process management in which a process is represented as a set of situations. A script defining the necessary actions is associated with each situation, and the process is managed by means of rules that formalize the functional requirements placed on the process.
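
A minimal Python sketch of the situational idea described above, with hypothetical rules, situations and scripts; it is not the formal model from the paper.

```python
# Illustrative sketch: a process is a set of situations; each situation is
# bound to a script of actions, and formalized rules decide which situations
# apply to the current process state.

def rule_missing_approval(state):
    return not state.get("approved", False)

def rule_order_overdue(state):
    return state.get("days_open", 0) > 30

# Each rule, when satisfied, maps the process into a named situation.
RULES = [
    (rule_missing_approval, "awaiting_approval"),
    (rule_order_overdue, "escalation"),
]

# Each situation carries the script (sequence of actions) to execute.
SCRIPTS = {
    "awaiting_approval": ["notify_manager", "suspend_downstream_tasks"],
    "escalation": ["reassign_owner", "raise_priority"],
}

def manage(state):
    """Evaluate the rules against the current state and return the scripts
    of every situation that currently holds."""
    active = [situation for rule, situation in RULES if rule(state)]
    return {s: SCRIPTS[s] for s in active}

print(manage({"days_open": 45, "approved": False}))
```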

Relevance:

100.00%

Publisher:

Abstract:

Fermentation processes, as objects of modelling and high-quality control, are characterized by interdependent and time-varying process variables, which lead to non-linear models with a very complex structure. Conventional optimization methods therefore cannot deliver a satisfactory solution. As an alternative, genetic algorithms, as a stochastic global optimization method, can be applied to overcome these limitations. Their robustness and ability to reach a global minimum make them suitable and practical for parameter identification of fermentation models. Different types of genetic algorithms, namely simple, modified and multi-population ones, have been applied and compared for estimating the parameters of a nonlinear dynamic model of fed-batch cultivation of S. cerevisiae.
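
A hedged sketch of the approach: a simple genetic algorithm estimating two parameters of an illustrative Monod-type growth model from synthetic data. The model, parameter ranges and GA settings are assumptions, not the fed-batch S. cerevisiae model used in the study.

```python
# Simple GA for parameter identification of an illustrative growth model.
import random

random.seed(1)

def simulate(mu_max, Ks, S0=10.0, X0=0.1, dt=0.1, steps=100):
    """Small Euler integration of Monod growth on one substrate."""
    X, S, traj = X0, S0, []
    for _ in range(steps):
        mu = mu_max * S / (Ks + S)
        X += mu * X * dt
        S = max(S - mu * X * dt / 0.5, 0.0)   # assumed yield coefficient 0.5
        traj.append(X)
    return traj

TRUE = (0.4, 1.5)                              # synthetic "measured" data
DATA = simulate(*TRUE)

def fitness(ind):
    sim = simulate(*ind)
    return -sum((a - b) ** 2 for a, b in zip(sim, DATA))   # maximize -SSE

def ga(pop_size=30, generations=60):
    pop = [(random.uniform(0.1, 1.0), random.uniform(0.1, 5.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)   # crossover
            child = (child[0] + random.gauss(0, 0.02),       # mutation
                     child[1] + random.gauss(0, 0.1))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print("estimated (mu_max, Ks):", ga())
```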

Relevance:

100.00%

Publisher:

Abstract:

In today's modern manufacturing industry there is an increasing need to improve internal processes to meet diverse client needs. Process re-engineering is an important activity that is well understood by industry, but its rate of application within small to medium-sized enterprises (SMEs) is less developed. Business pressures shift the focus of SMEs toward winning new projects and contracts rather than developing long-term, sustainable manufacturing processes. Variations in manufacturing processes are inevitable, but the amount of non-conformity often exceeds acceptable levels. This paper focuses on the re-engineering of the manufacturing and verification procedure for discrete parts production with the aim of enhancing process control and product verification. The ideologies of the 'Push' and 'Pull' approaches to manufacturing are useful in the context of process re-engineering for data improvement. Currently, information is pulled from the market and prominent customers, and manufacturing companies try to make the right product by following customer procedures that attempt to verify against specifications. This approach can result in significant quality control challenges. The aim of this paper is to highlight the importance of process re-engineering in product verification in SMEs. Leadership, culture, ownership and process management are among the main attributes required for the successful deployment of process re-engineering. This paper presents the findings from a case study showcasing the application of a modified re-engineering method for the manufacturing and verification process. The findings from the case study indicate there are several advantages to implementing the re-engineering method outlined in this paper.

Relevance:

100.00%

Publisher:

Abstract:

The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on production of intact functional proteins. To achieve this, improved tools are needed for bio-processing. For example, implementation of process modeling and high-throughput technologies can be used to achieve quality by design, leading to improvements in productivity. Commercially, the most sought after targets are secreted proteins due to the ease of handling in downstream procedures. This chapter outlines different approaches for production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.

Relevance:

100.00%

Publisher:

Abstract:

New media platforms have changed the media landscape forever, altering our perceptions of the limits of communication and the reception of information. Platforms such as Facebook, Twitter and WhatsApp enable individuals to circumvent the traditional mass media, converging audience and producer to create millions of 'citizen journalists'. This new breed of journalist uses these platforms not only to receive news but to instantaneously, and often spontaneously, express opinions and vent and share emotions, thoughts and feelings. They are liberated from cultural and physical restraints such as time, space and location, and they are not constrained by factors that impact the traditional media, such as editorial control, owner or political bias, or the pressures of generating commercial revenue. A consequence of the way in which these platforms have become ingrained within our social culture is that habits, conventions and social norms that were once informal and transitory manifestations of social life are now infused within their use. Casual and ephemeral actions or acts of expression, such as conversing with friends or colleagues, swapping or displaying pictures, or exchanging thoughts that were once kept private or shared with a select few, have now become formalised and potentially permanent, on view for the world to see. Incidentally, 'traditional' journalists and media outlets are also utilising new media, as it allows them to react and disseminate news instantaneously within a hyper-competitive marketplace. However, in a world where we are saturated not only by citizen journalists but by traditional media outlets offering access to news and opinion twenty-four hours a day, via multiple new media platforms, there is increased pressure to 'break' news fast and first. This paper argues that new media, and the culture and environment it has created for citizen journalists, traditional journalists and the media generally, has dramatically altered our perceptions of the limits and boundaries of freedom of expression, and that the corollary to this seismic shift is its impact on the notion of privacy and private life. Consequently, the paper examines what a reasonable expectation of privacy may now mean in a new media world.

Relevance:

100.00%

Publisher:

Abstract:

Surface modification by means of nanostructures is of interest to enhance boiling heat transfer in various applications including the organic Rankine cycle (ORC). With the goal of obtaining rough and dense aluminum oxide (Al2O3) nanofilms, the optimal combination of process parameters for electrophoretic deposition (EPD) based on the uniform design (UD) method is explored in this paper. The detailed procedures for the EPD process and UD method are presented. Four main influencing conditions controlling the EPD process were identified as nanofluid concentration, deposition time, applied voltage and suspension pH. A series of tests were carried out based on the UD experimental design. A regression model and statistical analysis were applied to the results. Sensitivity analyses of the effect of the four main parameters on the roughness and deposited mass of Al2O3 films were also carried out. The results showed that Al2O3 nanofilms were deposited compactly and uniformly on the substrate. Within the range of the experiments, the preferred combination of process parameters was determined to be nanofluid concentration of 2 wt.%, deposition time of 15 min, applied voltage of 23 V and suspension pH of 3, yielding roughness and deposited mass of 520.9 nm and 161.6 × 10⁻⁴ g/cm², respectively. A verification experiment was carried out at these conditions and gave values of roughness and deposited mass within 8% error of the expected ones as determined from the UD approach. It is concluded that uniform design is useful for the optimization of electrophoretic deposition, requiring only 7 tests compared to 49 using the orthogonal design method.
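
The sketch below illustrates, on made-up numbers, how a small uniform-design table over the four EPD factors could be fitted with a first-order regression and screened for sensitivity; the design points and responses are placeholders, not the experimental data reported above.

```python
# Illustrative only: 7-run uniform design over the four EPD factors
# (concentration, time, voltage, pH), fitted by ordinary least squares.
import numpy as np

# factor columns: concentration (wt.%), time (min), voltage (V), pH
X = np.array([
    [0.5,  5, 10, 2],
    [1.0, 10, 15, 3],
    [1.5, 15, 20, 4],
    [2.0, 20, 25, 5],
    [2.5,  5, 20, 3],
    [3.0, 10, 25, 2],
    [3.5, 15, 10, 5],
], dtype=float)

# hypothetical roughness responses (nm), not measured values
roughness = np.array([210, 300, 390, 455, 340, 470, 310], dtype=float)

# first-order model: roughness = b0 + b1*c + b2*t + b3*V + b4*pH
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, roughness, rcond=None)
print("fitted coefficients:", np.round(coef, 2))

# crude sensitivity screen: predicted change per unit change in each factor
for name, b in zip(["concentration", "time", "voltage", "pH"], coef[1:]):
    print(f"{name:>14s}: {b:+.2f} nm per unit")
```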

Relevance:

100.00%

Publisher:

Abstract:

The links between operational practices and performance are well studied in the literature, both theoretically and empirically. However, it is mostly internal factors that are examined closely as the basis of operational performance, even though the impact of external, environmental factors is often emphasized. Our research fills part of this gap in the literature. We examine how two environmental factors, market dynamism and competition, impact the use of some operational practices (such as quality improvement, product development, automation, etc.) and the resulting operations and business performance. The method of path analysis is used. Data were acquired through an international survey (IMSS – International Manufacturing Strategy Survey), executed in 2005 in 23 participating countries in so-called "innovative" industries (ISIC 28-35), with a sample of 711 firms. Results show that both market dynamism and competition have a large impact on business performance, but the indirect effects, through operations practices, are rather weak compared to the direct ones. The most influential practices are in the areas of process and control, and quality management.
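
A hedged illustration of the path-analysis logic on synthetic standardized data: the environmental factors feed an aggregate practices score, which in turn feeds performance, so direct and indirect effects can be separated. Variable names and generated data are illustrative only, not the IMSS survey data.

```python
# Path analysis via two OLS regressions on synthetic data:
# environment -> practices, then environment + practices -> performance.
import numpy as np

rng = np.random.default_rng(0)
n = 711                                   # sample size mirrors the survey
dynamism = rng.standard_normal(n)
competition = rng.standard_normal(n)
practices = 0.2 * dynamism + 0.15 * competition + rng.standard_normal(n)
performance = (0.35 * dynamism + 0.30 * competition + 0.1 * practices
               + rng.standard_normal(n))

def ols(y, *xs):
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]                       # drop the intercept

a_dyn, a_comp = ols(practices, dynamism, competition)            # env -> practices
c_dyn, c_comp, b_prac = ols(performance, dynamism, competition, practices)

print("direct effects on performance:", round(c_dyn, 3), round(c_comp, 3))
print("indirect via practices:", round(a_dyn * b_prac, 3), round(a_comp * b_prac, 3))
```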

Relevance:

100.00%

Publisher:

Abstract:

Increasing parental involvement was made an important goal for all Florida schools in educational reform legislation in the 1990s. A forum for this input was established and became known as the School Advisory Council (SAC). To demonstrate the importance of process and inclusion, a south Florida school district and its local teachers' union agreed on the following five goals for SACs: (a) to foster an environment of professional collaboration among all stakeholders, (b) to assist in the preparation and evaluation of the school improvement plan, (c) to address all state and district goals, (d) to serve as the avenue for authentic and representative input from all stakeholders, and (e) to ensure the continued existence of the consensus-building process on all issues related to the school's instructional program. The purpose of this study was to determine to what extent and in what ways the parent members of one south Florida middle school's SAC achieved the five district goals during its first three years of implementation. The primary participants were 16 parents who served as members of the SAC, while 16 non-parent members provided perspective on parent involvement as "outside sources." Being qualitative by design, the study described factors such as school climate, leadership styles, and the quality of parental input from data collected from four sources: parent interviews, a questionnaire of non-parents, researcher observations, and relevant documents. A cross-case analysis of all data informed a process evaluation that described the similarities and differences of intended and observed outcomes of parent involvement from each source using Stake's descriptive matrix model. A formative evaluation of the process compared the observed outcomes with standards set for successful SACs, such as the district's five goals. The findings indicated that parents elected to the SAC did not meet the intended goals set by the state and district. The school leadership did not foster an environment of professional collaboration and authentic decision-making for parents and other stakeholders. The overall process did not include consensus-building, and there was little if any input by parents on school improvement and other important issues relating to the instructional program. Only two parents gave the SAC a successful rating for involving parents in the decision-making process. Although compliance was met in many of the procedural transactions of the SAC, the reactions of parents to their perceived role and influence often reflected feelings of powerlessness and frustration with a process that many thought lacked meaningfulness and productivity. Two conclusions drawn from this study are as follows: (a) that the role of the principal in the collaborative process is pivotal, and (b) that the normative-re-educative approach to change would be most appropriate for SACs.

Relevance:

100.00%

Publisher:

Abstract:

Currently the data storage industry is facing huge challenges with the conventional method of recording data known as longitudinal magnetic recording. This technology is fast approaching a fundamental physical limit, known as the superparamagnetic limit. One way of deferring the superparamagnetic limit is the patterning of magnetic media, which exploits lithography tools to predetermine the areal density. Nanofabrication schemes employed to pattern the magnetic material include Focused Ion Beam (FIB), E-beam Lithography (EBL), UV-Optical Lithography (UVL), self-assembled media synthesis and Nanoimprint Lithography (NIL). Although there are many challenges to manufacturing patterned media, the large potential gains offered in terms of areal density make it one of the most promising new technologies on the horizon for future hard disk drives. This dissertation contributes to the development of future alternative data storage devices, and to deferring the superparamagnetic limit, by designing and characterizing patterned magnetic media using a novel nanoimprint replication process called Step and Flash Imprint Lithography (SFIL). As opposed to hot embossing and other high-temperature, low-pressure processes, SFIL can be performed at low pressure and room temperature. The initial experiments consisted of process flow design for the patterned structures on sputtered Ni-Fe thin films, the main one being a defectivity analysis of the SFIL process, conducted by fabricating and testing devices of varying feature sizes (50 nm to 1 μm) and inspecting them optically as well as testing them electrically. Once the SFIL process was optimized, a number of Ni-Fe coated wafers were imprinted with a template carrying the patterned topography. A minimum feature size of 40 nm was obtained with varying pitch (1:1, 1:1.5, 1:2, and 1:3). Characterization involved an extensive SEM study at each processing step as well as Atomic Force Microscopy (AFM) and Magnetic Force Microscopy (MFM) analysis.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation presents a system-wide approach, based on genetic algorithms, for the optimization of transfer times for an entire bus transit system. Optimization of transfer times in a transit system is a complicated problem because of the large set of binary and discrete values involved. The combinatorial nature of the problem imposes a computational burden and makes it difficult to solve by classical mathematical programming methods. The genetic algorithm proposed in this research attempts to find an optimal solution for the transfer time optimization problem by searching for a combination of adjustments to the timetable for all the routes in the system. It makes use of existing scheduled timetables, ridership demand at all transfer locations, and takes into consideration the randomness of bus arrivals. Data from Broward County Transit are used to compute total transfer times. The proposed genetic algorithm-based approach proves to be capable of producing substantial time savings compared to the existing transfer times in a reasonable amount of time. The dissertation also addresses the issues related to spatial and temporal modeling, variability in bus arrival and departure times, walking time, as well as the integration of scheduling and ridership data.
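
An illustrative sketch, with hypothetical routes, headways and ridership, of the core idea: a genetic algorithm searching over timetable offsets to reduce total passenger transfer time. It omits the dissertation's treatment of arrival-time randomness and walking time.

```python
# Toy GA over timetable offsets for three routes sharing transfer stops.
import random

random.seed(2)
HEADWAY = 30                        # minutes between buses on every route
# (from_route, to_route, riders) transferring at a shared stop
TRANSFERS = [(0, 1, 120), (1, 2, 80), (0, 2, 45)]
N_ROUTES = 3

def wait(offsets, a, b):
    """Wait time for a rider arriving on route a and boarding route b."""
    return (offsets[b] - offsets[a]) % HEADWAY

def cost(offsets):
    return sum(riders * wait(offsets, a, b) for a, b, riders in TRANSFERS)

def ga(pop_size=40, generations=200):
    pop = [[random.randrange(HEADWAY) for _ in range(N_ROUTES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_ROUTES)
            child = a[:cut] + b[cut:]                        # crossover
            if random.random() < 0.2:                        # mutation
                child[random.randrange(N_ROUTES)] = random.randrange(HEADWAY)
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = ga()
print("best offsets:", best, "| rider-minutes of waiting:", cost(best))
```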

Relevance:

100.00%

Publisher:

Abstract:

This dissertation is an attempt to use the radical political economy approach, which assumes that there is a connection between a state's strategic interests and the interests of dominant multinational corporations (MNCs) located within a state's territory, to explain continuity in the USAID development agenda and lending patterns during the past 30 years of development aid to Haiti. Employing the qualitative method of "process-tracing," my study concludes that the radical political economy approach has explanatory power for understanding this continuity. The evidence shows that USAID has implemented in Haiti, from the 1980s through the post-9/11 Washington Consensus period, neoliberal policies that conform to the political economy of US multinational corporations (US MNCs). Contrary to the claim that the USAID-sponsored post-earthquake development paradigm departed from previous development strategies, the study shows that USAID used the January 2010 earthquake tragedy to accelerate the implementation in Haiti of a neoliberal agenda congenial to the business promotion of multinational investors, particularly US multinational corporations. In terms of the way ahead, the study argues for a new development approach articulated by a legitimate Haitian state and primarily intended to promote the socioeconomic development of the poorest Haitians.

Relevance:

100.00%

Publisher:

Abstract:

Some authentication models use passwords, keys or personal identifiers (cards, tags, etc.) to authenticate a user in the authentication/identification process. Other systems use biometric data, such as signature, fingerprint or voice, to authenticate an individual. On the other hand, storing biometric data brings risks, such as consistency and protection problems, so biometric databases must be protected to ensure the integrity and reliability of the system. Template-protection models exist for this purpose, for example the Fuzzy Vault and Fuzzy Commitment schemes. These models are currently the most used for protecting biometric data, but they have fragile elements in the protection process. Raising the level of security of these methods, through changes in their structure or by inserting new layers of protection, is therefore one of the goals of this thesis. In other words, this work proposes the simultaneous use of encryption (the Papilio encryption algorithm) with template-protection models (Fuzzy Vault and Fuzzy Commitment) in biometric identification systems. The objective is to improve two aspects of biometric systems: security and accuracy. Furthermore, a reasonable level of efficiency must be maintained through the use of more elaborate classification structures known as committees. The thesis thus proposes a model for safer biometric identification systems.
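
For illustration only, the toy sketch below shows a basic fuzzy-commitment construction: a random key is expanded with a repetition code, XORed with the biometric bit-string to form public helper data, and committed with SHA-256. It is not the thesis's Papilio-based scheme, and it uses a deliberately weak repetition code in place of a real error-correcting code.

```python
# Toy fuzzy commitment: commit to a key bound to an enrolled biometric
# bit-string; a slightly noisy fresh reading recovers the key by majority vote.
import hashlib
import secrets

def repeat_encode(bits):               # 3x repetition code
    return [b for b in bits for _ in range(3)]

def repeat_decode(bits):               # majority vote per 3-bit group
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def commit(biometric_bits, key_bits):
    codeword = repeat_encode(key_bits)
    helper = [c ^ b for c, b in zip(codeword, biometric_bits)]
    digest = hashlib.sha256(bytes(key_bits)).hexdigest()
    return helper, digest              # both can be stored publicly

def open_commitment(helper, digest, fresh_bits):
    noisy_codeword = [h ^ b for h, b in zip(helper, fresh_bits)]
    key_guess = repeat_decode(noisy_codeword)
    ok = hashlib.sha256(bytes(key_guess)).hexdigest() == digest
    return key_guess if ok else None

key = [secrets.randbelow(2) for _ in range(8)]
enrolled = [secrets.randbelow(2) for _ in range(24)]     # 8 key bits * 3
noisy = enrolled[:]
noisy[5] ^= 1                                            # one flipped bit
helper, digest = commit(enrolled, key)
print("recovered key matches:", open_commitment(helper, digest, noisy) == key)
```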

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigated the risk of accidental release of hydrocarbons during transportation and storage. Transportation of hydrocarbons from an offshore platform to processing units through subsea pipelines involves a risk of release due to pipeline leakage resulting from corrosion, plastic deformation caused by seabed shakedown, or damage from contact with a drifting iceberg. The environmental impacts of hydrocarbon dispersion can be severe, and the overall safety and economic concerns of pipeline leakage in the subsea environment are immense. A large leak can be detected with conventional technology such as radar, intelligent pigging or chemical tracers, but in a remote location such as the subsea or the Arctic, a small chronic leak may go undetected for a period of time. In the case of storage, an accidental release of hydrocarbon from a storage tank can lead to a pool fire, which can further escalate into domino effects; this chain of accidents may have extremely severe consequences. Analysis of past accident scenarios shows that more than half of industrial domino accidents involved fire as the primary event, together with other factors such as wind speed and direction, fuel type and engulfment of the compound. In this thesis, a computational fluid dynamics (CFD) approach is taken to model the subsea pipeline leak and the pool fire from a storage tank. The commercial software package ANSYS FLUENT Workbench 15 is used to model the subsea pipeline leakage. The CFD simulation results for four different fluids showed that the static pressure and pressure gradient along the axial length of the pipeline have a sharp signature variation near the leak orifice at steady-state conditions. A transient simulation is performed to obtain the acoustic signature of the pipe near the leak orifice. The power spectral density (PSD) of the acoustic signal is strong near the leak orifice and dissipates as the distance and orientation from the orifice increase; high-pressure fluid flow generates more noise than low-pressure fluid flow. To model the pool fire from the storage tank, ANSYS CFX Workbench 14 is used. The CFD results show that wind speed has a significant effect on the behavior of the pool fire and its domino effects. Radiation contours are also obtained from CFD post-processing and can be applied in risk analysis. The outcome of this study will help in better understanding the domino effects of pool fires in the complex geometrical settings of process industries. Measures to reduce and prevent the risks are discussed based on the results obtained from the numerical simulations.
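
A hedged illustration of the acoustic post-processing step mentioned above: Welch power spectral density estimates for two synthetic pressure signals, one with strong broadband "leak" noise and one attenuated. The signals are synthetic and do not come from the thesis's CFD runs.

```python
# Compare PSDs of a near-leak and a far-field synthetic pressure signal.
import numpy as np
from scipy.signal import welch

fs = 10_000                                   # sampling frequency, Hz
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)

base_flow = np.sin(2 * np.pi * 50 * t)        # ambient flow pulsation
near_leak = base_flow + 0.8 * rng.standard_normal(t.size)   # strong leak noise
far_field = base_flow + 0.1 * rng.standard_normal(t.size)   # attenuated noise

f, psd_near = welch(near_leak, fs=fs, nperseg=1024)
_, psd_far = welch(far_field, fs=fs, nperseg=1024)

# Integrated broadband power is much higher close to the leak orifice.
df = f[1] - f[0]
print("near-leak power:", round(float(np.sum(psd_near) * df), 4))
print("far-field power:", round(float(np.sum(psd_far) * df), 4))
```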