12 results for Requirements Engineering, Requirement Specification
in Digital Commons at Florida International University
Abstract:
The aorta has been viewed as a passive distribution manifold for blood whose elasticity allows it to store blood during cardiac ejection (systole) and release it during relaxation (diastole). This capacitance, or compliance, lowers peak cardiac work input and maintains peripheral blood flow throughout the cardiac cycle. The compliance of the human and canine circulatory systems has been described either as constant throughout the cycle (Toy et al. 1985) or as some inverse function of pressure (Li et al. 1990, Cappelo et al. 1995). This work shows that a compliance value that is higher during systole than during diastole (equivalent to a direct function of pressure) leads to a reduction in the energetic input to the cardiovascular system (CV), even when accounting for the energy required to change compliance. This conclusion is obtained numerically, based on a 3-element lumped-parameter model of the CV, and then demonstrated in a physical model built for the purpose. It is then shown, based on the numerical and physical models, on analytical considerations of elastic tubes, and on the analysis of arterial volume as a function of pressure measured in vivo (Armentano et al. 1995), that the mechanical effects of a presupposed arterial contraction are consistent with those of energetically beneficial changes in compliance during the cardiac cycle. Although the amount of energy potentially saved with rhythmically contracting arteries is small (mean 0.55% for the cases studied), the importance of the phenomenon lies in its possible relation to another function of the arterial smooth muscle (ASM): synthesis of wall matrix macromolecules. It is speculated that a reduction in the rate of collagen synthesis by the ASM is implicated in the formation of arteriosclerosis.
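As a rough illustration of the kind of 3-element lumped-parameter (Windkessel) model mentioned in the abstract above, the following Python sketch integrates aortic pressure over a cardiac cycle and reports the work input. All parameter values and the half-sine ejection waveform are illustrative assumptions, not the dissertation's data; the `compliance()` function is the hook where a systole/diastole-dependent value could be substituted to explore the effect studied.

```python
# Minimal sketch of a 3-element Windkessel model of the systemic circulation.
# Parameters and the inflow waveform are assumed, not taken from the study.
import numpy as np

R_p = 1.0    # peripheral resistance [mmHg·s/mL] (assumed)
Z_c = 0.05   # characteristic impedance [mmHg·s/mL] (assumed)
T   = 0.8    # cardiac period [s]
T_s = 0.3    # systolic ejection duration [s]
SV  = 70.0   # stroke volume [mL]

def q_in(t):
    """Half-sine aortic inflow during systole, zero during diastole."""
    t = t % T
    return (np.pi * SV / (2 * T_s)) * np.sin(np.pi * t / T_s) if t < T_s else 0.0

def compliance(t):
    """Compliance [mL/mmHg]; replace with a phase-dependent value to explore
    the systole-vs-diastole effect discussed in the abstract."""
    return 1.2  # constant baseline case (assumed value)

def simulate(n_cycles=10, dt=1e-4):
    V = compliance(0.0) * 80.0           # stored volume for an initial ~80 mmHg
    work = 0.0                           # input work accumulated over last cycle
    for i in range(int(n_cycles * T / dt)):
        t = i * dt
        P_c = V / compliance(t)          # pressure across the compliance
        Q = q_in(t)
        P_ao = P_c + Z_c * Q             # aortic (input) pressure
        V += (Q - P_c / R_p) * dt        # dV/dt = inflow - peripheral outflow
        if t >= (n_cycles - 1) * T:      # integrate P·Q over the final cycle
            work += P_ao * Q * dt
    return work                          # external work input [mmHg·mL]

print("cardiac work input per cycle:", simulate())
```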
Abstract:
The predictions contained within this dissertation suggest further rapid growth of the cruise industry and a requirement for additional cruise ship berthing worldwide. The factors leading to the tremendous growth in the cruise marketplace are identified and individually addressed. Unfortunately, planning factors associated with the design and construction of cruise ship seaports are not readily available, and methods to manage this growth have not been addressed. This dissertation provides accurate and consolidated planning factors essential for comprehensive consideration of cruise ship requirements and the design of growing cruise ship ports. The consolidation of these factors results in faster and better-informed choices for the port owner/operator with regard to port expansion. Furthermore, this dissertation proposes the development of new systems to better manage increasing passenger and ship traffic. If implemented, these systems will result in optimized port operations providing a greater level of service to passengers and port authorities while simultaneously minimizing environmental and economic impact.
Abstract:
Mediation techniques provide interoperability and support integrated query processing among heterogeneous databases. While such techniques help data sharing among different sources, they increase the risk to data security, such as violating access control rules. Successful protection of information by an effective access control mechanism is a basic requirement for interoperation among heterogeneous data sources.

This dissertation first identified the challenges a mediation system must meet in order to achieve both interoperability and security in an interconnected, collaborative computing environment: (1) context-awareness, (2) semantic heterogeneity, and (3) multiple security policy specification. Few existing approaches address all three security challenges in mediation systems. This dissertation provides a modeling and architectural solution to the problem of mediation security that addresses these challenges. A context-aware flexible authorization framework was developed to deal with the security challenges faced by mediation systems. The authorization framework consists of two major tasks: specifying security policies and enforcing security policies. First, the security policy specification provides a generic and extensible method to model security policies with respect to the challenges posed by the mediation system. The security policies in this study are specified by 5-tuples followed by a series of authorization constraints, which are identified based on the relationships of the different security components in the mediation system. Two essential features of mediation systems, i.e., the relationships among authorization components and interoperability among heterogeneous data sources, are the focus of this investigation. Second, this dissertation supports effective access control on mediation systems while providing uniform access to heterogeneous data sources. The dynamic security constraints are handled in the authorization phase instead of the authentication phase; thus the maintenance cost of the security specification can be reduced compared with related solutions.
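The abstract states that policies are specified as 5-tuples with attached authorization constraints but does not list the tuple's components. The sketch below is one assumed reading (subject, object, action, sign, constraint), written only to show how context-aware, constraint-driven evaluation could work in code; it is not the dissertation's actual policy language.

```python
# Illustrative sketch of a context-aware policy tuple and its evaluation.
# The field names and the deny-overrides strategy are assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Policy:
    subject: str                         # role or user (assumed field)
    obj: str                             # mediated data source or view (assumed)
    action: str                          # e.g. "read", "update" (assumed)
    sign: str                            # "permit" or "deny" (assumed)
    constraint: Callable[[Dict], bool]   # context-dependent authorization constraint

def authorize(policies: List[Policy], subject, obj, action, context: Dict) -> bool:
    """Deny-overrides evaluation over matching policies (one possible strategy)."""
    decision = False
    for p in policies:
        if (p.subject, p.obj, p.action) == (subject, obj, action) and p.constraint(context):
            if p.sign == "deny":
                return False
            decision = True
    return decision

# Usage: a doctor may read the mediated patient view only during working hours.
policies = [
    Policy("doctor", "patient_view", "read", "permit",
           lambda ctx: 8 <= ctx.get("hour", 0) < 18),
]
print(authorize(policies, "doctor", "patient_view", "read", {"hour": 10}))  # True
print(authorize(policies, "doctor", "patient_view", "read", {"hour": 23}))  # False
```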
Abstract:
Software architecture is the abstract design of a software system. It plays a key role as a bridge between requirements and implementation, and serves as a blueprint for development. The architecture represents a set of early design decisions that are crucial to a system, and mistakes in those decisions are very costly if they remain undetected until the system is implemented and deployed. This is where formal specification and analysis fit in. Formal specification ensures that an architecture design is represented in a rigorous and unambiguous way. Furthermore, a formally specified model allows the use of different analysis techniques for verifying the correctness of those crucial design decisions.

This dissertation presented a framework, called SAM, for formal specification and analysis of software architectures. In terms of specification, formalisms and mechanisms were identified and chosen to specify software architecture based on different analysis needs. Formalisms for specifying properties were also explored, especially for non-functional properties. In terms of analysis, the dissertation explored both the verification of functional properties and the evaluation of non-functional properties of software architecture. For the verification of functional properties, methodologies were presented on how to apply existing model checking techniques to a SAM model. For the evaluation of non-functional properties, the dissertation first showed how to incorporate stochastic information into a SAM model, and then explained how to translate the model to existing tools and conduct the analysis using those tools.

To alleviate the analysis work, we also provided a tool to automatically translate a SAM model for model checking. All the techniques and methods described in the dissertation were illustrated by examples or case studies, which also served the purpose of advocating the use of formal methods in practice.
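To make the idea of verifying functional properties of an architecture model concrete, here is a generic explicit-state model checking sketch, not SAM's actual formalism or toolchain. A tiny two-client architecture is modeled as a transition system and exhaustively searched for violations of a mutual-exclusion safety property; because the toy model deliberately omits any arbitration component, the checker reports a reachable violation, illustrating how an early design flaw can be caught before implementation.

```python
# Generic illustration of explicit-state model checking of a safety property.
# The architecture, states, and property are invented for this example.
from collections import deque

def transitions(state):
    """state = (client1, client2), each in {'idle', 'waiting', 'granted'}.
    Each client independently advances; no server arbitration is modeled."""
    succs = []
    for i, c in enumerate(state):
        nxt = {'idle': 'waiting', 'waiting': 'granted', 'granted': 'idle'}[c]
        s = list(state)
        s[i] = nxt
        succs.append(tuple(s))
    return succs

def safe(state):
    return state != ('granted', 'granted')   # mutual-exclusion safety property

def check(initial=('idle', 'idle')):
    seen, frontier = {initial}, deque([initial])
    while frontier:                           # breadth-first reachability search
        s = frontier.popleft()
        if not safe(s):
            return f"violation reachable: {s}"
        for t in transitions(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return "property holds in all reachable states"

print(check())   # reports the reachable violation caused by the missing arbiter
```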
Abstract:
This research aimed at developing a research framework for the emerging field of enterprise systems engineering (ESE). The framework consists of an ESE definition, an ESE classification scheme, and an ESE process. This study views an enterprise as a system that creates value for its customers; accordingly, the framework was developed using systems theory and IDEF methodologies. The study defined ESE as an engineering discipline that develops and applies systems theory and engineering techniques to the specification, analysis, design, and implementation of an enterprise over its life cycle. The proposed ESE classification scheme breaks an enterprise system down into four elements: work, resources, decision, and information. Each enterprise element is specified with four system facets: strategy, competency, capacity, and structure. Each element-facet combination is subject to the engineering process of specification, analysis, design, and implementation, to achieve its pre-specified performance with respect to cost, time, quality, and benefit to the enterprise. The framework is intended for identifying research voids in the ESE discipline. It also helps to apply engineering and systems tools to this emerging field, harnesses the relationships among various enterprise aspects, and bridges the gap between engineering and management practices in an enterprise. The proposed ESE process is generic. It consists of a hierarchy of engineering activities presented in an IDEF0 model, with each activity defined by its input, output, constraints, and mechanisms. The output of an ESE effort can be a partial or whole enterprise system design for its physical, managerial, and/or informational layers. The proposed ESE process is applicable to a new enterprise system design or to an engineering change in an existing system. The long-term goal of this study is the development of a scientific foundation for ESE research and development.
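The classification scheme above is essentially a 4x4 matrix of elements and facets, each cell engineered through the four-step process. The short sketch below is simple bookkeeping over that matrix, showing one way the framework could surface research voids; the coverage data is invented for illustration.

```python
# Enumerate the element-facet cells of the ESE classification scheme and flag
# cells not yet covered by existing work (the coverage set here is hypothetical).
elements = ["work", "resources", "decision", "information"]
facets = ["strategy", "competency", "capacity", "structure"]
process = ["specification", "analysis", "design", "implementation"]

covered = {("work", "structure"), ("resources", "capacity")}   # hypothetical coverage

voids = [(e, f) for e in elements for f in facets if (e, f) not in covered]
print(f"{len(voids)} of {len(elements) * len(facets)} element-facet cells "
      f"lack published ESE research (hypothetical data), e.g. {voids[:3]}")
```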
Abstract:
Current artificial heart valves are classified as mechanical and bioprosthetic. An appealing pathway that promises to overcome the shortcomings of commercially available heart valves is offered by the interdisciplinary approach of cardiovascular tissue engineering. However, the mechanical properties of tissue engineered heart valves (TEHV) are limited, and such valves generally fail in long-term use. To meet this performance challenge, a novel biodegradable triblock copolymer, poly(ethylene oxide)-poly(propylene oxide)-poly(ethylene oxide) (PEO-PPO-PEO, or F108), crosslinked to silk fibroin (F108-SilkC), was investigated for use as a tri-leaflet heart valve material.

Ten polymers of varying concentration and thickness (55 µm, 75 µm and 100 µm) were synthesized via a covalent crosslinking scheme using bifunctional polyethylene glycol diglycidyl ether (PEGDE). Static and fatigue testing were used to assess the mechanical properties of the films, and hydrodynamic testing was performed to determine performance under a simulated left ventricular flow regime. The crosslinked copolymer (F108-SilkC) showed greater flexibility and resilience, but lower ultimate tensile strength, with increasing PEGDE concentration. A molar concentration ratio of 80:1 (F108:silk) and a thickness of 75 µm gave the longest fatigue life in both tension-tension and bending fatigue tests. Four of the twelve valves designed complied satisfactorily with the minimum performance requirements of ISO 5840 (2005).

In conclusion, it was demonstrated that a degradable polymer in conjunction with silk fibroin is applicable to cardiovascular tissue engineering, and specifically to aortic valve leaflet design, meeting the performance demands. Thin films (t < 75 µm) combined with a stiffness lower than 320 MPa (80:1, F108:silk) are essential for the correct functionality of the proposed heart valve biomaterial F108-SilkC. Fatigue tests were shown to be a useful tool to characterize biomaterials that undergo cyclic loading.
Abstract:
This research addresses the problem of cost estimation for product development in engineer-to-order (ETO) operations. An ETO operation starts the product development process with a product specification and ends with delivery of a rather complicated, highly customized product. ETO operations are practiced in various industries such as engineering tooling, factory plants, industrial boilers, pressure vessels, shipbuilding, bridges, and buildings. ETO views each product as a delivery item in an industrial project and needs an accurate estimate of its development cost at the bidding and/or planning stage, before any design or manufacturing activity starts.

Many ETO practitioners rely on an ad hoc approach to cost estimation, using past projects as references and adapting them to the new requirements. This process is often carried out on a case-by-case basis and in a non-procedural fashion, limiting its applicability to other industry domains and its transferability to other estimators. In addition to being time consuming, this approach usually does not lead to an accurate cost estimate; errors range from 30% to 50%.

This research proposes a generic cost modeling methodology for application in ETO operations across various industry domains. Using the proposed methodology, a cost estimator is able to develop a cost estimation model for a chosen ETO industry in a more expeditious, systematic, and accurate manner.

The development of the proposed methodology followed the meta-methodology outlined by Thomann. Deploying the methodology, cost estimation models were created in two industry domains (building construction and steel milling equipment manufacturing). The models were then applied to real cases; the resulting cost estimates were significantly more accurate than the estimates actually made, with a mean absolute error rate of 17.3%.

This research fills an important need for quick and accurate cost estimation across various ETO industries. It differs from existing approaches in that it develops a methodology that can be used to quickly customize a cost estimation model for a chosen application domain. In addition to more accurate estimation, its major contributions are its transferability to other users and its applicability to different ETO operations.
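For readers unfamiliar with the accuracy metric quoted above, the following minimal sketch shows how a mean absolute error rate such as 17.3% is computed; the project costs below are invented numbers, not the dissertation's case data.

```python
# Mean absolute error rate: average of |estimate - actual| / actual, in percent.
def mean_absolute_error_rate(actual_costs, estimated_costs):
    errors = [abs(e - a) / a for a, e in zip(actual_costs, estimated_costs)]
    return 100.0 * sum(errors) / len(errors)

actual = [1_200_000, 860_000, 2_450_000]      # hypothetical actual project costs
estimated = [1_050_000, 910_000, 2_900_000]   # hypothetical model estimates
print(f"mean absolute error rate: {mean_absolute_error_rate(actual, estimated):.1f}%")
```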
Abstract:
The span of control is the most discussed single concept in classical and modern management theory. In specifying conditions for organizational effectiveness, the span of control has generally been regarded as a critical factor. Existing research has focused mainly on qualitative methods to analyze this concept, for example heuristic rules based on experience and/or intuition. This research takes a quantitative approach and formulates the problem as a binary integer model, which is used as a tool to study organizational design. The model considers a range of requirements affecting the management and supervision of a given set of jobs in a company. The decision variables cover the allocation of jobs to workers, considering the complexity and compatibility of each job with respect to each worker, and the management requirements for planning, execution, training, and control activities in a hierarchical organization. The objective of the model is to minimize operations cost, defined as the sum of the supervision costs at each level of the hierarchy and the costs of the workers assigned to jobs. The model is intended for application in make-to-order industries as a design tool. It could also be applied to make-to-stock companies as an evaluation tool, to assess the optimality of their current organizational structure. Extensive experiments were conducted to validate the model, to study its behavior, and to evaluate the impact of changing parameters on practical problems. This research proposes a meta-heuristic approach to solving large problem instances, based on the concept of greedy algorithms and the Meta-RaPS algorithm. The proposed heuristic was evaluated with two measures of performance: solution quality and computational speed. Quality is assessed by comparing the obtained objective function value to the one achieved by the optimal solution. Computational efficiency is assessed by comparing the computer time used by the proposed heuristic to the time taken by a commercial software system. Test results show that the proposed heuristic generates good solutions in a time-efficient manner.
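The sketch below shows the general flavor of a Meta-RaPS-style construction heuristic for the job-to-worker allocation described above: at each step it either takes the greedy (lowest-cost compatible) choice or picks randomly from candidates within a restriction percentage of the best. It covers only allocation with compatibility and worker cost; the hierarchy of supervision costs and the full binary integer model from the dissertation are omitted, and all data and parameter values are assumed.

```python
# Simplified Meta-RaPS-style construction heuristic for job-to-worker assignment.
import random

def metaraps_assign(jobs, workers, cost, compatible, priority_pct=0.7,
                    restriction_pct=0.15, iterations=50, seed=0):
    """cost[w][j]: cost of worker w doing job j; compatible[w][j]: feasibility."""
    rng = random.Random(seed)
    best_total, best_assignment = float("inf"), None
    for _ in range(iterations):
        assignment, total = {}, 0.0
        for j in jobs:
            candidates = sorted((cost[w][j], w) for w in workers if compatible[w][j])
            if not candidates:
                break                                     # infeasible job: abandon pass
            if rng.random() < priority_pct:               # greedy move
                c, w = candidates[0]
            else:                                         # restricted random move
                limit = candidates[0][0] * (1 + restriction_pct)
                c, w = rng.choice([cw for cw in candidates if cw[0] <= limit])
            assignment[j], total = w, total + c
        else:                                             # completed without break
            if total < best_total:
                best_total, best_assignment = total, assignment
    return best_total, best_assignment

# Hypothetical 3-job, 2-worker instance.
jobs, workers = ["j1", "j2", "j3"], ["w1", "w2"]
cost = {"w1": {"j1": 4, "j2": 6, "j3": 3}, "w2": {"j1": 5, "j2": 2, "j3": 7}}
compatible = {"w1": {"j1": True, "j2": True, "j3": True},
              "w2": {"j1": True, "j2": True, "j3": False}}
print(metaraps_assign(jobs, workers, cost, compatible))
```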
Abstract:
The primary purpose of this study was to investigate agreement among five equations by which clinicians estimate water requirements (EWR) and to determine how well these equations predict total water intake (TWI). The Institute of Medicine has used TWI as a measure of water requirements. A secondary goal was to develop practical equations to predict TWI; such equations could then be considered accurate predictors of an individual's water requirement.

Regressions were performed to determine agreement among the five equations and between each equation and TWI using NHANES 1999–2004. The criteria for agreement were (1) strong correlation coefficients between all comparisons and (2) a regression line that was not significantly different from the line of equality (x = y), i.e., the 95% CI of the slope and intercept had to include one and zero, respectively. Correlations were performed to determine the association between fat-free mass (FFM) and TWI. Clinically significant variables were selected to build equations for predicting TWI. All analyses were performed with SAS software and were weighted to account for the complex survey design and for oversampling.

Results showed that the five EWR equations were strongly correlated but did not agree with each other. Further, the EWR equations were all weakly associated with TWI and lacked agreement with TWI. The strongest agreement, between the NRC equation and TWI, explained only 8.1% of the variability of TWI. Fat-free mass was positively correlated with TWI. Two models were created to predict TWI. Both models included race/ethnicity, kcals, age, and height; one model also included FFM and gender, while the other included BMI and osmolality. Neither model accounted for more than 28% of the variability of TWI. These results provide evidence that estimates of water requirements vary depending upon which EWR equation is selected by the clinician. None of the existing EWR equations predicted TWI, nor could a prediction equation be created that explained a satisfactory amount of the variance in TWI. TWI may therefore not provide a good estimate of water requirements. Future research should focus on using more valid measures to predict water requirements.
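The agreement criterion described above (regression line not significantly different from the line of equality) can be checked as in the sketch below: fit an ordinary least-squares line and test whether the approximate 95% CIs of slope and intercept contain 1 and 0. The data are simulated, not NHANES, and the survey weighting used in the study is omitted.

```python
# Sketch of the line-of-equality agreement check between two measures.
import numpy as np

def agreement_with_identity_line(x, y, z_crit=1.96):
    """OLS of y on x; approximate 95% CIs for slope and intercept, plus verdict."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s2 = np.sum(resid**2) / (n - 2)                   # residual variance
    sxx = np.sum((x - x.mean())**2)
    se_slope = np.sqrt(s2 / sxx)
    se_inter = np.sqrt(s2 * (1 / n + x.mean()**2 / sxx))
    ci_slope = (slope - z_crit * se_slope, slope + z_crit * se_slope)
    ci_inter = (intercept - z_crit * se_inter, intercept + z_crit * se_inter)
    agrees = ci_slope[0] <= 1 <= ci_slope[1] and ci_inter[0] <= 0 <= ci_inter[1]
    return ci_slope, ci_inter, agrees

rng = np.random.default_rng(1)
ewr = rng.normal(2.7, 0.4, 500)                       # simulated EWR estimates (L/day)
twi = 0.3 * ewr + 1.9 + rng.normal(0, 0.6, 500)       # weakly related simulated TWI
print(agreement_with_identity_line(ewr, twi))         # expect agrees == False
```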
Abstract:
The increasing nationwide interest in intelligent transportation systems (ITS) and the need for more efficient transportation have led to the expanding use of variable message sign (VMS) technology. VMS panels are substantially heavier than flat-panel aluminum signs and have a larger depth (the dimension parallel to the direction of traffic). The additional weight and depth can have a significant effect on the aerodynamic forces and inertial loads transmitted to the support structure, and the wind-induced drag forces and the response of VMS structures are not well understood. Minimum design requirements for VMS structures are contained in the American Association of State Highway and Transportation Officials Standard Specifications for Structural Supports for Highway Signs, Luminaires, and Traffic Signals (AASHTO Specification). However, the Specification does not take into account the prismatic geometry of VMS or the complex interaction of the applied aerodynamic forces with the support structure. In view of the lack of code guidance and the limited amount of research performed so far, targeted experimentation and large-scale testing were conducted at the Florida International University (FIU) Wall of Wind (WOW) to provide reliable drag coefficients and to investigate the aerodynamic instability of VMS. A comprehensive range of VMS geometries was tested in turbulence representative of the high-frequency end of the spectrum in a simulated suburban atmospheric boundary layer. The mean normal, lateral, and vertical lift force coefficients, in addition to the twisting moment coefficient and eccentricity ratio, were determined from the measured data for each model. Wind tunnel testing confirmed that the drag coefficient of a prismatic VMS is smaller than the value of 1.7 suggested in the current AASHTO Specification (2013). An alternative to the AASHTO code value is presented in the form of a design matrix. Testing and analysis also indicated that vortex shedding oscillations and galloping instability could be significant for VMS with a large depth ratio attached to a structure with a low natural frequency. The effect of corner modification was investigated by testing models with chamfered and rounded corners. Results demonstrated an additional decrease in the drag coefficient, but a possible Reynolds number dependency for the rounded corner configuration.
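The drag coefficients discussed above are the standard non-dimensional reduction of a measured mean drag force, C_D = F / (0.5 ρ U² A). The sketch below applies that definition; the panel dimensions, wind speed, and force are illustrative values, not the WOW test data.

```python
# Reduce a measured mean drag force to a drag coefficient.
RHO_AIR = 1.225            # air density [kg/m^3]

def drag_coefficient(force_n, wind_speed_ms, frontal_area_m2, rho=RHO_AIR):
    return force_n / (0.5 * rho * wind_speed_ms**2 * frontal_area_m2)

# Hypothetical VMS panel: 3.6 m wide x 1.8 m tall, mean drag 9.6 kN at 45 m/s.
area = 3.6 * 1.8
cd = drag_coefficient(9_600.0, 45.0, area)
print(f"C_D = {cd:.2f}  (AASHTO's suggested value is 1.7)")
```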
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. The dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified against the partial order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method to mine Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoff between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: (1) a post-prediction analysis method to increase coverage while ensuring precision, and (2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
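As a highly simplified illustration of the kind of pattern a McPatom-style analysis targets (a pair of threads and a single shared variable), the sketch below scans an event trace for two same-thread accesses, intended to be atomic, that are interleaved by a conflicting remote access. The trace format, event names, and the notion of "atomic pairs" are assumptions made for this example, not the tool's actual input.

```python
# Toy detector for potential atomicity violations on a single shared variable.
from collections import namedtuple

Event = namedtuple("Event", "thread op var")   # op in {"R", "W"}

def potential_atomicity_violations(trace, atomic_pairs):
    """atomic_pairs: indices (i, j) of two same-thread accesses meant to be atomic."""
    violations = []
    for i, j in atomic_pairs:
        first, second = trace[i], trace[j]
        for k in range(i + 1, j):              # remote accesses interleaving the pair
            remote = trace[k]
            if (remote.thread != first.thread and remote.var == first.var
                    and "W" in (remote.op, first.op, second.op)):   # conflicting
                violations.append((i, k, j))
    return violations

# Classic check-then-act bug: T1 reads, T2 writes, T1 writes based on a stale read.
trace = [
    Event("T1", "R", "balance"),   # 0: T1 checks balance
    Event("T2", "W", "balance"),   # 1: T2 updates balance concurrently
    Event("T1", "W", "balance"),   # 2: T1 writes using its stale read
]
print(potential_atomicity_violations(trace, atomic_pairs=[(0, 2)]))  # [(0, 1, 2)]
```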