16 results for K110 Architectural Design Theory
in Digital Commons at Florida International University
Abstract:
Ensuring the correctness of software has been a major motivation in software research, constituting a Grand Challenge. Because of its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received much attention in recent years, and several methods, techniques, and tools have been developed. However, more remains to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures. The software architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for correctness with respect to temporal properties. For testing, an approach for SAM architectures was defined that includes the evaluation of test cases, based on Petri net testing theory, for use in the testing process at the design level.
Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (the SAM tool) was implemented to support the design and analysis of SAM models. The results show the applicability of the approach to the testing and verification of SAM models with the aid of the SAM tool.
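SAM behavioral models are Petri nets whose state space a model checker such as Spin explores exhaustively. The sketch below is a loose illustration of that idea, not the dissertation's translation scheme: it shows the transition firing rule of a plain place/transition net and a brute-force enumeration of reachable markings. All function names and the toy net are assumptions.

```python
def canon(marking):
    # Canonical, hashable form of a marking; zero-token places are dropped
    # so {"p": 0} and a missing "p" denote the same state.
    return tuple(sorted((p, n) for p, n in marking.items() if n))

def enabled(marking, pre):
    # A transition is enabled when every input place holds enough tokens.
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Firing consumes tokens from input places and adds tokens to outputs.
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m.get(p, 0) - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial, transitions):
    # Exhaustive search of the reachable markings -- the state space a
    # model checker would check temporal properties against.
    seen = {canon(initial)}
    frontier = [initial]
    while frontier:
        m = frontier.pop()
        for pre, post in transitions:
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                if canon(m2) not in seen:
                    seen.add(canon(m2))
                    frontier.append(m2)
    return seen

# Toy net: a single token alternates between places "idle" and "ready".
transitions = [({"idle": 1}, {"ready": 1}), ({"ready": 1}, {"idle": 1})]
states = reachable({"idle": 1}, transitions)
```

A real checker like Spin builds this state space on the fly and evaluates temporal-logic properties over it rather than merely collecting markings.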
Abstract:
This thesis explores how architecture can adapt local vernacular design principles to contemporary building design in a rural setting. Vernacular buildings in Guyana present a unique and coherent set of design principles developed in response to climatic and cultural conditions. The concept of "habitus," proposed by the philosopher Pierre Bourdieu to describe the evolving nature of social culture, was used to interpret Guyanese local buildings. These principles were then applied to the design of a Women's Center in the village of Port Mourant on the east coast of Guyana. The design specifically interprets the "bottom-house" of local Guyanese architecture, an inherently flexible transitional outdoor space beneath raised buildings. The design of the Women's Center demonstrates how contemporary architectural design can respond to climatic requirements, local preferences, and societal needs to support the local culture.
Abstract:
This study involves Little Havana, one of the eight neighborhoods in the City of Miami. Little Havana, once a flourishing Hispanic community from the 1960s through the 1980s, is now experiencing housing deterioration, economic disinvestment, and increased social needs. Although the City developed a Community Development Plan for the neighborhood addressing its problems, needs, and objectives, it failed to take advantage of the area's prominent commercial street, Calle Ocho, as a cultural catalyst for the revitalization of the neighborhood. With an urban study, an understanding of the area's needs for transit system improvements, a program analysis, and a valuable architectural inventory, an intervention project can be developed. The project will capitalize on the area's historical and cultural assets and serve as a step toward reversing the area's decline and revitalizing the street and community, recapturing the energy present during the early years of the massive Cuban migration.
Abstract:
Modern software systems are often large and complicated. To better understand, develop, and manage large software systems, researchers have spent the last decade studying software architectures, which provide the top-level structural design of software systems. One major research focus in software architectures is formal architecture description languages, but most existing work concentrates on descriptive capability and puts less emphasis on software architecture design methods and formal analysis techniques, which are necessary to develop correct software architecture designs. Refinement is a general approach of adding detail to a software design, and a formal refinement method can further ensure certain design properties. This dissertation proposes refinement methods, including a set of formal refinement patterns and complementary verification techniques, for software architecture design using the Software Architecture Model (SAM), which was developed at Florida International University. First, a general guideline for software architecture design in SAM is proposed. Second, specification construction through property-preserving refinement patterns is discussed. The refinement patterns are categorized into connector refinement, component refinement, and high-level Petri net refinement; these three levels of patterns apply to overall system interaction, architectural components, and the underlying formal language, respectively. Third, verification after modeling is discussed as a complementary technique to specification refinement. Two formal verification tools, the Stanford Temporal Prover (STeP) and the Simple Promela Interpreter (SPIN), are adopted into SAM to analyze the initial models. Fourth, the formalization and refinement of security issues are studied. A method for security enforcement in SAM is proposed, and the Role-Based Access Control model is formalized using predicate transition nets and Z notation.
Patterns for enforcing access control and auditing are proposed. Finally, modeling and refining a life insurance system demonstrates how to apply the refinement patterns for software architecture design using SAM and how to integrate the access control model. The results of this dissertation demonstrate that a refinement method is an effective way to develop a high-assurance system. The method extends existing work on modeling software architectures using SAM and makes SAM a more usable and valuable formal tool for software architecture design.
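The access-control model above admits a minimal sketch. The dissertation formalizes Role-Based Access Control in predicate transition nets and Z notation; the plain-Python check below captures only the core relation (users hold roles, roles carry permissions), and the life-insurance role and permission names are invented for illustration.

```python
# Hypothetical role and permission tables for a life insurance system;
# none of these names come from the dissertation itself.
user_roles = {"alice": {"underwriter"}, "bob": {"agent"}}
role_perms = {
    "underwriter": {"approve_policy", "view_policy"},
    "agent": {"view_policy"},
}

def allowed(user, permission):
    # RBAC core rule: access is granted iff some role assigned to the
    # user carries the requested permission.
    return any(permission in role_perms.get(role, set())
               for role in user_roles.get(user, set()))
```

Formalizing this relation in predicate transition nets additionally lets the refinement patterns prove that every firing sequence (every run of the system) preserves the access-control invariant, which a lookup table alone cannot.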
Abstract:
This study focuses on empirical investigations and seeks implications by utilizing three different methodologies to test various aspects of trader behavior. The first methodology utilizes Prospect Theory to determine trader behavior during periods of extreme wealth contraction. Second, a threshold model is formulated to examine the sentiment variable; third, the contagion effect and trader behavior are studied. The connection between consumers' sense of financial well-being, or sentiment, and stock market performance has been studied at length. However, without data on actual versus experimental performance, implications based on this relationship are meaningless. The empirical agenda included examining a proprietary file of daily trader activities over a five-year period. Overall, during periods of extreme wealth-altering conditions, traders "satisfice" rather than choose the "best" alternative, and a trader's degree of loss aversion depends on his or her prior investment performance. A model that explains the behavior of traders during periods of turmoil is developed; Prospect Theory and the data file influenced its design. Additional research included testing a model that permitted the data to signal the crisis through a threshold. The third empirical study investigated the existence of contagion caused by declining global wealth effects, using evidence from the mining industry in Canada. Contagion, where a financial crisis begins locally and subsequently spreads elsewhere, has been studied in terms of correlations among similar regions. The results provide support for Prospect Theory in two of the three empirical studies. The dissertation emphasizes the need to specify precise, testable models of investors' expectations by providing tools to identify paradoxical behavior patterns.
True advances in this field must include empirical research utilizing reliable data sources to mitigate data mining problems and allow researchers to distinguish between expectations-based and risk-based explanations of behavior. Through this type of research, it may be possible to systematically exploit "irrational" market behavior.
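The satisficing and loss-aversion findings rest on Prospect Theory's asymmetric value function, which is concave for gains, convex for losses, and steeper for losses than for gains. A minimal sketch, using the parameter estimates commonly cited from Tversky and Kahneman rather than anything fitted in this dissertation:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    # Value of a gain or loss x relative to a reference point.
    # alpha, beta < 1 give diminishing sensitivity in both directions;
    # lam > 1 encodes loss aversion: losses loom larger than equal gains.
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)
```

With these parameters a 100-unit loss is felt roughly 2.25 times as strongly as a 100-unit gain; a trader's prior investment performance, per the study, shifts exactly this asymmetry.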
Abstract:
With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. Attacks fall into four main categories: Denial of Service (DoS), Probe, User to Root (U2R), and Remote to Local (R2L). Within these categories, DoS and Probe attacks show up continuously and with great frequency over a short period when they strike a system; they differ from normal traffic and can easily be separated from normal activities. By contrast, U2R and R2L attacks are embedded in the data portions of packets and normally involve only a single connection, which makes it difficult to achieve satisfactory detection accuracy for them. We therefore focus on the ambiguity problem between normal activities and U2R/R2L attacks, with the goal of building a detection system that can accurately and quickly detect these two attack types. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to improve detection speed: features with poor ability to predict attack signatures, and features inter-correlated with one or more other features, are considered redundant and removed, so that only indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method combines multiple feature-selecting intrusion detectors with a data mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to resolve the ambiguity problem.
The latter applies data mining techniques to automatically extract computer users' normal behavior from training network traffic data. The final decision combines the outputs of the feature-selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
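The first-phase idea can be sketched simply: discard features that correlate weakly with the class label (poor predictors of attack signatures) and features that correlate strongly with a feature already kept (inter-correlated, hence redundant). The thresholds and toy data below are illustrative assumptions, not the dissertation's algorithm in full.

```python
import math

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_features(columns, label, min_label_corr=0.3, max_inter_corr=0.9):
    # Keep a feature only if it predicts the label and is not strongly
    # correlated with a feature that was already kept.
    kept = []
    for name, col in columns.items():
        if abs(pearson(col, label)) < min_label_corr:
            continue  # poor predictor of the attack signature
        if any(abs(pearson(col, columns[k])) > max_inter_corr for k in kept):
            continue  # redundant with an already-kept feature
        kept.append(name)
    return kept

# Toy traffic features: f2 duplicates f1, and f3 is noise w.r.t. the label.
label = [0, 0, 1, 1]
columns = {"f1": [0, 0, 1, 1], "f2": [0, 0, 1, 1], "f3": [1, 0, 1, 0]}
```

Shrinking the feature space this way is what buys the detection-speed improvement: each downstream detector sees fewer, more informative dimensions.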
Abstract:
The span of control is the single most discussed concept in classical and modern management theory. In specifying conditions for organizational effectiveness, the span of control has generally been regarded as a critical factor. Existing research has focused mainly on qualitative methods to analyze this concept, for example heuristic rules based on experience and/or intuition. This research takes a quantitative approach and formulates the problem as a binary integer model, which is used as a tool to study the organizational design issue. The model considers a range of requirements affecting the management and supervision of a given set of jobs in a company. Its decision variables include the allocation of jobs to workers, considering the complexity and compatibility of each job with respect to each worker, and the management requirements for planning, execution, training, and control activities in a hierarchical organization. The objective of the model is to minimize operations cost, defined as the sum of the supervision costs at each level of the hierarchy and the costs of the workers assigned to jobs. The model is intended for application in make-to-order industries as a design tool. It could also be applied to make-to-stock companies as an evaluation tool, to assess the optimality of their current organizational structure. Extensive experiments were conducted to validate the model, study its behavior, and evaluate the impact of changing parameters on practical problems. This research proposes a meta-heuristic approach to solving large problems, based on the concept of greedy algorithms and the Meta-RaPS algorithm. The proposed heuristic was evaluated with two measures of performance: solution quality and computational speed. Quality is assessed by comparing the obtained objective function value to that of the optimal solution.
Computational efficiency is assessed by comparing the computer time used by the proposed heuristic to the time taken by a commercial software system. Test results show that the proposed heuristic generates good solutions in a time-efficient manner.
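As a rough illustration of the greedy core that a Meta-RaPS style heuristic randomizes, the sketch below assigns each job to its cheapest compatible worker. The cost table and compatibility rule are invented, and the real model additionally prices supervision at each level of the hierarchy.

```python
def greedy_assign(jobs, workers, cost, compatible):
    # Greedy rule: take jobs in order and pick the cheapest compatible worker.
    # Meta-RaPS would accept the greedy choice only with some probability,
    # restart many times, and keep the best solution found.
    assignment, total = {}, 0
    for job in jobs:
        options = [w for w in workers if compatible(job, w)]
        best = min(options, key=lambda w: cost[job][w])
        assignment[job] = best
        total += cost[job][best]
    return assignment, total

# Illustrative data: job "j2" is too complex for worker "w2", so the cheap
# pairing (j2, w2) is infeasible and the greedy rule must fall back to w1.
cost = {"j1": {"w1": 3, "w2": 5}, "j2": {"w1": 4, "w2": 2}}
compatible = lambda job, w: not (job == "j2" and w == "w2")
assignment, total = greedy_assign(["j1", "j2"], ["w1", "w2"], cost, compatible)
```

A binary integer solver would instead minimize the total over all feasible assignments jointly; the greedy pass only approximates that optimum, which is why solution quality is reported against the exact objective value.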
Abstract:
This research involves the design, development, and theoretical demonstration of models resulting in integrated misbehavior resolution protocols for ad hoc networked devices. Game theory was used to analyze strategic interaction among independent devices with conflicting interests, and packet forwarding at the routing layer of autonomous ad hoc networks was investigated. Unlike existing reputation-based or payment schemes, this model is based on repeated interactions. To enforce cooperation, a community enforcement mechanism was used, whereby selfish nodes that drop packets are punished not only by the victim but by all nodes in the network. A stochastic packet forwarding game strategy was then introduced; our solution relaxes the uniform traffic demand assumed pervasively in other works. To address the concerns of imperfect private monitoring in resource-aware ad hoc networks, a belief-free equilibrium scheme was developed that reduces the impact of noise on cooperation. This scheme also eliminates the need to infer the private history of other nodes and simplifies the computation of an optimal strategy; the belief-free approach reduces node overhead and is easily tractable, making the system operation feasible. Motivated by the versatile nature of evolutionary game theory, the assumption of a rational node is relaxed, leading to a framework for mitigating routing selfishness and misbehavior in multi-hop networks. This is accomplished by setting nodes to play a fixed strategy rather than independently choosing a rational one. A range of simulations showed improved cooperation between selfish nodes compared with earlier results. Cooperation among ad hoc nodes can also protect a network from malicious attacks: in the absence of a central trusted entity, many security mechanisms and privacy protections require such cooperation.
Therefore, using game theory and evolutionary game theory, a mathematical framework has been developed that explores trust mechanisms to achieve security in the network. This framework is one of the first steps toward the synthesis of an integrated solution, and it demonstrates that security depends solely on the initial trust level that nodes have for each other.
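The community enforcement idea admits a toy illustration: a defector gains once from saving forwarding energy, but thereafter every node (not only the victim) refuses to serve it, so with a long enough horizon cooperation pays more. The payoff values and the grim-style permanent punishment below are assumptions for the sketch, not the dissertation's exact game.

```python
def total_payoff(rounds, defect_round=None):
    # Per-round payoffs (illustrative): 1 for mutual forwarding, 2 for the
    # one-shot gain of dropping a packet, 0 once the community punishes.
    punished = False
    payoff = 0
    for t in range(rounds):
        if punished:
            payoff += 0          # no node forwards the defector's traffic
        elif defect_round is not None and t == defect_round:
            payoff += 2          # energy saved by dropping the packet
            punished = True      # the whole community observes and punishes
        else:
            payoff += 1          # cooperation payoff
    return payoff

cooperate = total_payoff(10)               # always forward
defect = total_payoff(10, defect_round=2)  # drop a packet in round 2
```

The belief-free refinement in the abstract matters precisely because real nodes observe drops imperfectly; a literal grim trigger like this one would punish forever on a single noisy observation.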
Abstract:
People's authentic sense of place is being overshadowed by less authentic experiences, a condition referred to as placelessness. Consequently, a demand for experiential interior environments has surfaced. Experiential environmental and place attachment theories suggest that the relationships between self, others, and the environment are what encourage users to create meaningful, authentic experiences. This qualitative study explored the roles of experiential interior architectural features in affording users of hospitality environments higher-level needs, such as meanings of place. For the case study, ten participants stayed at a hotel for two nights. Participants were given a guided list of ten facets of an experience, implicitly structured by both experiential environmental and place attachment theories. The participants used photographs to document each of the facets on the guided list; the photos were then used during photo elicitation interviews, which evoked additional qualitative information. Participants identified specific interior architectural features and described them using the themes associated with place attachment theories. The findings revealed that interior architectural features might enrich the meanings a person associates with a given place, possibly affording users higher-level needs. As a result, if an experiential interior environment allows users to foster relationships between self, others, and the physical environment, they may have more authentic experiences and attach more meaning to a place.
Abstract:
Widespread damage to roofing materials (such as tiles and shingles) for low-rise buildings, even for weaker hurricanes, has raised concerns regarding design load provisions and construction practices. Currently the building codes used for designing low-rise building roofs are mainly based on testing results from building models which generally do not simulate the architectural features of roofing materials that may significantly influence the wind-induced pressures. Full-scale experimentation was conducted under high winds to investigate the effects of architectural details of high profile roof tiles and asphalt shingles on net pressures that are often responsible for damage to these roofing materials. Effects on the vulnerability of roofing materials were also studied. Different roof models with bare, tiled, and shingled roof decks were tested. Pressures acting on both top and bottom surfaces of the roofing materials were measured to understand their effects on the net uplift loading. The area-averaged peak pressure coefficients obtained from bare, tiled, and shingled roof decks were compared. In addition, a set of wind tunnel tests on a tiled roof deck model were conducted to verify the effects of tiles' cavity internal pressure. Both the full-scale and the wind tunnel test results showed that underside pressure of a roof tile could either aggravate or alleviate wind uplift on the tile based on its orientation on the roof with respect to the wind angle of attack. For shingles, the underside pressure could aggravate wind uplift if the shingle is located near the center of the roof deck. Bare deck modeling to estimate design wind uplift on shingled decks may be acceptable for most locations but not for field locations; it could underestimate the uplift on shingles by 30-60%. 
In addition, some initial quantification of the effects of roofing materials on wind uplift was performed by studying the wind uplift load ratio for tiled versus bare deck and shingled versus bare deck. Vulnerability curves, with and without considering the effects of tiles' cavity internal pressure, showed significant differences. Aerodynamic load provisions for low-rise buildings' roofs and their vulnerability can thus be more accurately evaluated by considering the effects of the roofing materials.
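The central measurement idea reduces to simple arithmetic: the net uplift on a tile or shingle is the difference between its top-surface pressure and its underside (cavity) pressure, so the underside can either aggravate or relieve uplift depending on its sign. The coefficient values below are invented for illustration, not measured results from the study.

```python
def net_pressure_coefficient(cp_top, cp_underside):
    # Net load on the roofing element; more negative means stronger net
    # suction (uplift) trying to lift the tile or shingle off the deck.
    return cp_top - cp_underside

# Same top-surface suction, opposite underside conditions:
aggravated = net_pressure_coefficient(-1.5, 0.4)   # positive underside pressure adds uplift
relieved = net_pressure_coefficient(-1.5, -0.8)    # underside suction cancels part of it
```

This is why bare-deck measurements, which capture only the top-surface term, can underestimate uplift on shingles in roof-field locations where the underside term aggravates the load.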
Abstract:
The purpose of this study was to explore the relationship among faculty perceptions, selected demographics, implementation of elements of transactional distance theory, and online web-based course completion rates. This theory posits that the high transactional distance of online courses makes it difficult for students to complete them successfully; too often this is associated with low completion rates. Faculty members play an indispensable role in course design, whether online or face-to-face. They also influence the course delivery format from design through implementation and, ultimately, how students experience the course. This study used transactional distance theory as the conceptual framework to examine the relationship between the teaching and learning strategies faculty members use and students' completion of online courses. Faculty members' sex, number of years teaching online at the college, and online course completion rates were considered. A researcher-developed survey was used to collect data from 348 faculty members who teach online at two prominent colleges in the southeastern United States. An exploratory factor analysis yielded six factors related to transactional distance theory, which together accounted for slightly over 65% of the variance in transactional distance scores as measured by the survey instrument. The results support Moore's (1993) theory of transactional distance. Female faculty members scored higher than men on all factors of transactional distance theory, and faculty members' number of years teaching online at the college level correlated significantly with all elements of the theory. Regression analysis determined that two of the factors, instructor interface and instructor-learner interaction, accounted for 12% of the variance in student online course completion rates.
In conclusion, of the six factors found, the two with the highest percentage scores were instructor interface and instructor-learner interaction. This finding, while in alignment with the literature concerning the dialogue element of transactional distance theory, draws special attention to the importance of instructor interface as a factor. Surprisingly, given the reviewed literature on transactional distance theory, faculty perceptions concerning learner-learner interaction were not an important factor, and no learner-content interaction factor emerged.
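The "accounted for 12% of the variance" phrasing is the regression R-squared statistic. A minimal sketch of that computation, with made-up numbers rather than the study's data:

```python
def r_squared(actual, predicted):
    # R^2 = 1 - (residual sum of squares / total sum of squares):
    # the share of variance in the outcome explained by the model.
    mean = sum(actual) / len(actual)
    ss_tot = sum((y - mean) ** 2 for y in actual)
    ss_res = sum((y - p) ** 2 for y, p in zip(actual, predicted))
    return 1 - ss_res / ss_tot
```

A perfect fit gives 1.0 and a model no better than predicting the mean gives 0.0, which puts the study's 12% for two factors in context.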
Abstract:
FIU's campus master plan should portray an overall concept of the University's vision. Its design should represent a distinctive sense of institutional purpose, and its architecture should support the campus design in realizing an ideal academic environment. The present master plan of Florida International University (FIU) offers neither a clear typology of architectural elements nor adequate relationships and connections between buildings. FIU needs to enhance its master plan with an architectural and urban vocabulary that creates a better environment. This thesis will examine FIU's present master plan, explaining the history of its development. It will then critically examine the quality of the campus, highlighting the successes and failures of its various parts; the unrealized potential of the campus's original vision will be juxtaposed with the built reality. In addition, FIU's planning strategies will be compared with the master plans of several American universities. Finally, this thesis will propose a set of criteria for the inclusion of a new building in the campus master plan. The Center of International Studies will be the catalyst that brings the university's vision into focus. To demonstrate the validity of these criteria, a new location for the Center of International Studies will be selected and a schematic architectural proposal will be made.