887 results for Set design


Relevance:

30.00%

Publisher:

Abstract:

Small errors proved catastrophic. Our purpose is to remark that a very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say that the effect is due to chance. Small differences in the initial conditions produce very great ones in the final phenomena; a small error in the former will produce an enormous error in the latter. When dealing with any kind of electrical device specification, it is important to note that there exists a pair of test conditions that define a test: the forcing function and the limit. Forcing functions define the external operating constraints placed upon the device tested; the actual test defines how well the device responds to these constraints. Forcing inputs to threshold, for example, represents the most difficult testing because it puts those inputs as close as possible to the actual switching critical points and guarantees that the device will meet the input-output specifications.

Prediction becomes impossible by classical analytical analysis bounded by Newton and Euclid. We have found that nonlinear dynamic behavior is the natural state of all circuits and devices, and that opportunities exist for effective error detection in a nonlinear dynamics and chaos environment.

Nowadays a set of linear limits is established around every aspect of digital and analog circuits, outside of which devices are considered bad after failing the test. Deterministic chaos in circuits is a fact, not a possibility, as verified by our Ph.D. research. In practice, for linear standard informational methodologies, this chaotic data product is usually undesirable, and we are educated to be interested in obtaining a more regular stream of output data.

This Ph.D. research explored the possibility of taking the foundation of a well-known simulation and modeling methodology and introducing nonlinear dynamics and chaos precepts, to produce a new error-detector instrument able to put together streams of data scattered in space and time, thereby mastering deterministic chaos and changing the bad reputation of chaotic data as a potential risk for practical system status determination.
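
The sensitivity to initial conditions described above can be demonstrated with a standard chaotic system, the logistic map, used here purely as an illustration (the map and values are not from the dissertation):

```python
# Sensitive dependence on initial conditions: two trajectories of the
# chaotic logistic map x -> r*x*(1-x), started a tiny distance apart,
# diverge until the "small error in the former" dominates the latter.
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-9)  # perturb the initial condition
max_gap = max(abs(p - q) for p, q in zip(a, b))
print(f"initial gap 1e-9, max gap over 50 steps: {max_gap:.3f}")
```

An initial error of one part in a billion grows to order one within a few dozen iterations, which is the effect the error-detection instrument must contend with.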

Relevance:

30.00%

Publisher:

Abstract:

The Semantic Binary Data Model (SBM) is a viable alternative to the now-dominant relational data model, and would be especially advantageous for applications dealing with complex interrelated networks of objects, provided that a robust, efficient implementation can be achieved. This dissertation presents an implementation design method for SBM, algorithms, and their analytical and empirical evaluation. Our method allows building a robust and flexible database engine with a wider applicability range and improved performance.

Extensions to SBM are introduced, and an implementation of these extensions is proposed that allows the database engine to efficiently support applications with a predefined set of queries. A new Record data structure is proposed, and the trade-offs of employing Fact, Record, and Bitmap data structures for storing information in a semantic database are analyzed.

A clustering ID-distribution algorithm and an efficient algorithm for object ID encoding are proposed. Mapping to an XML data model is analyzed, and a new XML-based XSDL language facilitating interoperability of the system is defined. Solutions to the issues involved in making the database engine multi-platform are presented, along with an improvement to the atomic update algorithm suitable for certain database recovery scenarios.

Finally, specific guidelines are devised for implementing a robust and well-performing database engine based on the extended Semantic Data Model.
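
The flavor of a semantic binary database can be conveyed by a minimal sketch (a toy illustration, not the dissertation's engine): every piece of information is a binary fact, indexed in both directions so that networks of objects can be navigated efficiently.

```python
from collections import defaultdict

class FactStore:
    """Toy semantic binary store: all data are (subject, relation, object) facts."""
    def __init__(self):
        self.forward = defaultdict(set)   # (subject, relation) -> objects
        self.backward = defaultdict(set)  # (relation, object) -> subjects

    def add(self, subj, rel, obj):
        self.forward[(subj, rel)].add(obj)
        self.backward[(rel, obj)].add(subj)

    def objects(self, subj, rel):
        return self.forward[(subj, rel)]

    def subjects(self, rel, obj):
        return self.backward[(rel, obj)]

store = FactStore()
store.add("student1", "enrolled_in", "course42")
store.add("student2", "enrolled_in", "course42")
# Navigate the relation in reverse: who is enrolled in course42?
print(sorted(store.subjects("enrolled_in", "course42")))
```

The dual indexes trade storage for traversal speed in both directions, which is the kind of trade-off the dissertation analyzes for its Fact, Record and Bitmap structures.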

Relevance:

30.00%

Publisher:

Abstract:

Modern software systems are often large and complicated. To better understand, develop, and manage large software systems, researchers have spent the last decade studying software architectures, which provide the top-level structural design of a software system. One major line of research on software architectures is formal architecture description languages, but most existing work focuses primarily on descriptive capability and puts less emphasis on design methods and formal analysis techniques, which are necessary to develop a correct software architecture design.

Refinement is a general approach of adding detail to a software design, and a formal refinement method can further ensure certain design properties. This dissertation proposes refinement methods, including a set of formal refinement patterns and complementary verification techniques, for software architecture design using the Software Architecture Model (SAM), which was developed at Florida International University. First, a general guideline for software architecture design in SAM is proposed. Second, specification construction through property-preserving refinement patterns is discussed. The refinement patterns are categorized into connector refinement, component refinement and high-level Petri net refinement; these three levels of patterns apply to overall system interaction, architectural components, and the underlying formal language, respectively. Third, verification after modeling is discussed as a complementary technique to specification refinement. Two formal verification tools, the Stanford Temporal Prover (STeP) and the Simple Promela Interpreter (SPIN), are adopted into SAM to develop the initial models. Fourth, formalization and refinement of security issues are studied: a method for security enforcement in SAM is proposed, the Role-Based Access Control model is formalized using predicate transition nets and Z notation, and patterns for enforcing access control and auditing are proposed. Finally, modeling and refining a life insurance system demonstrates how to apply the refinement patterns in SAM and how to integrate the access control model.

The results of this dissertation demonstrate that a refinement method is an effective way to develop a high-assurance system. The method extends existing work on modeling software architectures with SAM and makes SAM a more usable and valuable formal tool for software architecture design.
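
The access-control idea being formalized can be illustrated with a minimal sketch (roles, users and permissions here are hypothetical examples, not taken from the dissertation): in Role-Based Access Control, a user acquires permissions only through role assignments.

```python
# Toy RBAC check: permissions attach to roles, users attach to roles.
role_permissions = {
    "agent":   {"create_policy", "view_policy"},
    "auditor": {"view_policy", "view_audit_log"},
}
user_roles = {"alice": {"agent"}, "bob": {"auditor"}}

def check_access(user, permission):
    """A user holds a permission iff some assigned role grants it."""
    return any(permission in role_permissions.get(role, set())
               for role in user_roles.get(user, set()))

print(check_access("alice", "create_policy"))  # agent may create policies
print(check_access("bob", "create_policy"))    # auditor may not
```

The formal treatment in the dissertation expresses the same relation with predicate transition nets and Z notation, so that properties of the access-control policy can be verified rather than only tested.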

Relevance:

30.00%

Publisher:

Abstract:

With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. Attacks fall into four main categories: Denial of Service (DoS), Probe, User to Root (U2R), and Remote to Local (R2L). Of these, DoS and Probe attacks show up with great frequency over a short period of time when they strike a system; they differ markedly from normal traffic and can easily be separated from normal activities. In contrast, U2R and R2L attacks are embedded in the data portions of packets and normally involve only a single connection, which makes it difficult to achieve satisfactory detection accuracy for these two categories. We therefore focus on the ambiguity between normal activities and U2R/R2L attacks, with the goal of building a detection system that can accurately and quickly detect these two attack types.

In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to increase the speed of detection: features with poor ability to predict attack signatures, and features inter-correlated with one or more other features, are considered redundant and removed, so that only the indispensable information from the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method combines multiple feature-selecting intrusion detectors with a data mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to resolve the ambiguity problem; the latter applies a data mining technique to automatically extract computer users' normal behavior from training network traffic data. The final decision combines the outputs of the feature-selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
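
The first-phase idea, dropping features that are strongly inter-correlated with features already kept, can be sketched as follows (the threshold and data are illustrative; the dissertation's algorithm also scores each feature's predictive ability):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def select_features(columns, threshold=0.9):
    """Keep a feature only if it is not highly correlated with a kept one."""
    kept = []
    for name, col in columns:
        if all(abs(pearson(col, kept_col)) < threshold for _, kept_col in kept):
            kept.append((name, col))
    return [name for name, _ in kept]

cols = [("duration", [1, 2, 3, 4]),
        ("bytes",    [2, 4, 6, 8]),   # perfectly correlated with duration
        ("flags",    [1, 0, 1, 0])]
print(select_features(cols))
```

Here "bytes" is redundant given "duration", so it is discarded, shrinking the feature space the second-phase detectors must process.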

Relevance:

30.00%

Publisher:

Abstract:

The span of control is the single most discussed concept in classical and modern management theory, and in specifying conditions for organizational effectiveness it has generally been regarded as a critical factor. Existing research has focused mainly on qualitative methods for analyzing this concept, for example heuristic rules based on experience and/or intuition. This research takes a quantitative approach and formulates the problem as a binary integer model, which is used as a tool to study organizational design. The model considers a range of requirements affecting the management and supervision of a given set of jobs in a company. Its decision variables include the allocation of jobs to workers, taking into account the complexity and compatibility of each job with respect to each worker, and the management requirements for planning, execution, training, and control activities in a hierarchical organization. The objective is minimal operations cost: the sum of the supervision costs at each level of the hierarchy and the costs of the workers assigned to jobs. The model is intended as a design tool for make-to-order industries; it could also be applied in make-to-stock companies as an evaluation tool, to assess the optimality of their current organizational structure. Extensive experiments were conducted to validate the model, to study its behavior, and to evaluate the impact of changing parameters on practical problems. For large problems, this research proposes a meta-heuristic approach based on greedy algorithms and the Meta-RaPS algorithm. The proposed heuristic was evaluated on two measures of performance: solution quality, assessed by comparing the objective function value obtained to that of the optimal solution, and computational efficiency, assessed by comparing the computer time used by the heuristic to the time taken by a commercial software system. Test results show that the proposed heuristic generates good solutions in a time-efficient manner.
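
The greedy priority rule at the heart of such a heuristic can be sketched as follows (workers, costs and jobs are invented for illustration; Meta-RaPS additionally randomizes the priority rule to escape local optima):

```python
# Greedy step: assign each job to the cheapest worker whose skill level
# covers the job's complexity. (name, skill_level, cost) per worker.
workers = [("junior", 1, 100), ("senior", 3, 180)]

def greedy_assign(jobs):
    """jobs: list of (job_name, complexity). Returns (assignment, total cost)."""
    total, plan = 0, {}
    for job, complexity in jobs:
        feasible = [(cost, name) for name, skill, cost in workers
                    if skill >= complexity]
        cost, name = min(feasible)   # cheapest compatible worker
        plan[job] = name
        total += cost
    return plan, total

plan, total = greedy_assign([("wiring", 1), ("inspection", 3)])
print(plan, total)
```

A full model would also price the supervision hierarchy above these workers; the sketch only shows the job-to-worker layer of the cost function.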

Relevance:

30.00%

Publisher:

Abstract:

Background: Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on the brief descriptions provided by data publishers is unwieldy for large datasets whose insights depend on specific scientific questions, while using complex software systems for a preliminary analysis may itself be deemed too time-consuming, especially for unfamiliar data types and formats. This can lead to wasted analysis time and to potentially useful data being discarded.

Results: We present an exploration of the design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations with both low overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computers, are attracted by the familiarity of the Google Maps interface. Our five implementations introduce design elements that can benefit visualization developers.

Conclusions: We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines for those wanting to create such visualizations, and five concrete example visualizations.
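
The multi-scale tiling scheme that makes this approach work can be illustrated with a simplified sketch (the real Google Maps API also handles projection, caching and tile serving):

```python
# At zoom level z, a pre-rendered image is cut into 2^z x 2^z tiles;
# zooming in swaps each tile for four higher-resolution ones, so only
# the tiles in view are ever fetched.
def point_to_tile(x_norm, y_norm, zoom):
    """Tile indices for a point given in normalized [0, 1) image space."""
    n = 2 ** zoom
    return int(x_norm * n), int(y_norm * n)

# The same data point lands in progressively finer tiles as the user zooms.
print([point_to_tile(0.6, 0.25, z) for z in range(3)])
```

Pre-rendering a heatmap or network once into such a tile pyramid is what keeps the browser-side cost low regardless of dataset size.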

Relevance:

30.00%

Publisher:

Abstract:

FIU's campus master plan should portray an overall concept of the University's vision; its design should represent a distinctive sense of institutional purpose, and its architecture should support the campus design in the realization of an ideal academic environment. The present master plan of Florida International University (FIU) offers neither a clear typology of architectural elements nor adequate relationships and connections between buildings. FIU needs to enhance its master plan with an architectural and urban vocabulary that creates a better environment. This thesis will examine FIU's present master plan, explaining the history of its development. Further, it will critically examine the quality of the campus, highlighting the successes and failures of its various parts. The unrealized potential of the campus' original vision will be juxtaposed with the built reality. In addition, FIU's planning strategies will be compared with the master plans of several American universities. Finally, this thesis will propose a set of criteria for the inclusion of a new building in the campus master plan: the Center of International Study, a catalyst that would bring the university's vision into focus. As a means of proving the validity of these criteria, a new location for the Center of International Study will be selected and a schematic architectural proposal made.


Relevance:

30.00%

Publisher:

Abstract:

This thesis explores how architecture can adapt local vernacular design principles to contemporary building design in a rural setting. Vernacular buildings in Guyana present a unique and coherent set of design principles developed in response to climatic and cultural conditions. The concept of "habitus" proposed by philosopher Pierre Bourdieu, describing the evolving nature of social culture, was used to interpret Guyanese local buildings. These principles were then applied to the design of a Women's Center in the village of Port Mourant on the east coast of Guyana. The design specifically interpreted the "bottom-house" of local Guyanese architecture, an inherently flexible transitional outdoor space beneath raised buildings. The design of the Women's Center demonstrates how contemporary architectural design can respond to climatic requirements, local preferences and societal needs to support the local culture.

Relevance:

30.00%

Publisher:

Abstract:

Cloud computing enables independent end users and applications to share data and pooled resources, possibly located in geographically distributed data centers, in a fully transparent way. This need is particularly felt by scientific applications that must exploit distributed resources in an efficient and scalable way to process large amounts of data. This paper proposes an open solution for deploying a Platform as a Service (PaaS) over a set of multi-site data centers, applying open-source virtualization tools to facilitate operation among virtual machines while optimizing the usage of distributed resources. An experimental testbed is set up in an OpenStack environment, and evaluations with different types of TCP sample connections demonstrate the functionality of the proposed solution and provide throughput measurements in relation to the relevant design parameters.
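
A minimal localhost stand-in for such a TCP throughput measurement might look as follows (a sketch; the paper's measurements run between virtual machines in the multi-site OpenStack testbed, not on one host):

```python
import socket
import threading
import time

PAYLOAD = b"x" * 1_000_000  # 1 MB sample transfer

def run_server(srv):
    """Accept one connection and drain the payload."""
    conn, _ = srv.accept()
    with conn:
        received = 0
        while received < len(PAYLOAD):
            chunk = conn.recv(65536)
            if not chunk:
                break
            received += len(chunk)

srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # ephemeral port
srv.listen(1)
t = threading.Thread(target=run_server, args=(srv,))
t.start()

start = time.time()
with socket.create_connection(srv.getsockname()) as cli:
    cli.sendall(PAYLOAD)     # one TCP sample connection
t.join()
elapsed = time.time() - start
srv.close()

mbps = len(PAYLOAD) * 8 / (elapsed * 1e6)
print(f"throughput: {mbps:.1f} Mbit/s")
```

Repeating such transfers while varying the design parameters (VM placement, number of parallel connections) yields the kind of throughput curves the paper reports.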

Relevance:

30.00%

Publisher:

Abstract:

Magnetic field inhomogeneity results in image artifacts, including signal loss, blurring and distortions, leading to decreased diagnostic accuracy. The conventional multi-coil (MC) shimming method employs both RF coils and shimming coils, whose mutual interference induces a trade-off between RF signal-to-noise ratio (SNR) and shimming performance. To address this issue, RF coils were integrated with direct-current (DC) shim coils so that field inhomogeneity can be shimmed while RF signal is concurrently transmitted and received without being blocked by the shim coils. The currents applied to the new coils, termed iPRES (integrated parallel reception, excitation and shimming), were optimized in numerical simulation to improve the shimming performance. The objective of this work is to offer a guideline for designing optimal iPRES coil arrays to shim the abdomen.

In this thesis work, the main field (B0) inhomogeneity was evaluated by the root mean square error (RMSE). To investigate the shimming abilities of iPRES coil arrays, a set of human abdomen MRI data was collected for the numerical simulations. Different simplified iPRES(N) coil arrays were then numerically modeled, including a 1-channel iPRES coil and 8-channel iPRES coil arrays. For the 8-channel arrays, each RF coil was split into smaller DC loops in the x, y and z directions to provide extra shimming freedom, and the number of DC loops per RF coil was increased from 1 to 5 to find the optimal division in the z direction. Furthermore, switches were numerically implemented in the iPRES coils to reduce the number of power supplies while still providing shimming performance similar to that of the equivalent iPRES coil arrays.

The optimizations demonstrate that the shimming ability of an iPRES coil array increases with the number of DC loops per RF coil, and that divisions in the z direction tend to be more effective in reducing field inhomogeneity than divisions in x and y. Moreover, the shimming performance of an iPRES coil array gradually reaches a saturation level once the number of DC loops per RF coil is large enough. Finally, when switches were numerically implemented in the iPRES(4) coil array, the number of power supplies could be reduced from 32 to 8 while keeping the shimming performance similar to iPRES(3) and better than iPRES(1). This thesis work offers guidance for the design of iPRES coil arrays.
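
The evaluation metric and the current optimization can be sketched in a few lines (the field values are illustrative, not from the simulated abdomen data, and a single DC loop is assumed): the shim current is chosen by least squares to minimize the residual field, and RMSE quantifies the inhomogeneity before and after.

```python
import math

def rmse(field):
    """Root mean square error of an off-resonance field map (Hz)."""
    return math.sqrt(sum(v * v for v in field) / len(field))

field = [10.0, -4.0, 6.0, -8.0]   # measured off-resonance at sample voxels
shim  = [ 5.0, -2.0, 3.0, -4.0]   # field of one DC loop at unit current

# Optimal current minimizes ||field - i*shim||^2:
# i = <field, shim> / <shim, shim>
i_opt = sum(f * s for f, s in zip(field, shim)) / sum(s * s for s in shim)
residual = [f - i_opt * s for f, s in zip(field, shim)]
print(rmse(field), i_opt, rmse(residual))
```

With more DC loops per RF coil the same least-squares problem gains columns, which is why shimming ability rises with loop count until the residual can no longer be reduced, the saturation seen in the simulations.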

Relevance:

30.00%

Publisher:

Abstract:

The observation chart is, for many health professionals (HPs), the primary source of objective information relating to the health of a patient. Information Systems (IS) research has demonstrated the positive impact of good interface design on decision making, and it follows that good observation chart design can positively impact healthcare decision making. Despite this potential, there is a paucity of observation chart design literature, and the primary source of such literature leverages Human Computer Interaction (HCI) research to design better charts. While this approach has been successful, it introduces a gap between understanding of the tasks performed by HPs when using charts and the design features implemented in the chart. Good IS allow for the collection and manipulation of data so that it can be presented in a timely manner that supports specific tasks; good interface design should therefore consider the specific tasks being performed before the interface is designed. This research adopts a Design Science Research (DSR) approach to formalise a framework of design principles that incorporates knowledge of the tasks performed by HPs when using observation charts together with knowledge of visual representations of data and the semiology of graphics. The research is presented in three phases. The initial two phases seek to discover and formalise the design knowledge embedded in two situated observation charts: the paper-based NEWS chart developed by the Health Service Executive in Ireland, and the electronically generated eNEWS chart developed by the Health Information Systems Research Centre in University College Cork. A comparative evaluation of each chart is also presented in the respective phase.

Throughout these phases, tentative versions of a design framework for electronic vital sign observation charts are presented, with each subsequent iteration (versions Alpha, Beta, V0.1 and V1.0) representing a refinement of the design knowledge. The framework is named the framework for the Retrospective Evaluation of Vital Sign Information from Early Warning Systems (REVIEWS). Phase 3 presents the deductive process for designing and implementing V0.1 of the framework, with evaluation of the instantiation allowing for the final iteration, V1.0. This study makes a number of contributions to academic research. First, it demonstrates that the cognitive tasks performed by nurses during clinical reasoning can be supported through good observation chart design. Second, it establishes the utility of electronic vital sign observation charts in supporting those cognitive tasks. Third, the REVIEWS framework represents a comprehensive set of design principles which, if applied to chart design, will improve the usefulness of a chart in supporting clinical reasoning. Fourth, the electronic observation chart that emerges from this research is demonstrated to be significantly more useful than previously designed charts and represents a significant contribution to practice. Finally, the research presents a research design that employs a combination of inductive and deductive design activities to iterate on the design of situated artefacts.

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, novel analog-to-digital and digital-to-analog generalized time-interleaved variable bandpass sigma-delta modulators are designed, analysed, evaluated and implemented that are suitable for high-performance data conversion across a broad spectrum of applications. These modulators can perform noise-shaping for any centre frequency from DC to Nyquist. The proposed topologies are well suited to Butterworth, Chebyshev, inverse-Chebyshev and elliptical filters, where designers have the flexibility of specifying the centre frequency, bandwidth, and passband and stopband attenuation parameters. The application of the time-interleaving approach in combination with these bandpass loop-filters not only overcomes the limitations associated with conventional and mid-band resonator-based bandpass sigma-delta modulators, but also offers an elegant means of increasing the conversion bandwidth, thereby relaxing the need for faster or higher-order sigma-delta modulators. A step-by-step design technique has been developed for time-interleaved variable bandpass sigma-delta modulators. Using this technique, an assortment of lower- and higher-order single- and multi-path generalized A/D variable bandpass sigma-delta modulators were designed, evaluated and compared in terms of their signal-to-noise ratios, hardware complexity, stability, tonality and sensitivity for ideal and non-ideal topologies. Extensive behavioural-level simulations verified that one of the proposed topologies not only used fewer coefficients but also exhibited greater robustness to non-idealities. Furthermore, second-, fourth- and sixth-order single- and multi-path digital variable bandpass sigma-delta modulators were designed using this technique.

The mathematical modelling and evaluation of tones caused by the finite wordlengths of these digital multi-path sigma-delta modulators, when excited by sinusoidal input signals, are derived from first principles and verified using simulation and experimental results. The fourth-order digital variable bandpass sigma-delta modulator topologies were implemented in VHDL and synthesized on a Xilinx® Spartan™-3 development kit using fixed-point arithmetic. Circuit outputs were taken via the RS232 connection provided on the FPGA board and evaluated using MATLAB routines developed by the author, which also included the decimation process. The experiments undertaken further validated the design methodology presented in this work. In addition, a novel tunable and reconfigurable second-order variable bandpass sigma-delta modulator has been designed and evaluated at the behavioural level. This topology offers a flexible set of choices for designers and can operate in either single- or dual-mode, enabling multi-band implementations on a single digital variable bandpass sigma-delta modulator. The work is also supported by a novel user-friendly design and evaluation tool, developed in MATLAB/Simulink, that can speed up the design, evaluation and comparison of analog and digital single-stage and time-interleaved variable bandpass sigma-delta modulators. The tool enables the user to specify the conversion type, topology, loop-filter type, path number and oversampling ratio.
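
For readers unfamiliar with the underlying loop, a behavioural-level sketch of a first-order lowpass sigma-delta modulator shows the principle (a deliberate simplification: the thesis's topologies are higher-order, bandpass and time-interleaved): the average of the 1-bit output stream tracks the input.

```python
def sigma_delta_dc(x, n=10000):
    """Feed a constant input x in (-1, 1) through a first-order
    sigma-delta loop; return the mean of the 1-bit output stream."""
    integ, fb, out = 0.0, 0.0, []
    for _ in range(n):
        integ += x - fb                      # integrator accumulates error
        fb = 1.0 if integ >= 0 else -1.0     # 1-bit quantizer / DAC feedback
        out.append(fb)
    return sum(out) / n

avg = sigma_delta_dc(0.3)
print(avg)
```

Because the integrator stays bounded, the output mean differs from the input by at most a few parts in n, which is the noise-shaping property the bandpass variants relocate from DC to an arbitrary centre frequency.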

Relevance:

30.00%

Publisher:

Abstract:

The paper reports on a study of design studio culture from a student perspective. Learning in design studio culture has been theorised variously as a signature pedagogy emulating professional practice models, as a community of practice, and as a form of problem-based learning, all largely based on the study of teaching events in studio. This research extends its focus beyond formally recognised activities to encompass students' experience of their social and community networks, working places and study set-ups, examining how these have contributed to studio culture and how they have been supported by studio teaching. Semi-structured interviews with final-year undergraduate students of architecture formed the basis of the study, using an interpretivist approach informed by Actor-network theory, with studio culture featured as the focal actor, enrolling students and engaging with other actors, together constituting an actor-network of studio culture. The other actors included social community patterns and activities; the numerous working spaces (including but not limited to the studio space itself); the equipment, tools of trade and material prerequisites for working; the portfolio, enrolling the other actors to produce work for it; and the various formal and informal events associated with the course itself. Studio culture is a highly charged social arena; the question is how, and in particular which aspects of it, support learning. Theoretical models of situated learning and communities of practice informed the analysis, with Bourdieu's theory of practice and his interrelated concepts of habitus, field and capital providing a means of relating individually acquired habits and modes of working to social contexts. Bourdieu's model of habitus involves the externalisation, through the social realm, of habits and knowledge previously internalised.

It is therefore a useful model for considering whole individual learning activities and the shared repertoires and practices located in the social realm. The social milieu of the studio provides a scene for the exercise and display of 'practicing' and the accumulation of a form of 'practicing-capital'. This capital is a property of the social milieu rather than of the space, so working or practicing in the company of others (in space and through social media) becomes a more valued aspect of studio than space or facilities alone. Practicing-capital involves the acquisition of a habitus of studio culture, with the transformation of physical practices or habits into social dispositions, acquiring social capital (driving the social milieu) and cultural capital (practicing-knowledge) in the process. The research drew on students' experiences, and their practicing 'getting a feel for the game' by exploring the limits or boundaries of the field of studio culture. It demonstrated that a notional studio community was in effect a social context for supporting learning: a range of settings in which to explore and test out newly internalised knowledge, and to demonstrate or display ideas, modes of thinking and practicing. The study presents a nuanced interpretation of how students relate to a studio culture that involves a notional community and a developing habitus within a field of practicing that extends beyond teaching scenarios.