905 results for "Development of large software systems"
Abstract:
This thesis presents an experimental investigation into two novel techniques that can be incorporated into current optical systems. These techniques can improve transmission performance and the recovery of the transmitted signal at the receiver. The experimental objectives are described and the results for each technique are presented in two sections. The first experimental section covers work on ultra-long Raman fibre lasers (ULRFLs). These fibre lasers have become an important research topic in recent years because of the significant improvement they offer over lumped Raman amplification and their potential use in the development of systems with large bandwidths and very low losses. The experiments used ASK and DPSK modulation over a distance of 240 km, and DPSK over a distance of 320 km. These results are compared against the current state of the art and against other ultra-long transmission amplification techniques. The second technique investigated is asymmetrical, or offset, filtering. This technique is important because it addresses the strong filtering regimes that are part of optical systems and networks in modern high-speed communications. It improves the received signal by offsetting the central frequency of a filter placed after the output of a Delay Line Interferometer (DLI), which yields a significant improvement in BER and/or Q-values at the receiver and therefore an increase in signal quality. Finally, the experimental results are assessed against the objectives of the work, and potential future work is discussed.
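The BER and Q-value figures of merit mentioned above are linked, under the standard Gaussian-noise approximation (a textbook relation, not something stated in this abstract), by BER = ½ erfc(Q/√2). A minimal sketch:

```python
import math

def ber_from_q(q_linear: float) -> float:
    """Gaussian-noise approximation relating linear Q-factor to BER."""
    return 0.5 * math.erfc(q_linear / math.sqrt(2))

def q_db(q_linear: float) -> float:
    """Q-factor in dB (20*log10, since Q is an amplitude ratio)."""
    return 20 * math.log10(q_linear)

# Q = 6 is a common reference point, corresponding to BER ~ 1e-9.
print(ber_from_q(6.0))   # ~1e-9
print(q_db(6.0))         # ~15.56 dB
```

This is why receiver improvements are often quoted interchangeably in BER or Q: a small gain in Q translates into orders of magnitude in BER.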
Abstract:
This article describes an approach that allows information systems to be developed without regard to the details of the physical storage of the relational model or the type of database management system. Described in terms of a graph model, the approach allows several algorithms to be constructed, for example for verification of the application domain. The approach was put into trial operation as part of the CASE system METAS.
Abstract:
This paper considers the problem of concept generalization in decision-making systems, taking into account such features of real-world databases as large size and the incompleteness and inconsistency of the stored information. Methods from rough set theory (lower and upper approximations, positive regions and reducts) are used to solve this problem. A new discretization algorithm for continuous attributes is proposed. It substantially increases the overall performance of the generalization algorithms and can be applied to the processing of real-valued attributes in large data tables. A search algorithm for significant attributes, combined with the discretization stage, is also developed; it avoids splitting the continuous domains of insignificant attributes into intervals.
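The lower and upper approximations the abstract relies on can be sketched in a few lines. This is the generic textbook construction, not the paper's own algorithm; the toy table and attribute names are invented for illustration:

```python
from collections import defaultdict

def approximations(rows, condition_attrs, target_rows):
    """Rough-set lower/upper approximations of a target concept.

    rows: list of dicts (objects); condition_attrs: attributes defining
    the indiscernibility relation; target_rows: set of row indices.
    """
    # Group objects into indiscernibility classes by condition values.
    classes = defaultdict(set)
    for i, row in enumerate(rows):
        classes[tuple(row[a] for a in condition_attrs)].add(i)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target_rows:   # class lies entirely inside the concept
            lower |= cls
        if cls & target_rows:    # class overlaps the concept
            upper |= cls
    return lower, upper

# Toy decision table: rows 0 and 1 are indiscernible but disagree on
# the decision, so they fall only in the upper approximation.
rows = [
    {"temp": "high", "humid": "low", "play": "yes"},
    {"temp": "high", "humid": "low", "play": "no"},
    {"temp": "low",  "humid": "low", "play": "yes"},
]
target = {i for i, r in enumerate(rows) if r["play"] == "yes"}
lo, up = approximations(rows, ["temp", "humid"], target)
print(sorted(lo), sorted(up))  # [2] [0, 1, 2]
```

The gap between the two approximations (the boundary region) is exactly the inconsistency in stored data that the paper's discretization and attribute-selection steps aim to keep small.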
Abstract:
The article presents a new type of log-merging tool for multi-blade telecommunication systems, based on the development of a new approach. The new log-merging tool (the Log Merger) helps engineers build a process-behavior timeline with a flexible system of information structuring used to assess changes in the analyzed system. Based on experts' experience and analytical skills, the tool generates a knowledge base that could be advantageous in the further development of a decision-making expert system. The paper proposes and discusses the design and implementation of the Log Merger, its architecture, its multi-board analysis capability and its application areas. It also presents possible ways to further improve the tool, e.g. extending its functionality to cover additional system platforms. The possibility of adding an analysis module for further expert-system development is also considered.
Abstract:
As machine tools continue to become increasingly repeatable and accurate, high-precision manufacturers may be tempted to consider how they might utilise machine tools as measurement systems. In this paper, we have explored this paradigm by attempting to repurpose state-of-the-art coordinate measuring machine Uncertainty Evaluating Software (UES) for a machine tool application. We performed live measurements on all the systems in question. Our findings have highlighted some gaps with UES when applied to machine tools, and we have attempted to identify the sources of variation which have led to discrepancies. Implications of this research include requirements to evolve the algorithms within the UES if it is to be adapted for on-machine measurement, improve the robustness of the input parameters, and most importantly, clarify expectations.
Abstract:
The emerging theme of servitization, or in other words, integrated product-service systems providing complex solutions to customer demand, is the focus of this study. We review the factors behind the emergence of servitization, dating back to the nineteenth century, and highlight the improvement opportunities open to today's companies in this field. The capabilities required and the development steps of successful product-service systems are also addressed. The objective of this short literature review is to provide ideas for business-development experts and top managers on how to develop their business successfully and how to avoid the risks involved.
Abstract:
The ultimate intent of this dissertation was to broaden and strengthen our understanding of IT implementation by focusing research efforts on the dynamic nature of the implementation process. More specifically, efforts were directed toward opening the "black box" and providing the story that explains how and why contextual conditions and implementation tactics interact to produce project outcomes. In pursuit of this objective, the dissertation was aimed at theory building and adopted a case study methodology combining qualitative and quantitative evidence. Specifically, it examined the implementation process, use and consequences of three clinical information systems at Jackson Memorial Hospital, a large tertiary care teaching hospital. As a preliminary step toward the development of a more realistic model of system implementation, the study proposes a new set of research propositions reflecting the dynamic nature of the implementation process. Findings clearly reveal that successful implementation projects are likely to be those where key actors envision end goals, anticipate challenges ahead, and recognize and seize opportunities. It was also found that IT implementation is characterized by the systems-theory notion of equifinality; that is, there are likely several equally effective ways to achieve a given end goal. The selection of a particular implementation strategy appears to be a rational process in which actions and decisions are largely influenced by the degree to which key actors recognize the mediating role of each tactic and are motivated to act. The nature of the implementation process is also characterized by the concept of "duality of structure": context and actions mutually influence each other. Another key finding suggests that there is no underlying program that regulates the process of change and moves it from one given point toward a subsequent and already prefigured end.
For this reason, the implementation process cannot be thought of as a series of activities performed in a sequential manner, as conceived in stage models. Finally, it was found that IT implementation is punctuated by a certain indeterminacy. Results suggest that unfavorable and undesirable consequences become less likely only when substantial efforts are focused on what to look for and think about.
Abstract:
A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and the methodology is helpful for studying other distributed and concurrent systems as well. Providing such a methodology is a challenge, however, because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach that verifies models at different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate synchronous communication between nets, and they naturally capture the dynamic architectural configuration and agent mobility of mobile agent systems. Component properties are verified on the transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the nets involved. Based on the formalism and the analysis method, the author formally modeled and analyzed a software architecture for mobile agent systems, and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency and safety of the medical information processing system. From the successful modeling and analysis of the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and that the hierarchical analysis method provides a rigorous foundation for the modeling tool.
The hierarchical analysis method not only reduces the complexity of the analysis but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, the system demonstrates the high flexibility, efficiency and low cost of mobile agent technologies.
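At its core, the reachability checking that SPIN performs on such models is an explicit-state search of a finite transition system. The sketch below shows only that underlying idea, not PrT nets or SPIN itself; the three-host agent model is invented for illustration:

```python
from collections import deque

def reachable(initial, successors, is_target):
    """Breadth-first explicit-state search: can any state satisfying
    is_target be reached from initial via the successors relation?"""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_target(state):
            return True
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Toy model: a mobile agent hopping between hosts. Can it reach "C"?
transitions = {"A": ["B"], "B": ["A", "C"], "C": []}
print(reachable("A", lambda s: transitions[s], lambda s: s == "C"))  # True
```

Safety properties are checked the same way: search for a reachable "bad" state, and the property holds exactly when none is found. The hierarchical method described above matters because it shrinks the state space this search must cover.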
Abstract:
The nation's freeway systems are becoming increasingly congested, and traffic incidents are a major contributor to that congestion. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary reduction in roadway capacity; they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic away from incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. In addition, the duration of an incident is affected by many contributing factors; determining and understanding these factors can help identify and develop better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, with limited success. This dissertation research attempts to improve on these previous efforts by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid decisions on traffic diversion during an ongoing incident. Multiple data mining techniques were applied and evaluated. Multiple linear regression and a decision-tree-based method were used to develop the offline models, and a rule-based method and a tree algorithm called M5P were used to develop the online models.
The results show that the models can in general achieve high prediction accuracy, within acceptable time intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of District 4 FDOT for actual applications.
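The offline models above are built on multiple linear regression over many incident attributes. As a minimal, single-predictor sketch of that idea (the predictor, data values and coefficients here are entirely hypothetical, not taken from the FDOT database):

```python
def fit_simple_regression(xs, ys):
    """Ordinary least squares for one predictor: duration = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical data: lanes blocked vs. incident duration in minutes.
lanes_blocked = [1, 1, 2, 2, 3, 3]
duration_min  = [20, 25, 38, 42, 61, 58]
a, b = fit_simple_regression(lanes_blocked, duration_min)
print(round(a, 1), round(b, 1))  # → 3.7 18.5
```

The real models extend this to many predictors at once, and the tree-based methods (M5P) instead partition the data and fit such regressions within each leaf.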
Abstract:
Existing instrumental techniques must be adaptable to the analysis of novel explosives if science is to keep up with the practices of terrorists and criminals. The focus of this work has been the development of analytical techniques for the analysis of two types of novel explosives: ascorbic acid-based propellants, and improvised mixtures of concentrated hydrogen peroxide and fuel. In recent years, the use of these explosives in improvised explosive devices (IEDs) has increased. It is therefore important to develop methods which permit the identification of the nature of the original explosive from post-blast residues. Ascorbic acid-based propellants are low explosives which employ an ascorbic acid fuel source with a nitrate/perchlorate oxidizer. A method utilizing ion chromatography with indirect photometric detection was optimized for the analysis of intact propellants. Post-burn and post-blast residues of these propellants were analyzed. It was determined that the ascorbic acid fuel and nitrate oxidizer could be detected in intact propellants, as well as in the post-burn and post-blast residues. Degradation products of the nitrate and perchlorate oxidizers were also detected. With a quadrupole time-of-flight mass spectrometer (QToFMS), exact mass measurements are possible. When an HPLC instrument is coupled to a QToFMS, the combination of retention time with accurate mass measurements, mass spectral fragmentation information, and isotopic abundance patterns allows the unequivocal identification of a target analyte. An optimized HPLC-ESI-QToFMS method was applied to the analysis of ascorbic acid-based propellants. Exact mass measurements were collected for the fuel and oxidizer anions and their degradation products. Ascorbic acid was detected in the intact samples and in half of the propellants subjected to open burning; the intact fuel molecule was not detected in any of the post-blast residues.
Two methods were optimized for the analysis of trace levels of hydrogen peroxide: HPLC with fluorescence detection (HPLC-FD) and HPLC with electrochemical detection (HPLC-ED). Both techniques were extremely selective for hydrogen peroxide. Both methods were applied to the analysis of post-blast debris from improvised mixtures of concentrated hydrogen peroxide and fuel; hydrogen peroxide was detected on a variety of substrates. Hydrogen peroxide was also detected in the post-blast residues of the improvised explosives TATP and HMTD.
Abstract:
There are situations in which it is very important to quickly and positively identify an individual. Examples include suspects detained in the neighborhood of a bombing or terrorist incident, individuals detained attempting to enter or leave the country, and victims of mass disasters. Systems utilized for these purposes must be fast, portable, and easy to maintain. The goal of this project was to develop an ultra-fast, direct PCR method for forensic genotyping of oral swabs. The procedure developed eliminates the need for cellular digestion and extraction of the sample by performing those steps in the PCR tube itself. Special high-speed polymerases are then added which are capable of amplifying a newly developed 7-locus multiplex in under 16 minutes. Following the amplification, a postage-stamp-sized microfluidic device equipped with a specially designed entangled-polymer separation matrix yields a complete genotype in 80 seconds. The entire process is rapid and reliable, reducing the time from sample to genotype from 1-2 days to under 20 minutes. Operation requires minimal equipment and can be easily performed with a small high-speed thermal cycler, reagents, and a microfluidic device with a laptop. The system was optimized and validated using a number of test parameters and a small test population. The overall precision was better than 0.17 bp, and the system provided a power of discrimination greater than 1 in 10^6. The small footprint and ease of use will permit this system to be an effective tool to quickly screen and identify individuals detained at ports of entry, police stations and remote locations. The system is robust and portable, and demonstrates to the forensic community a simple solution to the problem of rapid determination of genetic identity.
Abstract:
The main objective of physics-based modeling of power converter components is to design the whole converter with respect to physical and operational constraints. Therefore, all the elements and components of the energy conversion system are modeled numerically and combined to obtain a behavioral model of the whole system. Previously proposed high-frequency (HF) models of power converters are based on circuit models that relate only to the parasitic inner parameters of the power devices and the connections between components. This dissertation aims to obtain appropriate physics-based models for power conversion systems which not only represent the steady-state behavior of the components but also predict their high-frequency characteristics. The developed physics-based model represents the physical device with a high level of accuracy in predicting its operating condition. The proposed physics-based model enables the accurate development of components such as effective EMI filters, switching algorithms and circuit topologies [7]. One application of the developed modeling technique is the design of new topologies for high-frequency, high-efficiency converters for variable speed drives. The main advantage of the modeling method presented in this dissertation is the practical design of an inverter for high-power applications with the ability to overcome the blocking-voltage limitations of available power semiconductor devices. Another advantage is the selection of the best-matching topology, with an inherent reduction of switching losses that can be exploited to improve overall efficiency. The physics-based modeling approach in this dissertation makes it possible to design any power electronic conversion system to meet electromagnetic standards and design constraints.
This includes physical characteristics such as decreasing the size and weight of the package, optimizing interactions with neighboring components, and achieving higher power density. In addition, the electromagnetic behaviors and signatures can be evaluated, including the study of conducted and radiated EMI interactions as well as the design of attenuation measures and enclosures.
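To illustrate why the parasitic parameters mentioned above dominate HF behavior, consider the standard series ESR-ESL-C equivalent circuit of a filter capacitor (a generic textbook model, not the dissertation's own; the component values are invented):

```python
import math

def capacitor_impedance(f, c, esr, esl):
    """|Z| of a capacitor's HF equivalent circuit: series ESR, ESL and C."""
    w = 2 * math.pi * f
    return math.hypot(esr, w * esl - 1 / (w * c))

def self_resonant_freq(c, esl):
    """Above this frequency the parasitic inductance dominates and the
    'capacitor' behaves inductively, degrading EMI filter performance."""
    return 1 / (2 * math.pi * math.sqrt(esl * c))

# Hypothetical 100 nF capacitor with 20 mOhm ESR and 1 nH ESL.
f_res = self_resonant_freq(100e-9, 1e-9)
print(f"{f_res / 1e6:.1f} MHz")  # → 15.9 MHz; |Z| bottoms out at the ESR
```

A physics-based device model, rather than a fixed RLC fit, is what lets these parasitics be predicted from geometry and materials before the converter is built.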
Abstract:
Bonded repair of concrete structures with fiber reinforced polymer (FRP) systems is increasingly being accepted as a cost-efficient and structurally viable method of rapid rehabilitation of concrete structures. However, the relationships between long-term performance attributes, service life, and details of the installation process are not easy to quantify. Accordingly, there is currently a lack of generally accepted construction specifications, making it difficult for the field engineer to certify the adequacy of the construction process. The objective of the present study, conducted as part of National Cooperative Highway Research Program (NCHRP) Project 10-59B, was to investigate the effect of surface preparation on the behavior of wet lay-up FRP repair systems and consequently to develop rational thresholds that provide sufficient performance. The research program comprised both experimental and analytical work on wet lay-up FRP applications. The experimental work included flexure testing of sixty-seven (67) reinforced concrete beams and bond testing of ten (10) reinforced concrete blocks. Four parameters were studied: surface roughness, surface flatness, surface voids and bug holes, and surface cracks/cuts. The findings were analyzed from various aspects and compared with the data available in the literature. As part of the analytical work, finite element models of the flexural specimens with surface flaws were developed using ANSYS. The purpose of this part was to extend the parametric study of the effects of concrete surface flaws and to verify the experimental results through nonlinear finite element analysis. Test results showed that surface roughness does not appear to have a significant influence on the overall performance of wet lay-up FRP systems, with or without adequate anchorage, and whether failure was by debonding or rupture of the FRP.
Both experimental and analytical results for surface flatness showed that peaks on the concrete surface, in the range studied, do not have a significant effect on the performance of wet lay-up FRP systems. However, valleys of a particular size could reduce the strength of wet lay-up FRP systems. Test results regarding surface voids and surface cracks/cuts revealed that previously suggested thresholds for these flaws appear to be conservative, as also confirmed by the analytical study.
Abstract:
Low-rise buildings are often subjected to high wind loads during hurricanes that lead to severe damage and cause water intrusion. It is therefore important to estimate accurate wind pressures for design purposes to reduce losses. Wind loads on low-rise buildings can differ significantly depending on the laboratory in which they were measured. The differences are due in large part to inadequate simulation of the low-frequency content of atmospheric velocity fluctuations in the laboratory and to the small scale of the models used for the measurements. A new partial turbulence simulation methodology was developed for simulating the effect of low-frequency flow fluctuations on low-rise buildings more effectively, from the point of view of testing accuracy and repeatability, than is currently the case. The methodology was validated by comparing aerodynamic pressure data for building models obtained in the open-jet 12-Fan Wall of Wind (WOW) facility against their counterparts in a boundary-layer wind tunnel. Field measurements of pressures on the Texas Tech University building and the Silsoe building were also used for validation purposes. Tests under partial simulation are freed of integral length scale constraints, meaning that model length scales in such testing are limited only by blockage considerations. Thus the partial simulation methodology can be used to produce aerodynamic data for low-rise buildings using large-scale models in wind tunnels and WOW-like facilities. This is a major advantage, because large-scale models allow for accurate modeling of architectural details, testing at higher Reynolds numbers, greater spatial resolution of the pressure taps in high-pressure zones, and assessment of the performance of aerodynamic devices to reduce wind effects.
The technique eliminates a major cause of discrepancies among measurements conducted in different laboratories and can help to standardize flow simulations for testing residential homes, as well as significantly improve testing accuracy and repeatability. Partial turbulence simulation was used in the WOW to determine the performance of discontinuous perforated parapets in mitigating roof pressures. Comparisons of pressures with and without parapets showed significant reductions in pressure coefficients in the zones with high suctions. This demonstrated the potential of such aerodynamic add-on devices to reduce uplift forces.
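The pressure coefficients compared above are the standard non-dimensional form Cp = (p - p_ref) / (½ ρ U²), which is what makes measurements from different facilities and scales comparable at all. A minimal sketch with invented tap values:

```python
def pressure_coefficient(p, p_ref, rho, u):
    """Non-dimensional pressure coefficient Cp = (p - p_ref) / (0.5*rho*U^2)."""
    return (p - p_ref) / (0.5 * rho * u ** 2)

# Hypothetical roof-corner tap reading (Pa) in a 30 m/s mean wind,
# with sea-level air density 1.225 kg/m^3.
cp = pressure_coefficient(p=-900.0, p_ref=0.0, rho=1.225, u=30.0)
print(round(cp, 2))  # → -1.63; a strongly negative Cp indicates high suction
```

Reductions in such negative Cp values in the high-suction zones are what the parapet comparisons quantify.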