Abstract:
Biochemical studies with model DNA heteroduplexes have implicated RecJ exonuclease, exonuclease VII, exonuclease I, and exonuclease X in Escherichia coli methyl-directed mismatch correction. However, strains deficient in the four exonucleases display only a modest increase in mutation rate, raising questions concerning the involvement of these activities in mismatch repair in vivo. Both the quadruple mutant deficient in the four exonucleases and the triple mutant deficient in RecJ exonuclease, exonuclease VII, and exonuclease I grow poorly in the presence of the base analogue 2-aminopurine, and exposure to the base analogue results in filament formation, indicative of induction of the SOS DNA damage response. The growth defect and filamentation phenotypes associated with 2-aminopurine exposure are effectively suppressed by null mutations in mutH, mutL, mutS, or uvrD/mutU, which encode activities that act upstream of the four exonucleases in the mechanism proposed for the methyl-directed reaction on the basis of in vitro studies. The quadruple exonuclease mutant is also cold-sensitive, with a severe growth defect at 30°C. This phenotype is suppressed by a uvrD/mutU defect and partially suppressed by mutH, mutL, or mutS mutations. These observations confirm the involvement of the four exonucleases in methyl-directed mismatch repair in vivo and suggest that the low mutability of exonuclease-deficient strains is a consequence of under-recovery of mutants due to a reduction in viability and/or chromosome loss associated with activation of the mismatch repair system in the absence of RecJ exonuclease, exonuclease VII, exonuclease I, and exonuclease X.
Abstract:
The human general transcription factor TFIIA is one of several factors involved in specific transcription by RNA polymerase II, possibly by regulating the activity of the TATA-binding subunit (TBP) of TFIID. TFIIA purified from HeLa extracts consists of 35-, 19-, and 12-kDa subunits. Here we describe the isolation of a cDNA clone (hTFIIA gamma) encoding the 12-kDa subunit. Using expression constructs derived from hTFIIA gamma and TFIIA alpha/beta (which encodes a 55-kDa precursor to the alpha and beta subunits of natural TFIIA), we have constructed a synthetic TFIIA with a polypeptide composition similar to that of natural TFIIA. The recombinant complex supports the formation of a DNA-TBP-TFIIA complex and mediates both basal and Gal4-VP16-activated transcription by RNA polymerase II in TFIIA-depleted nuclear extracts. In contrast, TFIIA has no effect on tRNA and 5S RNA transcription by RNA polymerase III in this system. We also present evidence that both the p55 and p12 recombinant subunits interact with TBP and that the basic region of TBP is critical for the TFIIA-dependent function of TBP in nuclear extracts.
Abstract:
The non-use provisions of the Trade Marks Act 1995 (Cth) have attracted some attention in recent reviews of the trade marks system and some reform of these provisions now seems likely. Unfortunately, however, there has been a failure to confront the full range of problems that hamper the effectiveness of the current non-use provisions. Once these problems are properly understood, it can be seen that more wide-reaching reforms than those being canvassed at present merit serious consideration.
Abstract:
User requirements for multimedia authentication vary widely. In some cases, the user requires an authentication system to monitor a set of specific areas, each with its own sensitivity, while ignoring other modifications. Most existing fragile watermarking schemes are mixed systems, which cannot satisfy such precise user requirements. In this paper we therefore design a sensor-based multimedia authentication architecture. The system consists of sensor combinations and a fuzzy response logic system. Each sensor is designed to respond strictly to tampering of a given type within a given area. With this scheme, any complicated authentication requirement can be satisfied, and problems such as error-tolerant detection of the tampering method are easily resolved. We also provide experiments demonstrating an implementation of the sensor-based system.
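The sensor idea described above can be sketched in a few lines: each sensor guards one region with its own sensitivity, and a simple response-combining step stands in for the fuzzy response logic. This is an illustrative toy only; the class names, the per-row hashing, and the weighting rule are our assumptions, not the authors' design.

```python
import hashlib

class RegionSensor:
    """Monitors one rectangular image region and responds only to tampering there."""
    def __init__(self, name, region, sensitivity):
        self.name = name
        self.region = region            # (row0, row1, col0, col1), half-open
        self.sensitivity = sensitivity  # weight applied to this sensor's response
        self.reference = None

    def _digest_rows(self, image):
        r0, r1, c0, c1 = self.region
        return [hashlib.sha256(bytes(row[c0:c1])).hexdigest() for row in image[r0:r1]]

    def enroll(self, image):
        self.reference = self._digest_rows(image)

    def respond(self, image):
        """Return a tamper score in [0, 1]; 0 means the region is untouched."""
        current = self._digest_rows(image)
        changed = sum(a != b for a, b in zip(self.reference, current))
        return changed / max(len(current), 1)

def fuzzy_verdict(sensors, image):
    """Toy stand-in for the fuzzy response logic: weight each sensor's score
    by its sensitivity and report the worst offender."""
    scores = {s.name: s.respond(image) * s.sensitivity for s in sensors}
    worst = max(scores, key=scores.get)
    return worst, scores[worst]

image = [bytearray(range(16)) for _ in range(8)]
face = RegionSensor("face", (0, 4, 0, 8), sensitivity=1.0)
background = RegionSensor("background", (4, 8, 0, 16), sensitivity=0.2)
for s in (face, background):
    s.enroll(image)

image[1][3] = 255   # tamper inside the "face" region only
name, score = fuzzy_verdict([face, background], image)
print(name, score)  # the face sensor responds; the background sensor stays silent
```

Because each sensor hashes only its own region, a modification elsewhere leaves its score at zero, which is the "strictly respond to given area tampering" property the abstract describes.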
Abstract:
The mechanism of muscle protein catabolism induced by proteolysis-inducing factor, produced by cachexia-inducing murine and human tumours, has been studied in vitro using C2C12 myoblasts and myotubes. In both myoblasts and myotubes, protein degradation was enhanced by proteolysis-inducing factor after 24 h of incubation. In myoblasts this followed a bell-shaped dose-response curve with maximal effects at a proteolysis-inducing factor concentration between 2 and 4 nM, while in myotubes increased protein degradation was seen at all concentrations of proteolysis-inducing factor up to 10 nM, again with a maximum at 4 nM. Protein degradation induced by proteolysis-inducing factor was completely attenuated in the presence of cycloheximide (1 μM), suggesting a requirement for new protein synthesis. In both myoblasts and myotubes, protein degradation was accompanied by increased expression of the α-type subunits of the 20S proteasome as well as increased functional activity of the proteasome, as determined by the 'chymotrypsin-like' enzyme activity. There was also increased expression of the 19S regulatory complex and of the ubiquitin-conjugating enzyme (E214k), and in myotubes a decrease in myosin expression was seen with increasing concentrations of proteolysis-inducing factor. These results show that proteolysis-inducing factor co-ordinately upregulates both ubiquitin conjugation and proteasome activity in myoblasts and myotubes and may play an important role in the muscle wasting seen in cancer cachexia. © 2002 Cancer Research UK.
Abstract:
The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation: the technique provides the only method of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and of both speed and accuracy of evaluation. To provide an effective control mechanism, a hierarchical or multi-level modelling procedure has therefore been developed, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick and easy and allows models to expand in line with design developments. However, current approaches to computer simulation are wholly inappropriate for supporting such hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterized by a requirement for very specialist expertise, a lengthy model development phase, and a correspondingly high expenditure, resulting in very little, and rather inappropriate, use of the technique. Simulation, when used, is generally applied only to check or verify a final design proposal; rarely is its full potential utilized to aid, support or complement the manufacturing system design procedure.
To implement the proposed modelling procedure, therefore, the concept of a generic simulator was adopted, as such systems require no specialist expertise and instead facilitate quick and easy model creation, execution and modification through simple data inputs. Previous generic simulators have tended to be too restricted, lacking the flexibility needed to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
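The idea of a generic simulator driven by simple data inputs can be illustrated with a minimal sketch: the manufacturing model is just a table of stations and mean process times, so a design alternative is changed by editing data rather than code, and a coarse model can be refined by adding rows, as in multi-level modelling. The station names and flow-shop routing are hypothetical; this is not the ATOMS simulator.

```python
import random

def simulate(stations, n_parts, seed=0):
    """Minimal data-driven discrete-event sketch: parts flow through the
    stations in table order, each station processes one part at a time,
    and process times are exponentially distributed around the table's mean.
    Returns the makespan (time the last part finishes)."""
    rng = random.Random(seed)
    free_at = {s: 0.0 for s in stations}   # when each station next becomes idle
    finish = 0.0
    for _ in range(n_parts):
        t = 0.0
        for station, mean_time in stations.items():
            start = max(t, free_at[station])          # wait if the station is busy
            t = start + rng.expovariate(1.0 / mean_time)
            free_at[station] = t
        finish = max(finish, t)
    return finish

# Two design alternatives compared by editing data only.
coarse = {"machining": 10.0, "assembly": 6.0}
refined = {"machining": 10.0, "deburr": 2.0, "assembly": 6.0}
print(simulate(coarse, 50), simulate(refined, 50))
```

Evaluation here is "quick and easy" in exactly the sense the proposal assumes: the model expands in line with design development simply by growing the data table.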
Abstract:
A study of heat pump thermodynamic characteristics has been made in the laboratory on a specially designed and instrumented air-to-water heat pump system. The design, using refrigerant R12, was based on the requirement to produce domestic hot water at a temperature of about 50 °C, and the system was assembled in the laboratory. All the experimental data were fed to a microcomputer and stored on disk automatically from appropriate transducers via amplifiers and 16-channel analogue-to-digital converters. The measurements taken were R12 pressures and temperatures, water and R12 mass flow rates, air speed, fan and compressor input powers, water and air inlet and outlet temperatures, and wet and dry bulb temperatures. The time interval between observations could be varied. The results showed, as expected, that the COP was higher at higher air inlet temperatures and at lower hot water output temperatures. The optimum air speed was found to be that at which the fan input power was about 4% of the condenser heat output. It was also found that the hot water can be produced at a temperature higher than the R12 condensing temperature corresponding to the condensing pressure. This was achieved by designing the condenser to take advantage of discharge superheat and by further heating the water using heat recovery from the compressor. Of the input power to the compressor, typically about 85% was transferred to the refrigerant, 50% by the compression work and 35% through heating of the refrigerant by the cylinder wall, and the remaining 15% was rejected to the cooling medium. The evaporator effectiveness was found to be about 75% and sensitive to the air speed. Using the data collected, a steady-state computer model was developed.
For given input conditions (air inlet temperature, air speed, degree of suction superheat, and water inlet and outlet temperatures), the model is capable of predicting the refrigerant cycle, compressor efficiency, evaporator effectiveness, condenser water flow rate and system COP.
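The reported energy balance and COP figures can be checked with simple arithmetic. The sketch below hardwires the abstract's measured fractions (85% of compressor input to the refrigerant, split 50%/35%, 15% rejected); the example input powers and condenser output are made up for illustration.

```python
def heat_pump_balance(compressor_power_kw, condenser_heat_kw, fan_power_kw):
    """Energy split of the compressor input power using the fractions reported
    in the study, plus the system COP and the fan-power fraction of condenser
    heat output (optimum reported near 4%). Illustrative arithmetic only."""
    return {
        "to_refrigerant": 0.85 * compressor_power_kw,   # compression work + wall heating
        "compression_work": 0.50 * compressor_power_kw,
        "wall_heating": 0.35 * compressor_power_kw,     # refrigerant heated by cylinder wall
        "rejected": 0.15 * compressor_power_kw,         # lost to the cooling medium
        "COP": condenser_heat_kw / (compressor_power_kw + fan_power_kw),
        "fan_fraction": fan_power_kw / condenser_heat_kw,
    }

# Hypothetical operating point: 2 kW compressor, 6 kW heat output, 0.24 kW fan.
r = heat_pump_balance(compressor_power_kw=2.0, condenser_heat_kw=6.0, fan_power_kw=0.24)
print(r["COP"], r["fan_fraction"])
```

At this made-up operating point the fan draws exactly 4% of the condenser heat output, matching the optimum air-speed condition the study reports.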
Abstract:
OBJECTIVES: The objective of this research was to design a clinical decision support system (CDSS) that supports heterogeneous clinical decision problems and runs on multiple computing platforms. Meeting this objective required a novel design to create an extendable and easy-to-maintain clinical CDSS for point-of-care support. The proposed solution was evaluated in a proof-of-concept implementation. METHODS: Building on our earlier research on the design of a mobile CDSS for emergency triage, we used ontology-driven design to represent the essential components of a CDSS. Models of clinical decision problems were derived from the ontology and processed into executable applications at runtime. This allowed an application's functionality to be scaled to the capabilities of the computing platform. A prototype of the system was implemented using an extended client-server architecture and Web services to distribute the functions of the system and to make it operational in limited-connectivity conditions. RESULTS: The proposed design provided a common framework that facilitated the development of diverse clinical applications running seamlessly on a variety of computing platforms. It was prototyped for two clinical decision problems and settings (triage of acute pain in the emergency department and postoperative management of radical prostatectomy on the hospital ward) and implemented on two computing platforms: desktop and handheld computers. CONCLUSIONS: The requirement of CDSS heterogeneity was satisfied by the ontology-driven design. Processing application models described with the help of ontological models allowed a complex system to run on multiple computing platforms with different capabilities. Finally, the separation of models and runtime components contributed to the improved extensibility and maintainability of the system.
Abstract:
Self-adaptive systems have the capability to autonomously modify their behavior at run-time in response to changes in their environment. Self-adaptation is particularly necessary for applications that must run continuously, even under adverse conditions and changing requirements; sample domains include automotive systems, telecommunications, and environmental monitoring systems. While a few techniques have been developed to support the monitoring and analysis of requirements for adaptive systems, limited attention has been paid to the actual creation and specification of requirements of self-adaptive systems. As a result, self-adaptivity is often constructed in an ad-hoc manner. In order to support the rigorous specification of adaptive systems requirements, this paper introduces RELAX, a new requirements language for self-adaptive systems that explicitly addresses uncertainty inherent in adaptive systems. We present the formal semantics for RELAX in terms of fuzzy logic, thus enabling a rigorous treatment of requirements that include uncertainty. RELAX enables developers to identify uncertainty in the requirements, thereby facilitating the design of systems that are, by definition, more flexible and amenable to adaptation in a systematic fashion. We illustrate the use of RELAX on smart home applications, including an adaptive assisted living system.
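The fuzzy-logic semantics can be made concrete with a small sketch: a RELAX-style operator maps a monitored value to a degree of satisfaction in [0, 1] rather than a hard true/false. The operator names below follow the paper's style, but the triangular and ramp membership functions chosen here are our assumptions.

```python
def as_close_as_possible_to(target, width):
    """Fuzzy semantics for a RELAX-style 'AS CLOSE AS POSSIBLE TO target'
    requirement: a triangular membership function, fully satisfied at the
    target and falling to zero at a distance of `width`."""
    def satisfaction(value):
        return max(0.0, 1.0 - abs(value - target) / width)
    return satisfaction

def as_many_as_possible(maximum):
    """'AS MANY AS POSSIBLE' relaxed requirement: linear ramp up to `maximum`."""
    def satisfaction(count):
        return min(1.0, count / maximum)
    return satisfaction

# Smart-home example in the spirit of the paper's assisted-living scenario:
# "the room temperature SHALL be AS CLOSE AS POSSIBLE TO 21 degrees".
temp_req = as_close_as_possible_to(target=21.0, width=5.0)
sensors_req = as_many_as_possible(maximum=4)

print(temp_req(21.0))   # fully satisfied
print(temp_req(23.5))   # partially satisfied; adaptation may tolerate this
print(sensors_req(3))   # three of four sensors available
```

A graded satisfaction value like this is what lets an adaptive system trade off requirements under uncertainty instead of treating every deviation as a violation.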
Abstract:
One of the current research trends in Enterprise Resource Planning (ERP) involves examining the critical factors for its successful implementation. However, such research is limited to system implementation and does not focus on the flexibility of ERP to respond to changes in business. This study therefore explores a combination system, made up of an ERP and informality, intended to provide organisations with efficient and flexible performance simultaneously. In addition, this research analyses the benefits and challenges of using such a system. The research was based on socio-technical systems (STS) theory, which contains two dimensions: 1) a technical dimension, which evaluates the performance of the system; and 2) a social dimension, which examines the impact of the system on an organisation. A mixed-methods approach was followed in this research. The qualitative part aims to understand the constraints of using a single ERP system and to define a new system that addresses these problems. To achieve this goal, four Chinese companies operating in different industries were studied, all of which faced challenges in using an ERP system due to complexity and uncertainty in their business environments. The quantitative part contains a discrete-event simulation study intended to examine the impact on operational performance when a company implements the hybrid system in a real-life situation. Moreover, this research conducts a further qualitative case study to better understand the influence of the system in an organisation. The empirical aspect of the study reveals that an ERP with pre-determined business activities cannot react promptly to unanticipated changes in a business.
Incorporating informality into an ERP allows it to react to different situations by using different procedures based on the practical knowledge of frontline employees. Furthermore, the simulation study shows that the combination system can achieve a balance between efficiency and flexibility. Unlike existing research, which emphasises continuous improvement in the IT functions of an enterprise system, this research contributes a theoretical definition of a new system with mixed performance, containing both the formal practices embedded in an ERP and informal activities based on human knowledge. It supports both cost-efficiency in executing business transactions and flexibility in coping with business uncertainty. This research also indicates the risks of using the system, such as using an ERP with limited functions; a high cost of performing informally; and low system acceptance, owing to a shift in organisational culture. With respect to practical contribution, this research suggests that companies can choose the most suitable enterprise system approach in accordance with their operational strategies. The combination system can be implemented in a company that needs to operate with a medium level of volume and variety. By contrast, the traditional ERP system is better suited to a company that operates in a high-volume market, while an informal system is more suitable for a firm requiring a high level of variety.
Abstract:
In this paper we experimentally demonstrate, for the first time, a 10 Mb/s error-free visible light communications (VLC) system using polymer light-emitting diodes (PLEDs). The PLED under test is a blue emitter with ∼600 kHz bandwidth. Such a low bandwidth introduces an intersymbol interference (ISI) penalty at higher transmission speeds and hence the requirement for an equalizer. In this work we improve on the previous literature by implementing a decision feedback equalizer rather than a linear equalizer. Considering 7% and 20% forward error correction codes, transmission speeds up to ∼12 Mb/s can be supported.
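The advantage of a decision feedback equalizer over a linear one is that past hard decisions are fed back to cancel trailing ISI exactly, without amplifying noise. A minimal sketch follows, assuming a simple two-tap ISI channel as a stand-in for the bandlimited PLED; the tap values are illustrative, not the paper's trained coefficients.

```python
import numpy as np

def dfe(received, ff_taps, fb_taps):
    """Minimal decision feedback equalizer for a binary OOK/ISI channel.
    Feedforward taps filter the received samples; feedback taps subtract
    ISI reconstructed from past hard decisions."""
    decisions = []
    past = [0.0] * len(fb_taps)                 # most recent decision first
    n_ff = len(ff_taps)
    padded = np.concatenate([np.zeros(n_ff - 1), received])
    for i in range(len(received)):
        window = padded[i:i + n_ff][::-1]       # newest sample first
        y = np.dot(ff_taps, window) - np.dot(fb_taps, past)
        d = 1.0 if y > 0.5 else 0.0             # hard decision, threshold at 0.5
        past = [d] + past[:-1]
        decisions.append(d)
    return np.array(decisions)

# Bandlimited emitter modelled as a two-tap channel: each bit leaks 40% into the next.
bits = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1], dtype=float)
channel = np.array([1.0, 0.4])
received = np.convolve(bits, channel)[:len(bits)]

equalized = dfe(received, ff_taps=np.array([1.0]), fb_taps=[0.4])
print(np.array_equal(equalized, bits))   # True: feedback cancels the trailing ISI
```

With a feedback tap matching the channel's trailing tap, each symbol's ISI contribution is removed exactly as long as the previous decision was correct, which is why a DFE outperforms a linear equalizer on a severely bandlimited link.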
Abstract:
This working paper presents the first results of our research project, which aims to analyse the success factors of Hungarian companies that became successful internationally after the change of economic system in Hungary. We believe this is a significant topic, as international expansion is an unavoidable strategic requirement for successful Hungarian companies due to the limited size of the domestic market. Our research is methodologically based on in-depth interviews, backed by a literature review. We have developed research propositions and an outline for semi-structured interviews. This working paper covers the material of the first two in-depth interviews, with executives from two different companies, while additional company case studies are being prepared.
Abstract:
Mediation techniques provide interoperability and support integrated query processing among heterogeneous databases. While such techniques help data sharing among different sources, they increase the risk to data security, for example by violating access control rules. Successful protection of information by an effective access control mechanism is a basic requirement for interoperation among heterogeneous data sources. This dissertation first identified the challenges that a mediation system must meet in order to achieve both interoperability and security in an interconnected and collaborative computing environment: (1) context-awareness, (2) semantic heterogeneity, and (3) multiple security policy specification. Few existing approaches address all three security challenges in mediation systems. This dissertation provides a modeling and architectural solution to the problem of mediation security that addresses the aforementioned challenges. A context-aware flexible authorization framework was developed to deal with the security challenges faced by mediation systems. The authorization framework consists of two major tasks: specifying security policies and enforcing them. First, the security policy specification provides a generic and extensible method for modeling security policies with respect to the challenges posed by the mediation system. The security policies in this study are specified as 5-tuples followed by a series of authorization constraints, which are identified based on the relationships among the different security components in the mediation system. Two essential features of mediation systems, i.e., the relationships among authorization components and interoperability among heterogeneous data sources, are the focus of this investigation. Second, this dissertation supports effective access control on mediation systems while providing uniform access to heterogeneous data sources.
The dynamic security constraints are handled in the authorization phase instead of the authentication phase, so the maintenance cost of the security specification can be reduced compared with related solutions.
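A 5-tuple policy specification with context-dependent constraints can be sketched as follows. The field names, the context check, and the deny-overrides rule are our assumptions about one plausible reading, not the dissertation's exact model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    """A 5-tuple authorization rule: who may do what to which mediated
    object, with what sign, under which contextual condition."""
    subject: str      # role or user
    obj: str          # mediated data source or view
    action: str       # e.g. "read", "write"
    sign: str         # "+" grant, "-" deny
    context: str      # contextual condition evaluated at authorization time

def authorize(policies, subject, obj, action, context_state):
    """Deny-overrides evaluation: any applicable '-' rule wins, then any '+'.
    Context is checked here, in the authorization phase, so changing a
    constraint never requires touching authentication."""
    applicable = [p for p in policies
                  if p.subject == subject and p.obj == obj and p.action == action
                  and context_state.get(p.context, False)]
    if any(p.sign == "-" for p in applicable):
        return False
    return any(p.sign == "+" for p in applicable)

policies = [
    Policy("clinician", "patient_view", "read", "+", "on_duty"),
    Policy("clinician", "patient_view", "read", "-", "off_site"),
]
print(authorize(policies, "clinician", "patient_view", "read",
                {"on_duty": True, "off_site": False}))   # granted
print(authorize(policies, "clinician", "patient_view", "read",
                {"on_duty": True, "off_site": True}))    # deny overrides
```

Because the context dictionary is consulted only at decision time, a dynamic constraint such as `off_site` can change between requests without any re-authentication, which is the maintenance advantage the abstract claims.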
Abstract:
Orthogonal Frequency-Division Multiplexing (OFDM) has proved to be a promising technology enabling transmission at higher data rates. Multicarrier Code-Division Multiple Access (MC-CDMA) is a transmission technique that combines the advantages of both OFDM and Code-Division Multiple Access (CDMA), allowing high transmission rates over severely time-dispersive multi-path channels without the need for a complex receiver implementation. MC-CDMA also exploits frequency diversity via the different subcarriers, and therefore allows high-code-rate systems to achieve good Bit Error Rate (BER) performance. Furthermore, spreading in the frequency domain makes the time synchronization requirement much less stringent than in traditional direct-sequence CDMA schemes. Some problems remain in the use of MC-CDMA. One is the high Peak-to-Average Power Ratio (PAPR) of the transmit signal: high PAPR leads to nonlinear distortion in the amplifier and results in inter-carrier self-interference plus out-of-band radiation. Suppressing Multiple Access Interference (MAI) is another crucial problem in MC-CDMA systems. Imperfect cross-correlation characteristics of the spreading codes and multipath fading destroy the orthogonality among users, causing MAI, which produces serious BER degradation in the system. Moreover, in the uplink the signals received at a base station are always asynchronous; this also destroys the orthogonality among users and hence generates MAI, degrading system performance. Beyond those two problems, external interference must always be considered seriously for any communication system. In this dissertation, we design a novel MC-CDMA system with low PAPR and mitigated MAI. New semi-blind channel estimation and multi-user data detection based on Parallel Interference Cancellation (PIC) are applied in the system.
Low-Density Parity-Check (LDPC) codes have also been introduced into the system to improve its performance. Different interference models are analyzed in multi-carrier communication systems, and effective interference suppression for MC-CDMA systems is then employed in this dissertation. The experimental results indicate that our system not only significantly reduces the PAPR and MAI but also effectively suppresses outside interference with low complexity. Finally, we present a practical cognitive application of the proposed system on a software-defined radio platform.
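The PAPR problem described above is easy to make concrete: a block of independent subcarrier symbols passed through an IFFT occasionally adds up coherently, producing peaks far above the average power. The sketch below only computes the metric; it does not implement the dissertation's reduction scheme, and the QPSK constellation and block size are illustrative choices.

```python
import numpy as np

def papr_db(time_signal):
    """Peak-to-average power ratio of a complex baseband block, in dB."""
    power = np.abs(time_signal) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(1)
n_subcarriers = 64
# Random QPSK symbols on the subcarriers, then an IFFT as in OFDM/MC-CDMA.
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=n_subcarriers)
time_signal = np.fft.ifft(symbols)
print(round(papr_db(time_signal), 2))
```

For N subcarriers the worst case is 10·log10(N) dB (all subcarriers aligning in phase), about 18 dB here; typical random blocks land several dB above a constant-envelope signal's 0 dB, which is why the amplifier must either back off or tolerate nonlinear distortion.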
Abstract:
Effective interaction with personal computers is a basic requirement for many of the functions performed in our daily lives. With the rapid emergence of the Internet and the World Wide Web, computers have become one of the premier means of communication in our society. Unfortunately, these advances have not become equally accessible to physically handicapped individuals. In reality, a significant number of individuals with severe motor disabilities, due to a variety of causes such as Spinal Cord Injury (SCI) and Amyotrophic Lateral Sclerosis (ALS), may not be able to use the computer mouse as a vital input device for computer interaction. The purpose of this research was to further develop and improve an existing alternative input device for computer cursor control to be used by individuals with severe motor disabilities. This thesis describes the development of, and the underlying principle for, a practical hands-off human-computer interface based on Electromyogram (EMG) signals and Eye Gaze Tracking (EGT) technology, compatible with the Microsoft Windows operating system (OS). Results with the software developed in this thesis show a significant improvement in the performance and usability of the EMG/EGT cursor-control HCI.