183 results for Non Ideal System


Relevance: 30.00%

Abstract:

With the advent of large-scale wind farms and their integration into electrical grids, more uncertainties, constraints and objectives must be considered in power system development. It is therefore necessary to introduce risk-control strategies into the planning of transmission systems connected with wind power generators. This paper presents a probability-based multi-objective model equipped with three risk-control strategies. The model is developed to evaluate and enhance the ability of the transmission system to protect against overload risks when wind power is integrated into the power system. The model involves: (i) defining the uncertainties associated with wind power generators with probability measures and calculating the probabilistic power flow with the combined use of cumulants and Gram-Charlier series; (ii) developing three risk-control strategies by specifying the smallest acceptable non-overload probability for each branch and the whole system, and specifying the non-overload margin for all branches in the whole system; (iii) formulating an overload risk index based on the non-overload probability and the non-overload margin defined; and (iv) developing a multi-objective transmission system expansion planning (TSEP) model with objective functions composed of transmission investment and the overload risk index. The presented work represents a superior risk-control model for TSEP in terms of security, reliability and economy. The transmission expansion planning model with the three risk-control strategies demonstrates its feasibility in case studies using two typical power systems.
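As a hedged illustration of step (i), the sketch below uses the first four cumulants of a branch flow to build a Gram-Charlier A-series approximation of its CDF and reads off a non-overload probability against a thermal limit. The cumulant values and the 100 MW limit are invented placeholders, not data from the paper.

```python
import numpy as np
from scipy.stats import norm

def gram_charlier_cdf(x, cumulants):
    """Approximate CDF at x from the first four cumulants (kappa1..kappa4)
    using a Gram-Charlier A-series expansion around the normal distribution."""
    k1, k2, k3, k4 = cumulants
    sigma = np.sqrt(k2)
    z = (x - k1) / sigma
    g1 = k3 / sigma**3          # skewness
    g2 = k4 / sigma**4          # excess kurtosis
    he2 = z**2 - 1              # probabilists' Hermite polynomials
    he3 = z**3 - 3 * z
    return norm.cdf(z) - norm.pdf(z) * (g1 / 6 * he2 + g2 / 24 * he3)

# Illustrative cumulants of a branch power flow (MW) -- assumed values only.
branch_cumulants = (80.0, 25.0, 12.0, 30.0)
branch_limit = 100.0            # thermal limit of the branch (MW)

# Non-overload probability P(flow <= limit), one ingredient of an overload risk index.
p_non_overload = gram_charlier_cdf(branch_limit, branch_cumulants)
print(f"non-overload probability: {p_non_overload:.4f}")
```

In the model described above, such branch-level probabilities would then feed the risk-control constraints and the overload risk index.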

Relevance: 30.00%

Abstract:

A sub-optimal resource allocation algorithm for an Orthogonal Frequency Division Multiplexing (OFDM) based cooperative scheme is proposed. The system consists of multiple relays. The subcarrier space is divided into blocks, and relays participating in cooperation are allocated specific blocks to be used with a user. To ensure unique subcarrier assignment, the system is constrained so that the same block cannot be used by more than one user. Users are given fair block assignments, while no restriction is placed on the maximum number of blocks a relay can employ. Forced cost based decisions [1] are used for block allocation. Simulation results show that this scheme outperforms a non-cooperating scheme with sequential allocation with respect to power usage.
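The allocation constraints can be illustrated with a minimal greedy sketch; this is not the forced-cost algorithm of [1], whose details are not given here, and the cost matrix and dimensions are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_relays, n_blocks = 4, 3, 12

# cost[u, r, b]: illustrative power cost of serving user u via relay r on block b.
cost = rng.uniform(0.1, 1.0, size=(n_users, n_relays, n_blocks))

blocks_per_user = n_blocks // n_users      # fair share of blocks per user
assignment = {}                            # block -> (user, relay)
blocks_used = {u: 0 for u in range(n_users)}

# Greedy pass: repeatedly pick the cheapest remaining (user, relay, block)
# triple such that the block is still free and the user is below its fair share.
candidates = sorted(
    ((cost[u, r, b], u, r, b)
     for u in range(n_users) for r in range(n_relays) for b in range(n_blocks)),
    key=lambda t: t[0])

for c, u, r, b in candidates:
    if b not in assignment and blocks_used[u] < blocks_per_user:
        assignment[b] = (u, r)             # unique block assignment
        blocks_used[u] += 1

total_cost = sum(cost[u, r, b] for b, (u, r) in assignment.items())
print(f"assigned {len(assignment)} blocks, total power cost {total_cost:.3f}")
```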

Relevance: 30.00%

Abstract:

• Road crashes as a cause of disability
• Disability in the study of road safety
• Thai spinal injury study
  – Contextual information – beliefs and community
  – Transport system and hidden safety costs
  – Cambodia experience
  – Pakistan fatalism study
• Feedback to policies and programs

Relevance: 30.00%

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcriptional regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors grouped by their regulatory role and the corresponding promoter strength. Our study of E.coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. Some of these features also proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E.coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters where the corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigations when more experimentally confirmed data become available.
Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E.coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While the common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships, and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied; this core set potentially identifies basic regulatory processes essential for survival.
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y.pestis and P.aeruginosa respectively, but were not present in E.coli or B.subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
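As a hedged illustration of the spectrum-kernel classification used in the binding-site study, the sketch below maps sequences to k-mer count vectors (whose inner products form the spectrum kernel) and trains a linear SVM on them. The sequences, labels and choice of k = 3 are invented placeholders; this is not the thesis pipeline or its feature set.

```python
from itertools import product
import numpy as np
from sklearn.svm import SVC

K = 3
ALPHABET = "ACGT"
KMER_INDEX = {"".join(p): i for i, p in enumerate(product(ALPHABET, repeat=K))}

def spectrum_features(seq, k=K):
    """k-mer count vector; the inner product of two such vectors is the spectrum kernel."""
    v = np.zeros(len(KMER_INDEX))
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in KMER_INDEX:
            v[KMER_INDEX[kmer]] += 1
    return v

# Toy positive (binding-site-like) and negative (background) sequences -- placeholders only.
positives = ["TGTGATCTAGATCACA", "TGTGACGTAGGTCACA", "TTTGATCGAGATCACT"]
negatives = ["GCGCGGCCGCCGGGCG", "ATATATATATATATAT", "CCGGAACCGGTTCCGG"]

X = np.array([spectrum_features(s) for s in positives + negatives])
y = np.array([1] * len(positives) + [0] * len(negatives))

# A linear kernel on spectrum feature vectors is equivalent to the spectrum kernel.
clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([spectrum_features("TGTGATCTAGGTCACA")]))  # likely class 1: resembles the positives
```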

Relevance: 30.00%

Abstract:

A frame-rate stereo vision system, based on non-parametric matching metrics, is described. Traditional metrics, such as normalized cross-correlation, are expensive in terms of logic. Non-parametric measures require only simple, parallelizable functions such as comparators, counters and exclusive-or, and are thus very well suited to implementation in reprogrammable logic.
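A hedged sketch of one common non-parametric metric, the census transform with a Hamming-distance matching cost (the abstract does not name the specific metrics used): each pixel is encoded by comparing it with its neighbours, and the matching cost reduces to exclusive-or plus a population count, exactly the comparator and counter style of operation that maps well to reprogrammable logic. The images and disparity range below are synthetic.

```python
import numpy as np

def census_transform(img, window=3):
    """Encode each pixel as a bit string of comparisons against its neighbours."""
    r = window // 2
    h, w = img.shape
    codes = np.zeros((h, w), dtype=np.uint32)
    padded = np.pad(img, r, mode="edge")
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = padded[r + dy:r + dy + h, r + dx:r + dx + w]
            codes = (codes << 1) | (neighbour < img).astype(np.uint32)
    return codes

def hamming_cost(left_codes, right_codes, disparity):
    """Matching cost at a given disparity: Hamming distance, i.e. XOR plus popcount."""
    shifted = np.roll(right_codes, -disparity, axis=1)
    xor = left_codes ^ shifted
    bits = np.unpackbits(xor.view(np.uint8), axis=-1)       # popcount via byte unpacking
    return bits.reshape(*xor.shape, -1).sum(-1)

# Toy image pair: the right image is the left shifted by 2 pixels -- placeholders only.
rng = np.random.default_rng(1)
left = rng.integers(0, 256, (32, 32)).astype(np.uint8)
right = np.roll(left, 2, axis=1)

cl, cr = census_transform(left), census_transform(right)
costs = [hamming_cost(cl, cr, d).mean() for d in range(5)]
print("mean cost per disparity:", np.round(costs, 2))       # minimum expected at d = 2
```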

Relevance: 30.00%

Abstract:

Evaluating the validity of formative variables has presented ongoing challenges for researchers. In this paper we use global criterion measures to compare and critically evaluate two alternative formative measures of System Quality. One model is based on the ISO-9126 software quality standard, and the other is based on a leading information systems research model. We find that despite both models having a strong provenance, many of the items appear to be non-significant in our study. We examine the implications of this by evaluating the quality of the criterion variables we used, and the performance of PLS when evaluating formative models with a large number of items. We find that our respondents had difficulty distinguishing between global criterion variables measuring different aspects of overall System Quality. Also, because formative indicators “compete with one another” in PLS, it may be difficult to develop a set of measures which are all significant for a complex formative construct with a broad scope and a large number of items. Overall, we suggest that there is cautious evidence that both sets of measures are valid and largely equivalent, although questions still remain about the measures, the use of criterion variables, and the use of PLS for this type of model evaluation.

Relevance: 30.00%

Abstract:

Here, mixed convection boundary-layer flow of a viscous fluid along a heated vertical semi-infinite plate is investigated in a non-absorbing medium. The relationship between convection and thermal radiation is established via a boundary condition of the second kind on the thermally radiating vertical surface. The governing boundary-layer equations are transformed into dimensionless parabolic partial differential equations with the help of appropriate transformations, and the resultant system is solved numerically by applying a straightforward finite difference method along with the Gaussian elimination technique. It is worth noting that the Prandtl number, Pr, is taken to be small (<< 1), which is appropriate for liquid metals. Moreover, the numerical results are demonstrated graphically by showing the effects of important physical parameters, namely the modified Richardson number (or mixed convection parameter), Ri*, and the surface radiation parameter, R, in terms of local skin friction and local Nusselt number coefficients.
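The implicit finite-difference equations at each marching step of such schemes typically reduce to tridiagonal linear systems; below is a minimal sketch of the Gaussian-elimination (Thomas) solver for one such system, with illustrative coefficients rather than the actual discretised boundary-layer equations.

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system: a = sub-diagonal, b = diagonal, c = super-diagonal, d = RHS."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Toy tridiagonal system of the kind produced by a 1-D implicit discretisation (illustrative only).
n = 6
a = np.full(n, -1.0); a[0] = 0.0               # sub-diagonal (a[0] unused)
c = np.full(n, -1.0); c[-1] = 0.0              # super-diagonal (c[-1] unused)
b = np.full(n, 2.2)                            # main diagonal
d = np.ones(n)                                 # right-hand side
print(np.round(thomas_solve(a, b, c, d), 4))
```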

Relevance: 30.00%

Abstract:

Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be handled well by average Internet users. The management of secure passwords, for example, creates an extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches are applicable only for initial logins and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, where keystroke dynamics based on free text is used continuously to verify the identity of a user in real time. We improve existing keystroke dynamics based verification schemes in four aspects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into consideration in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set which we collected from users while they were interacting with their mailboxes during their daily activities.
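As a hedged, generic illustration (not the paper's distance measure, which is not specified here), the sketch below builds a digraph-latency profile from free-text keystroke timings and scores a test sample by its mean relative deviation from that profile; all timings are invented.

```python
import numpy as np

def digraph_latencies(timestamps):
    """Map a typing sample [(key, press_time_ms), ...] to mean latency per key pair (digraph)."""
    latencies = {}
    for (k1, t1), (k2, t2) in zip(timestamps, timestamps[1:]):
        latencies.setdefault(k1 + k2, []).append(t2 - t1)
    return {dg: float(np.mean(v)) for dg, v in latencies.items()}

def profile_distance(profile, sample):
    """Mean relative latency difference over digraphs shared by profile and sample
    (a generic distance; smaller means the sample looks more like the profile)."""
    shared = profile.keys() & sample.keys()
    if not shared:
        return float("inf")
    return float(np.mean([abs(profile[d] - sample[d]) / profile[d] for d in shared]))

# Invented typing samples: (key, press timestamp in ms).
reference = [("t", 0), ("h", 110), ("e", 205), ("t", 340), ("h", 455), ("e", 550)]
test_same = [("t", 0), ("h", 120), ("e", 210)]
test_other = [("t", 0), ("h", 310), ("e", 540)]

prof = digraph_latencies(reference)
print(profile_distance(prof, digraph_latencies(test_same)))    # small distance: same "user"
print(profile_distance(prof, digraph_latencies(test_other)))   # larger distance: different "user"
```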

Relevance: 30.00%

Abstract:

There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement for expensive devices, the risk of stolen bio-templates, etc. Moreover, existing approaches usually perform the authentication process only once, at the initial login. Non-intrusive and continuous monitoring of user activities emerges as a promising solution for hardening the authentication process, adding a further category: iii-2) how someone behaves. In recent years, various keystroke-dynamics behavior-based approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions. Unsolved problems include, for instance, scalability, high response times and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.

Relevance: 30.00%

Abstract:

Past work has clearly demonstrated that numerous commonly used metallic materials will support burning in oxygen, especially at higher pressures. An approach to rectify this significant safety problem has been successfully developed and implemented by applying the concept of Situational Non-Flammability. This approach essentially removes or breaks one leg of the conceptual fire triangle, a tool commonly used to define the three things required to support burning: a fuel, an ignition source and an oxidiser. Since an oxidiser is always present in an oxygen system, as are ignition sources, the concept of Situational Non-Flammability essentially removes the fuel leg of the fire triangle by utilising only materials that will not burn at the maximum pressure at which the component, for example a control valve, is to be used. The utilisation of this approach has led to the development of a range of oxygen components that are practically unable to burn while in service at their design pressure, thus providing an unparalleled level of fire safety while not compromising the performance or endurance required in the function of these components. This paper describes the concept of Situational Non-Flammability, how it was used to theoretically evaluate designs of components for oxygen service, and the outcomes of the actual development, fabrication and, finally, utilisation of these components in real oxygen systems in a range of flow control devices.
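A hedged sketch of the selection rule implied by Situational Non-Flammability: a material is accepted for a component only if it will not burn at that component's maximum design pressure. The material names and threshold pressures are invented placeholders, not flammability test data.

```python
# Illustrative highest non-burn pressures (MPa) per candidate material -- invented
# placeholders, not measured flammability data.
NON_BURN_PRESSURE_MPA = {
    "alloy_A": 34.5,
    "alloy_B": 10.3,
    "alloy_C": 3.4,
}

def situationally_non_flammable(material, design_pressure_mpa):
    """Accept a material only if it will not burn at the component's maximum design
    pressure, removing the 'fuel' leg of the fire triangle for that situation."""
    return NON_BURN_PRESSURE_MPA.get(material, 0.0) >= design_pressure_mpa

design_pressure = 20.0  # maximum service pressure of the valve (MPa), illustrative
accepted = [m for m in NON_BURN_PRESSURE_MPA if situationally_non_flammable(m, design_pressure)]
print(accepted)         # only materials that cannot burn at 20 MPa remain
```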

Relevance: 30.00%

Abstract:

The overrepresentation of students from minority ethnic groups in separate special education settings has been extensively documented in North America, yet little research exists for Australian school systems. To address this gap, we systematically analyzed 13 years of enrolment data from the state of New South Wales. Stark differences are seen in patterns of enrolment between Indigenous students, students from a Language Background Other Than English (LBOTE), and non-Indigenous English-speaking students. Moreover, these differences are increasing. While enrolments of Indigenous students in separate settings increased faster across time than did enrolments of Indigenous students in mainstream settings, enrolments of LBOTE students in mainstream settings increased faster than did enrolments of LBOTE students in separate settings.

Relevance: 30.00%

Abstract:

Trajectory basis Non-Rigid Structure From Motion (NRSFM) currently faces two problems: the limit of reconstructability and the need to tune the basis size for different sequences. This paper provides a novel theoretical bound on 3D reconstruction error, arguing that the existing definition of reconstructability is fundamentally flawed in that it fails to consider system condition. This insight motivates a novel strategy whereby the trajectory's response to a set of high-pass filters is minimised. The new approach eliminates the need to tune the basis size and is more efficient for long sequences. Additionally, the truncated DCT basis is shown to have a dual interpretation as a high-pass filter. The success of trajectory filter reconstruction is demonstrated quantitatively on synthetic projections of real motion capture sequences and qualitatively on real image sequences.
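The dual interpretation of the truncated DCT basis can be illustrated with a hedged sketch: reconstructing a trajectory from its first K DCT coefficients keeps only low frequencies, so the residual is what a high-pass filter would pass. The trajectory and the value of K below are synthetic choices, not the paper's experiments.

```python
import numpy as np
from scipy.fftpack import dct, idct

F, K = 100, 10                                  # frames, truncated basis size
t = np.linspace(0, 1, F)
trajectory = np.sin(2 * np.pi * 1.5 * t) + 0.05 * np.random.default_rng(0).normal(size=F)

# Project onto the first K DCT basis vectors (low frequencies) and reconstruct.
coeffs = dct(trajectory, norm="ortho")
coeffs[K:] = 0.0
low_pass = idct(coeffs, norm="ortho")

# The residual is exactly the content the truncated basis cannot represent,
# i.e. the output of the complementary high-pass filter.
high_pass_residual = trajectory - low_pass
print(f"energy kept by truncated basis: {np.sum(low_pass**2) / np.sum(trajectory**2):.3f}")
print(f"residual (high-pass) energy:    {np.sum(high_pass_residual**2) / np.sum(trajectory**2):.3f}")
```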

Relevance: 30.00%

Abstract:

Unmanned Aerial Vehicles (UAVs) have become a significant and growing segment of the global aviation industry. These vehicles are developed with the intention of operating in regions where the presence of onboard human pilots is either too risky or unnecessary. Their popularity with both the military and civilian sectors has seen UAVs used in a diverse range of applications, from reconnaissance and surveillance tasks for the military to civilian uses such as aid relief and monitoring tasks. Efficient energy utilisation on a UAV is essential to its functioning, often in order to achieve the operational goals of range, endurance and other specific mission requirements. Due to the limitations of the space available and the mass budget on the UAV, striking a balance between the onboard energy available (i.e. fuel) and the operational goals is often delicate. This paper presents the development of a parallel Hybrid Electric Propulsion System (HEPS) for a small fixed-wing UAV incorporating an Ideal Operating Line (IOL) control strategy. A simulation model of a UAV was developed in the MATLAB Simulink environment, utilising the AeroSim Blockset and the in-built Aerosonde UAV block and its parameters. An IOL analysis of an Aerosonde engine was performed, and the most efficient operating points for this engine (i.e. those providing the greatest torque output for the least fuel consumption) were determined. Simulation models of the components of a HEPS were designed and constructed in the MATLAB Simulink environment. It was demonstrated through simulation that a UAV with the current HEPS configuration was capable of achieving a fuel saving of 6.5% compared to the internal combustion engine (ICE)-only configuration. These components form the basis for the development of a complete simulation model of a Hybrid-Electric UAV (HEUAV).
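A hedged sketch of the IOL idea on a synthetic engine map (invented numbers, not Aerosonde data): for each power demand, the operating point near the corresponding constant-power curve with the lowest brake-specific fuel consumption is selected, and the locus of such points forms the Ideal Operating Line.

```python
import numpy as np

# Synthetic engine map: brake-specific fuel consumption (g/kWh) over a grid of
# engine speeds (rpm) and torques (Nm) -- invented numbers, not Aerosonde data.
speeds = np.linspace(2000, 6000, 41)                 # rpm
torques = np.linspace(1, 12, 45)                     # Nm
S, T = np.meshgrid(speeds, torques, indexing="ij")
bsfc = 280 + 0.00002 * (S - 4000) ** 2 + 1.5 * (T - 8) ** 2   # bowl-shaped efficiency map
power_map = S * 2 * np.pi / 60 * T                   # mechanical power (W)

def ideal_operating_line(power_demands_w, tol=0.05):
    """For each demanded power, pick the grid point near that constant-power curve
    with the lowest BSFC; the locus of such points is the Ideal Operating Line."""
    iol = []
    for p in power_demands_w:
        on_curve = np.abs(power_map - p) <= tol * p  # points close to the demanded power
        masked = np.where(on_curve, bsfc, np.inf)
        i, j = np.unravel_index(np.argmin(masked), masked.shape)
        iol.append((speeds[i], torques[j]))
    return iol

for p, (rpm, tq) in zip((500, 1500, 3000), ideal_operating_line((500, 1500, 3000))):
    print(f"{p:5.0f} W -> run engine at {rpm:.0f} rpm, {tq:.2f} Nm")
```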

Relevance: 30.00%

Abstract:

Small interfering RNA silences specific genes by interfering with mRNA translation, and acts to modulate or inhibit specific biological pathways; a therapy that holds great promise for the cure of many diseases. However, naked small interfering RNA is susceptible to degradation by plasma and tissue nucleases and, due to its negative charge, is unable to cross the cell membrane. Here we report a new polymer carrier designed to mimic the influenza virus escape mechanism from the endosome, followed by a timed release of the small interfering RNA in the cytosol through a self-catalyzed polymer degradation process. Our polymer changes to a negatively charged and non-toxic polymer after the release of the small interfering RNA, presenting potential for multiple repeat doses and long-term treatment of diseases.

Relevance: 30.00%

Abstract:

We propose a new active noise control (ANC) technique. The technique has a feedback structure, giving it a simple configuration for practical implementation. In this approach, the secondary path is modelled online to ensure convergence of the system, since in practice secondary paths are time-varying or non-linear. The proposed method consists of two components: a noise controller which is based on a modified FxLMS algorithm, and a new variable step size (VSS) LMS algorithm which is used to adapt the modelling filter to the secondary path. The proposed algorithm stops injection of the white noise at the optimum point and reactivates the injection during operation, if needed, to maintain the performance of the system. Eliminating continuous injection of the white noise increases the performance of the proposed method significantly and makes it more desirable for practical ANC systems. Computer simulations are presented to show the effectiveness of the proposed method.
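For orientation, the sketch below implements the standard filtered-x LMS loop with a fixed, known secondary-path estimate; the paper's modified FxLMS, online secondary-path modelling and VSS LMS rule are not reproduced here. All paths, signals and step sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5000
x = rng.normal(size=N)                      # reference noise signal (illustrative)
p = np.array([0.0, 0.3, 0.5, 0.2])          # primary path, noise source to error mic (assumed)
s = np.array([0.8, 0.3])                    # true secondary path, speaker to error mic (assumed)
s_hat = s.copy()                            # secondary-path estimate (here taken as known and fixed)

L = 16                                      # adaptive control filter length
w = np.zeros(L)                             # control filter weights
mu = 0.002                                  # LMS step size

d = np.convolve(x, p)[:N]                   # disturbance at the error microphone
xf = np.convolve(x, s_hat)[:N]              # filtered-x: reference passed through secondary-path model
y_buf = np.zeros(len(s))                    # recent anti-noise samples feeding the secondary path
e = np.zeros(N)

for n in range(L, N):
    x_vec = x[n - L + 1:n + 1][::-1]        # most recent L reference samples
    y = w @ x_vec                           # anti-noise output
    y_buf = np.concatenate(([y], y_buf[:-1]))
    e[n] = d[n] + s @ y_buf                 # residual measured at the error microphone
    xf_vec = xf[n - L + 1:n + 1][::-1]
    w -= mu * e[n] * xf_vec                 # FxLMS weight update

print(f"mean squared error, early:  {np.mean(e[L:500]**2):.4f}")
print(f"mean squared error, final:  {np.mean(e[-500:]**2):.4f}")  # drops as the controller adapts
```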