73 results for "two stage least square"
Abstract:
The design of a two-stage differential cascode power amplifier (PA) for 81-86 GHz E-band applications is presented. The PA was realised in a SiGe technology with fT/fmax of 170/250 GHz. A broadband transformer with efficiency higher than 79.4% from 71 GHz to 96 GHz is used as a balun. The PA delivers 4.5 dBm saturated output power and exhibits 13.4 dB gain at 83.6 GHz. The input and output return losses agree well with the design specifications.
Abstract:
This piece highlights and offers a brief analysis of the most important of the proposed changes to Polish competition law. The draft proposal envisages the introduction of, inter alia, financial penalties for individuals, a two-stage merger review process, important changes to the leniency program (including the introduction of leniency plus), as well as new tools such as remedies and settlements.
Abstract:
Feature analysis is an important task that can significantly affect the performance of automatic bacterial colony picking, and unstructured environments further complicate automatic colony screening. This paper presents a novel approach to adaptive colony segmentation in unstructured environments by treating the detected peaks of intensity histograms as a morphological feature of the images. To avoid spurious peaks, an entropy-based mean shift filter is introduced to smooth the images as a preprocessing step. The relevance and importance of these features are determined in an improved support vector machine classifier using unascertained least squares estimation. Experimental results show that the proposed unascertained least squares support vector machine (ULSSVM) has better recognition accuracy than the other state-of-the-art techniques, and that its training takes less time than most of the traditional approaches considered in this paper.
Abstract:
In this paper, the impact of multiple active eavesdroppers on cooperative single-carrier systems with multiple relays and multiple destinations is examined. To achieve secrecy diversity gains in the form of opportunistic selection, a two-stage scheme is proposed for joint relay and destination selection, in which, after the selection of the relay with the minimum effective maximum signal-to-noise ratio (SNR) to a cluster of eavesdroppers, the destination that has the maximum SNR from the chosen relay is selected. In order to accurately assess the secrecy performance, exact and asymptotic closed-form expressions are obtained for several security metrics, including the secrecy outage probability, the probability of non-zero secrecy rate, and the ergodic secrecy rate in frequency-selective fading. Based on the asymptotic analysis, key design parameters such as the secrecy diversity gain, secrecy array gain, secrecy multiplexing gain, and power cost are characterized, from which new insights are drawn. Moreover, it is concluded that secrecy performance limits occur when the average received power at the eavesdropper is proportional to the counterpart at the destination. Specifically, for the secrecy outage probability, it is confirmed that the secrecy diversity gain collapses to zero with an outage floor, whereas for the ergodic secrecy rate, it is confirmed that its slope collapses to zero with a capacity ceiling.
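The two-stage selection rule described above can be sketched as follows. This is a minimal illustration under assumed inputs (SNR matrices with made-up names and shapes); the paper itself treats the scheme analytically:

```python
import numpy as np

def two_stage_selection(snr_relay_to_eves, snr_relay_to_dests):
    """Sketch of two-stage joint relay/destination selection.

    snr_relay_to_eves:  (K, E) array, SNR from relay k to eavesdropper e.
    snr_relay_to_dests: (K, D) array, SNR from relay k to destination d.
    Names and shapes are illustrative assumptions, not from the paper.
    """
    # Stage 1: pick the relay whose strongest link to any eavesdropper
    # (its "effective maximum SNR" to the eavesdropper cluster) is weakest.
    relay = int(np.argmin(snr_relay_to_eves.max(axis=1)))
    # Stage 2: pick the destination with the best SNR from that relay.
    dest = int(np.argmax(snr_relay_to_dests[relay]))
    return relay, dest
```

Stage 1 protects against the worst-case eavesdropper; stage 2 then maximises the legitimate link quality given that choice.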
Abstract:
Polymer extrusion is regarded as an energy-intensive production process, and real-time monitoring of both energy consumption and melt quality has become necessary to meet new carbon regulations and to survive in the highly competitive plastics market. The use of a power meter is a simple way to monitor energy, but the cost can sometimes be high. On the other hand, viscosity is regarded as one of the key indicators of melt quality in the polymer extrusion process. Unfortunately, viscosity cannot be measured directly with current sensor technology. The employment of on-line, in-line or off-line rheometers is sometimes useful, but these instruments either introduce signal delay or restrict flow in the extrusion process, which is clearly not suitable for real-time monitoring and control in practice. In this paper, simple and accurate real-time energy monitoring methods are developed. This is achieved by looking inside the controller and using control variables to calculate the power consumption. For viscosity monitoring, a ‘soft-sensor’ approach based on an RBF neural network model is developed. The model is obtained through two-stage selection and differential evolution, enabling compact and accurate solutions for viscosity monitoring. The proposed monitoring methods were tested and validated on a Killion KTS-100 extruder, and the experimental results show high accuracy compared with traditional monitoring approaches.
Abstract:
In the production process of polyethylene terephthalate (PET) bottles, the initial temperature of the preforms plays a central role in the final thickness, intensity and other structural properties of the bottles. The difference between the inside and outside temperature profiles can also have a significant impact on the final product quality. The preforms are preheated by an infrared heating oven, which is often operated open loop and relies heavily on a trial-and-error approach to adjust the lamp power settings. In this paper, a radial basis function (RBF) neural network model, optimized by a two-stage selection (TSS) algorithm combined with particle swarm optimization (PSO), is developed to model the nonlinear relations between the lamp power settings and the output temperature profile of PET bottles. An improved PSO method for lamp setting adjustment using this model is then presented. Simulation results based on experimental data confirm the effectiveness of the modelling and optimization method.
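For concreteness, the forward pass of an RBF network of the kind described above can be sketched as below. The centres, widths and weights would in practice come from the TSS + PSO procedure the abstract mentions; all names here are illustrative assumptions:

```python
import numpy as np

def rbf_predict(x, centres, widths, weights):
    """Forward pass of a single-output Gaussian RBF network (sketch).

    x:       (n,) input vector, e.g. lamp power settings.
    centres: (m, n) hidden-unit centres; widths: (m,); weights: (m,).
    All parameter names are illustrative, not taken from the paper.
    """
    # Gaussian hidden units: phi_j = exp(-||x - c_j||^2 / (2 s_j^2))
    d2 = ((centres - x) ** 2).sum(axis=1)
    phi = np.exp(-d2 / (2.0 * widths ** 2))
    # Linear output layer.
    return float(phi @ weights)
```

The "two-stage selection" would choose which centres to keep, and PSO would tune the remaining parameters; this sketch only shows the resulting model's prediction step.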
Abstract:
We analyze a two-stage quantity-setting oligopolistic price discrimination game. In the first stage, firms choose capacities; in the second stage, they simultaneously choose the share they assign to each segment. In equilibrium, the firms focus more on the high-valuation customers. When the capacities in the first stage are endogenous, the deadweight loss does not vanish with the level of price discrimination, as it does in one-stage games and under monopoly. Moreover, the quantity-weighted average price increases with the level of price discrimination, in contrast to established results in the literature for one-stage games.
Abstract:
One of the major challenges in systems biology is to understand the complex responses of a biological system to external perturbations or internal signalling, depending on its biological conditions. Genome-wide transcriptomic profiling of cellular systems under various chemical perturbations allows certain features of the chemicals to manifest through their transcriptomic expression profiles. The insights obtained may help to establish connections between human diseases, associated genes and therapeutic drugs. The main objective of this study was to systematically analyse cellular gene expression data under various drug treatments to elucidate drug-feature-specific transcriptomic signatures. We first extracted drug-related information (drug features) from the collected textual descriptions of DrugBank entries using text-mining techniques. A novel statistical method employing orthogonal least squares learning was proposed to obtain drug-feature-specific signatures by integrating gene expression with DrugBank data. To obtain robust signatures from noisy input datasets, a stringent ensemble approach was applied, combining three techniques: resampling, leave-one-out cross-validation, and aggregation. The validation experiments showed that the proposed method is capable of extracting biologically meaningful drug-feature-specific gene expression signatures. Regulatory network analysis also showed that most of the signature genes are connected with common hub genes, and Gene Ontology analysis further showed that these hub genes are related to general drug metabolism. Each set of genes has relatively few interactions with other sets, indicating the modular nature of each signature and its drug-feature specificity. Based on Gene Ontology analysis, we also found that each set of drug-feature (DF)-specific genes was indeed enriched in biological processes related to the drug feature. The results of these experiments demonstrate the potential of the method for predicting certain features of new drugs using their transcriptomic profiles, providing a useful methodological framework and a valuable resource for drug development and characterization.
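The orthogonal least squares learning mentioned above can be sketched as a greedy forward selection driven by the error reduction ratio. This is a generic OLS sketch under assumed variable names (X as a gene-expression matrix, y as a drug-feature label), not the paper's exact procedure:

```python
import numpy as np

def ols_forward_select(X, y, n_select):
    """Greedy orthogonal-least-squares forward selection (sketch).

    Repeatedly picks the column of X that explains the most residual
    variance of y, then orthogonalises the remaining candidates against
    the pick. Variable names are illustrative, not from the paper.
    """
    Xw = X.astype(float).copy()
    selected = []
    for _ in range(n_select):
        # Error reduction ratio of each remaining candidate column.
        num = (Xw.T @ y) ** 2
        den = (Xw ** 2).sum(axis=0) * (y @ y)
        err = np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)
        err[selected] = -1.0  # never re-pick a chosen column
        k = int(np.argmax(err))
        selected.append(k)
        # Orthogonalise all columns against the chosen one (column k -> 0).
        w = Xw[:, k].copy()
        coef = (Xw.T @ w) / (w @ w)
        Xw = Xw - np.outer(w, coef)
    return selected
```

In the paper this selection is wrapped in resampling, leave-one-out cross-validation and aggregation to make the resulting signatures robust; that ensemble layer is omitted here.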
Abstract:
This paper considers the problem of identification for a high-dimensional nonlinear non-parametric system when only a limited data set is available. Algorithms are proposed that exploit the relationship between the input variables and the output, and further the inter-dependence among the input variables, so that the importance of each input variable can be established. A key to these algorithms is the non-parametric two-stage input selection algorithm.
Abstract:
Realistic Evaluation of EWS and ALERT: factors enabling and constraining implementation
Background
The implementation of EWS and ALERT in practice is essential to the success of Rapid Response Systems, but depends on nurses utilising EWS protocols and applying ALERT best-practice guidelines. To date there is limited evidence on the effectiveness of EWS or ALERT, as research has primarily focused on measuring patient outcomes (cardiac arrests, ICU admissions) following the implementation of a Rapid Response Team. Complex interventions in healthcare aimed at changing service delivery and the related behaviour of health professionals require a different research approach to evaluate the evidence. To understand how and why EWS and ALERT work, or might not work, research needs to consider the social, cultural and organisational influences that impact on successful implementation in practice. This requires a research approach that considers both the processes and outcomes of complex interventions, such as EWS and ALERT, implemented in practice. Realistic Evaluation is such an approach and was used to explain the factors that enable and constrain the implementation of EWS and ALERT in practice [1].
Aim
The aim of this study was to evaluate the factors that enabled and constrained the implementation and service delivery of early warning systems (EWS) and ALERT in practice, in order to provide direction for enabling their success and sustainability.
Methods
The research design was a multiple case study of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory. A variety of mixed methods was used to test the programme theories, including individual and focus group interviews, observation and documentary analysis, in a two-stage process. A purposive sample of 75 key informants participated in individual and focus group interviews. Observation and documentary analysis of EWS compliance data and ALERT training records provided further evidence to support or refute the interview findings. Data were analysed using NVivo 8 to categorise interview findings and SPSS for ALERT documentary data. These findings were further synthesised through within-case and cross-case comparison to explain the factors enabling and constraining EWS and ALERT.
Results
A cross-case analysis highlighted similarities, differences and factors enabling or constraining successful implementation across the case study sites. Findings showed that personal (confidence; clinical judgement; personality), social (ward leadership; communication), organisational (workload and staffing issues; pressure from managers to complete EWS audits and targets), educational (constraints on training; no clinical educator on the ward) and cultural (routine tasks delegated) influences impact on EWS and acute care training outcomes. Differences were also noted between medical and surgical wards across both case sites.
Conclusions
Realist Evaluation allows refinement and development of the RRS programme theory to explain the realities of practice. These refined RRS programme theories can inform the planning of future service provision and provide direction for enabling their success and sustainability.
References:
1. McGaughey J., Blackwood B., O'Halloran P., Trinder T. J. & Porter S. (2010) A realistic evaluation of Track and Trigger systems and acute care training for early recognition and management of deteriorating ward-based patients. Journal of Advanced Nursing 66(4), 923-932.
Type of submission: Concurrent session
Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University Belfast
Abstract:
This research presents a fast algorithm for projected support vector machines (PSVM). By selecting a basis vector set (BVS) for the kernel-induced feature space, the training points are projected onto the subspace spanned by the selected BVS. A standard linear support vector machine (SVM) is then produced in the subspace with the projected training points. As the dimension of the subspace is determined by the size of the selected basis vector set, the size of the produced SVM expansion can be specified. A two-stage algorithm is derived which selects and refines the basis vector set, achieving a locally optimal model. The model expansion coefficients and bias are updated recursively as the basis set and support vector set grow and shrink. The condition for a point to lie outside the span of the current basis vector set, and hence be selected as a new basis vector, is derived and embedded in the recursive procedure. This guarantees the linear independence of the produced basis set. The proposed algorithm is tested and compared with an existing sparse primal SVM (SpSVM) and a standard SVM (LibSVM) on seven public benchmark classification problems. The new algorithm is designed for human activity recognition using smart devices and embedded sensors, where sometimes limited memory and processing resources must be exploited to the full, and where more robust and accurate classification improves user satisfaction. Experimental results demonstrate the effectiveness and efficiency of the proposed algorithm. This work builds upon a previously published algorithm created for activity recognition within mobile applications for the EU Haptimap project [1]. The algorithms detailed in this paper are more memory- and resource-efficient, making them suitable for larger data sets and more easily trained SVMs.
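The basis-vector admission condition referred to above can be written generically as a kernel residual test: a candidate is accepted only if its feature-space image lies outside the span of the current basis. A minimal sketch, with illustrative names and an assumed tolerance (the paper derives its own condition and updates it recursively):

```python
import numpy as np

def is_new_basis_vector(K_bb, k_bx, k_xx, tol=1e-6):
    """Generic span test in a kernel-induced feature space (sketch).

    K_bb: Gram matrix of the current basis vectors.
    k_bx: kernel values between each basis vector and the candidate.
    k_xx: kernel value of the candidate with itself.
    tol is an assumed threshold, not taken from the paper.
    """
    # Squared distance from phi(x) to its projection on span(basis):
    #   k(x, x) - k_bx^T K_bb^{-1} k_bx
    residual = k_xx - k_bx @ np.linalg.solve(K_bb, k_bx)
    return residual > tol
```

A strictly positive residual means the candidate is linearly independent of the current basis in feature space, which is what guarantees the linear independence of the produced basis set.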
Abstract:
This paper considers the antecedents and outcomes of downstream environmental logistics practices within green supply chain management amongst a sample of respondents based in the UK food industry. Framed through the conceptual lens of the natural resource-based view (NRBV), this research specifically considers (i) whether environmentally proactive companies implement environmental practices downstream in their supply chains as an extension of internal environmental practices and (ii) whether such downstream environmental practices influence performance, particularly when there has been engagement with key stakeholders in their implementation. The paper begins by developing a theoretical model grounded in the NRBV. This model and the associated hypotheses are tested using multivariate ordinary least squares (OLS) regression analysis on data from a sample of 149 firms within the UK food industry. The results support a number of the assumptions implicit in the NRBV, confirming the link between environmental proactivity and downstream environmental logistics and the important role of internal environmental practices in facilitating this link. The findings also support a direct link between downstream environmental logistics and both environmental and cost performance, which may be enhanced in the presence of high levels of environmental engagement with customers.
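The OLS regression used to test such hypotheses can be sketched generically; the survey constructs and variables themselves are not reproduced here, so the inputs below are placeholders:

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares with an intercept (generic sketch).

    X: (n, p) matrix of explanatory variables; y: (n,) outcome.
    Returns [intercept, slope_1, ..., slope_p].
    """
    # Prepend a column of ones so the intercept is estimated jointly.
    Xd = np.column_stack([np.ones(len(X)), X])
    # lstsq solves the least-squares problem min ||Xd @ beta - y||^2.
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta
```

In the study, significance tests on the estimated coefficients (not shown here) would indicate whether each hypothesised link is supported.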
Abstract:
Background
Behaviour problems are common in young children with autism spectrum disorder (ASD). Many different tools are used to measure behaviour problems, but little is known about their validity for this population.
Objectives
To evaluate the measurement properties of behaviour problem tools used in intervention evaluations or observational research studies with children with ASD up to the age of six years.
Methods
Behaviour measurement tools were identified as part of a larger, two-stage systematic review. First, sixteen major electronic databases, as well as grey literature and research registers, were searched, and the tools used were listed and categorised. Second, using methodological filters, we searched ERIC, MEDLINE, EMBASE, CINAHL, and PsycINFO for articles examining the measurement properties of the tools in use with young children with ASD. The quality of these papers was then evaluated using the COSMIN checklist.
Results
We identified twelve tools which had been used to measure behaviour problems in young children with ASD, and fifteen studies which investigated the measurement properties of six of these tools. There was no evidence available for the remaining six tools. Two questionnaires were found to be the most robust in their measurement properties, the Child Behavior Checklist and the Home Situations Questionnaire—Pervasive Developmental Disorders version.
Conclusions
We found patchy evidence on reliability and validity for only a few of the tools used to measure behaviour problems in young children with ASD. More systematic research is required on the measurement properties of tools for use in this population, in particular to establish responsiveness to change, which is essential in measuring intervention outcomes.
PROSPERO Registration Number
CRD42012002223