866 results for standard vector control scheme


Relevance: 30.00%

Abstract:

Scholars have long debated whether ownership structure matters for firm performance. The standard view with respect to Victorian Britain is that family-controlled companies had a detrimental effect on operating profit and shareholder value. Here, we examine this view using a hand-collected corporate ownership dataset. Our main finding is that it was not necessarily the broad structure of corporate ownership that mattered for performance, but whether family blockholders had a governance role. Large active blockholders tended to increase operating performance, implying that they reduced managerial agency problems. In contrast, we find that directors who were independent of large family owners were more likely to increase shareholder value.

Relevance: 30.00%

Abstract:

This research presents a fast algorithm for projected support vector machines (PSVM): a basis vector set (BVS) is selected for the kernel-induced feature space, and the training points are projected onto the subspace spanned by the selected BVS. A standard linear support vector machine (SVM) is then produced in the subspace with the projected training points. As the dimension of the subspace is determined by the size of the selected basis vector set, the size of the produced SVM expansion can be specified. A two-stage algorithm is derived which selects and refines the basis vector set, achieving a locally optimal model. The model expansion coefficients and bias are updated recursively as vectors are added to or removed from the basis set and support vector set. The condition for a point to lie outside the span of the current basis vector set, and hence to be selected as a new basis vector, is derived and embedded in the recursive procedure; this guarantees the linear independence of the produced basis set. The proposed algorithm is tested and compared with an existing sparse primal SVM (SpSVM) and a standard SVM (LibSVM) on seven public benchmark classification problems. The new algorithm is designed for human activity recognition using smart devices and embedded sensors, where sometimes limited memory and processing resources must be exploited to the full, and where more robust and accurate classification means a more satisfied user. Experimental results demonstrate the effectiveness and efficiency of the proposed algorithm. This work builds upon a previously published algorithm created specifically for activity recognition within mobile applications for the EU Haptimap project [1]. The algorithms detailed in this paper are more memory and resource efficient, making them suitable for larger data sets and more easily trained SVMs.
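The linear-independence test at the heart of the basis selection can be illustrated with a short sketch. The following Python code is a hedged illustration rather than the paper's algorithm: it uses the standard kernel projection residual to decide whether a candidate point lies (numerically) outside the span of the current basis set; the RBF kernel, the tolerance `tol`, and all names are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def select_basis(X, tol=0.5, gamma=1.0):
    """Greedily keep points whose feature-space projection residual
    onto the span of the current basis exceeds tol."""
    basis = [X[0]]
    for x in X[1:]:
        K = np.array([[rbf_kernel(a, b, gamma) for b in basis] for a in basis])
        k = np.array([rbf_kernel(b, x, gamma) for b in basis])
        # Squared distance from phi(x) to its projection onto span{phi(b)}:
        #   residual = k(x, x) - k^T K^{-1} k
        coeffs = np.linalg.solve(K + 1e-10 * np.eye(len(basis)), k)
        residual = rbf_kernel(x, x, gamma) - k @ coeffs
        if residual > tol:  # x is linearly independent of the current basis
            basis.append(x)
    return np.array(basis)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
print(f"selected {len(select_basis(X))} basis vectors from {len(X)} points")
```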

Relevance: 30.00%

Abstract:

Importance: The natural history of patients with newly diagnosed high-risk nonmetastatic (M0) prostate cancer receiving hormone therapy (HT) either alone or with standard-of-care radiotherapy (RT) is not well documented. Furthermore, no clinical trial has assessed the role of RT in patients with node-positive (N+) M0 disease. The STAMPEDE Trial includes such individuals, allowing an exploratory multivariate analysis of the impact of radical RT.

Objective: To describe survival and the impact on failure-free survival of RT by nodal involvement in these patients.

Design, Setting, and Participants: Cohort study using data collected for patients allocated to the control arm (standard-of-care only) of the STAMPEDE Trial between October 5, 2005, and May 1, 2014. Outcomes are presented as hazard ratios (HRs) with 95% CIs derived from adjusted Cox models; survival estimates are reported at 2 and 5 years. Participants were high-risk, hormone-naive patients with newly diagnosed M0 prostate cancer starting long-term HT for the first time. Radiotherapy was encouraged in this group, but has been mandated for patients with node-negative (N0) M0 disease only since November 2011.

Exposures: Long-term HT either alone or with RT, as per local standard. Planned RT use was recorded at entry.

Main Outcomes and Measures: Failure-free survival (FFS) and overall survival.

Results: A total of 721 men with newly diagnosed M0 disease were included: median age at entry was 66 years (interquartile range [IQR], 61-72), and the median (IQR) prostate-specific antigen level was 43 (18-88) ng/mL. There were 40 deaths (31 owing to prostate cancer) with 17 months' median follow-up. Two-year survival was 96% (95% CI, 93%-97%) and 2-year FFS, 77% (95% CI, 73%-81%). Median (IQR) FFS was 63 (26 to not reached) months. Time to FFS was worse in patients with N+ disease (HR, 2.02 [95% CI, 1.46-2.81]) than in those with N0 disease. Failure-free survival outcomes favored planned use of RT for patients with both N0M0 (HR, 0.33 [95% CI, 0.18-0.61]) and N+M0 disease (HR, 0.48 [95% CI, 0.29-0.79]).

Conclusions and Relevance: Survival for men entering the cohort with high-risk M0 disease was higher than anticipated at study inception. These nonrandomized data were consistent with previous trials that support routine use of RT with HT in patients with N0M0 disease. Additionally, the data suggest that the benefits of RT extend to men with N+M0 disease.

Trial Registration: clinicaltrials.gov Identifier: NCT00268476; ISRCTN78818544.
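As a hedged illustration of how such adjusted hazard ratios are typically derived (not the trial's actual analysis code), the sketch below fits a Cox proportional hazards model with the open-source lifelines package; the column names and the synthetic data are assumptions made for the example.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the cohort: follow-up time (months), failure
# indicator, and nodal-status / planned-RT / age covariates.
rng = np.random.default_rng(1)
n = 721
df = pd.DataFrame({
    "followup_months": rng.exponential(40, n),
    "failure": rng.integers(0, 2, n),
    "node_positive": rng.integers(0, 2, n),
    "planned_rt": rng.integers(0, 2, n),
    "age": rng.normal(66, 6, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_months", event_col="failure")
cph.print_summary()  # the exp(coef) column gives HRs with 95% CIs
```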

Relevance: 30.00%

Abstract:

Wearable devices running advanced bio-signal analysis algorithms are expected to foster a revolution in the healthcare provision for chronic cardiac diseases. In this context, energy efficiency is of paramount importance, as long-term monitoring must be ensured while relying on a tiny power source. Operating at a scaled supply voltage, just above the threshold voltage, helps save substantial energy, but it makes circuits, and especially memories, more prone to errors, threatening the correct execution of algorithms. The use of error detection and correction codes may help to protect the entire memory content; however, it incurs large area and energy overheads which may not be compatible with the tight energy budgets of wearable systems. To cope with this challenge, in this paper we propose to limit the overhead of traditional schemes by selectively detecting and correcting errors only in data that highly impact the end-to-end quality of service of ultra-low-power wearable electrocardiogram (ECG) devices. This partitioning protects either the significant words or the significant bits of each data element, according to the application characteristics (the statistical properties of the data in the application buffers) and their impact in determining the output. Applied to real ECG signals, the proposed heterogeneous error protection scheme allows substantial energy savings (11% in wearable devices) compared to state-of-the-art approaches, such as ECC, in which the whole memory is protected against errors. At the same time, it results in negligible output quality degradation in the evaluated power spectrum analysis application of ECG signals.
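As a hedged illustration of significant-bit protection (not the paper's hardware scheme), the following Python sketch encodes only the high nibble of each 8-bit sample with a Hamming(7,4) single-error-correcting code and leaves the low nibble unprotected; the 8-bit sample width and all names are assumptions made for the example.

```python
def hamming74_encode(nibble: int) -> int:
    """Encode 4 data bits into a 7-bit Hamming(7,4) codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]      # d0..d3
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    bits = [p1, p2, d[0], p3, d[1], d[2], d[3]]    # codeword positions 1..7
    return sum(b << i for i, b in enumerate(bits))

def hamming74_decode(code: int) -> int:
    """Correct a single bit error and recover the 4 data bits."""
    bits = [(code >> i) & 1 for i in range(7)]     # codeword positions 1..7
    s1 = bits[0] ^ bits[2] ^ bits[4] ^ bits[6]
    s2 = bits[1] ^ bits[2] ^ bits[5] ^ bits[6]
    s3 = bits[3] ^ bits[4] ^ bits[5] ^ bits[6]
    syndrome = s1 + (s2 << 1) + (s3 << 2)          # 1-based error position
    if syndrome:
        bits[syndrome - 1] ^= 1                    # correct the flipped bit
    d = [bits[2], bits[4], bits[5], bits[6]]
    return sum(b << i for i, b in enumerate(d))

def store(sample: int) -> tuple[int, int]:
    """Split an 8-bit sample: protected high nibble, raw low nibble."""
    return hamming74_encode(sample >> 4), sample & 0x0F

def load(protected_hi: int, raw_lo: int) -> int:
    return (hamming74_decode(protected_hi) << 4) | raw_lo

hi, lo = store(0b10110011)
hi ^= 1 << 3                       # inject a single bit error "in memory"
assert load(hi, lo) == 0b10110011  # significant bits recovered exactly
```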

Relevance: 30.00%

Abstract:

Lattice-based cryptography has recently gained credence as a replacement for current public-key cryptosystems, due to its quantum resilience, versatility, and relatively low key sizes. To date, encryption based on the learning with errors (LWE) problem has been investigated only from an ideal-lattice standpoint, due to its computation and size efficiencies. However, a thorough investigation of standard lattices in practice has yet to be considered. Standard lattices may be preferred to ideal lattices due to their stronger security assumptions and less restrictive parameter selection process. In this paper, an area-optimised hardware architecture of a standard lattice-based cryptographic scheme is proposed. The design is implemented on an FPGA, and it is found that both encryption and decryption fit comfortably on a Spartan-6 device. This is the first hardware architecture for standard lattice-based cryptography reported in the literature to date, and thus serves as a benchmark for future implementations. Additionally, a revised discrete Gaussian sampler is proposed which is the fastest of its type to date, and is also the first to investigate the cost savings of implementing with λ/2 bits of precision. Performance results are promising in comparison to hardware designs of the equivalent ring-LWE scheme: in addition to resting on a stronger security assumption, the design generates 1272 encryptions per second and 4395 decryptions per second.
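As a hedged sketch of what encryption over standard (unstructured) lattices involves, the following Python code implements a toy Regev-style LWE scheme with numpy. The parameters are illustrative only and far too small to be secure, the uniform small error is a stand-in for a discrete Gaussian, and this is in no way the paper's hardware design.

```python
import numpy as np

rng = np.random.default_rng(42)
n, m, q = 32, 256, 4093            # toy parameters, NOT secure

def keygen():
    A = rng.integers(0, q, size=(m, n))      # unstructured (standard) lattice
    s = rng.integers(0, q, size=n)           # secret key
    e = rng.integers(-2, 3, size=m)          # small error, Gaussian stand-in
    b = (A @ s + e) % q
    return (A, b), s

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, size=m)           # random 0/1 combination of rows
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q         # encode the bit at q/2
    return u, v

def decrypt(sk, ct):
    u, v = ct
    d = (v - u @ sk) % q                     # = r.e + bit*q/2 (mod q)
    return 1 if q // 4 < d < 3 * q // 4 else 0

pk, sk = keygen()
for bit in (0, 1):
    assert decrypt(sk, encrypt(pk, bit)) == bit
```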

Relevance: 30.00%

Abstract:

INTRODUCTION AND GOALS: The genus Bursaphelenchus includes several pests of worldwide importance for the rural economy, the most dangerous being Bursaphelenchus xylophilus (the pinewood nematode, which has caused decline of pine trees in East Asia and in one spot area in Europe: Portugal, the Setúbal Peninsula) and Bursaphelenchus cocophilus, which causes the decline of coconut-palm plantations in the Caribbean and Latin American regions. The peculiarity of the host-parasite association of the genus is that the nematode life cycle includes three trophic components: a plant (mostly a tree), an insect vector and a fungus. The goals of the presentation are to list all species of the world fauna and all efficient diagnostic characters, to create an identification tool, and to analyse the similarity of species and the possible ways and causes of the host-parasite evolution of the group. RESULTS: A complete list of species with synonymy, and a catalogue of all efficient diagnostic characters with their states, selected from the papers of the most experienced taxonomists of the genus, are given for the genus Bursaphelenchus. A list of known records of Bursaphelenchus species with the names of natural vectors and plants and their families is given (for the world pests, the most important groups of trees and insects are listed). Tabular, traditional and computer-aided keys are presented. Dendrograms of species relationships (UPGMA; standard distance: mean character difference), based on all efficient taxonomic characters and, separately, on the spicule characters only, are given. The discussion of whether the species groups are natural or purely diagnostic ones is based on the relationship dendrograms and on the vector and associated-plant ranges of Bursaphelenchus species; the xylophilus species group (B. xylophilus, B. abruptus, B. baujardi, B. conicaudatus, B. eroshenkii, B. fraudulentus, B. kolymensis, B. luxuriosae, B. mucronatus) and the hunti group (B. hunti, B. seani, B. kevini and B. fungivorus) are probably natural ones. CONCLUSIONS: The parasitic nematode association includes three trophic components: plant, insect vector and fungus. The initial insect-plant complex Scolytidae-Pinaceae is changeable, and only on rare occasions did the change of the preferred vector, to Cerambycidae (the xylophilus group) or Hymenoptera (the hunti group), lead to the formation of natural species groups. The analysis makes clear that although the vector range is changeable, it is comparatively more important for the evolution of the genus Bursaphelenchus than associations with plants at the family level. Data on the fungal species (the third component in natural Bursaphelenchus associations) are insufficient for a detailed comparative analysis.
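The clustering step can be sketched in a few lines of Python. This is a hedged illustration, not the presentation's dataset: it builds a UPGMA dendrogram from a toy matrix of coded character states using the mean character difference as distance; the species subset and the character codes are invented for the example.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

species = ["B. xylophilus", "B. mucronatus", "B. fraudulentus",
           "B. hunti", "B. seani"]
# Toy matrix of coded diagnostic character states (rows follow `species`).
characters = np.array([
    [1, 0, 2, 1, 0, 1],
    [1, 0, 2, 1, 1, 1],
    [1, 1, 2, 0, 1, 1],
    [0, 1, 0, 2, 0, 0],
    [0, 1, 1, 2, 0, 0],
])

# Mean character difference: sum of absolute state differences divided
# by the number of characters.
dist = pdist(characters, metric="cityblock") / characters.shape[1]
tree = linkage(dist, method="average")  # "average" linkage is UPGMA

dendrogram(tree, labels=species, no_plot=True)  # plot with matplotlib if desired
print(tree)
```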

Relevance: 30.00%

Abstract:

We consider the minimum-time optimal control problem for single-input, control-affine systems in a finite-dimensional space with fixed initial and final conditions, where the scalar control takes values in a closed interval. When the shooting method is applied to this problem, several obstacles may arise, since the shooting function is not differentiable when the control is bang-bang. In the bang-bang case, conjugate times are theoretically well defined for this type of control system, but the available direct computational algorithms are difficult to apply. In the smooth case, on the other hand, the theoretical and practical concept of conjugate times is well known, and effective computational tools are available. We propose a regularization procedure in which the solutions of the corresponding minimum-time problem depend on a sufficiently small positive real parameter and are given by smooth functions of the time variable, making the single shooting method easier to apply. We prove, under suitable assumptions, the strong convergence of the solutions of the regularized problem to the solution of the original problem as the real parameter tends to zero. The determination of conjugate times of the locally optimal trajectories of the regularized problem falls within the known smooth theory. We prove, under adequate assumptions, the convergence of the first conjugate time of the regularized problem to the first conjugate time of the original bang-bang problem as the real parameter tends to zero. We thereby obtain an efficient algorithm for computing conjugate times in the bang-bang case.
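To make the regularized shooting concrete, here is a hedged Python sketch for the classical minimum-time double integrator (a generic smoothing, not the thesis's exact regularization): the bang-bang law u = -sign(p2) is replaced by the smooth saturation u = -p2/sqrt(p2^2 + eps^2), and single shooting solves for the initial adjoint and the final time; all parameter values and the initial guess are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

EPS = 1e-2                 # regularization parameter
X0, V0 = 2.0, 0.0          # fixed initial state; the target is the origin

def control(p2):
    # Smooth stand-in for the bang-bang law u = -sign(p2).
    return -p2 / np.sqrt(p2**2 + EPS**2)

def dynamics(t, z):
    x, v, p1, p2 = z
    return [v, control(p2), 0.0, -p1]   # state and adjoint equations

def shooting(unknowns):
    p1_0, p2_0, tf = unknowns
    sol = solve_ivp(dynamics, (0.0, tf), [X0, V0, p1_0, p2_0],
                    rtol=1e-10, atol=1e-10)
    x, v, p1, p2 = sol.y[:, -1]
    # Residuals: final state at the origin, and H(tf) = 1 + p1*v + p2*u = 0
    # (free final time in a minimum-time problem).
    return [x, v, 1.0 + p1 * v + p2 * control(p2)]

p1_0, p2_0, tf = fsolve(shooting, x0=[1.0, 1.0, 4.0])
print(f"minimum time ~ {tf:.4f} (exact bang-bang value: {2*np.sqrt(X0):.4f})")
```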

Relevance: 30.00%

Abstract:

The relation between epidemiology, mathematical modelling and computational tools makes it possible to build and test theories on the development and control of a disease. This thesis is motivated by the study of epidemiological models applied to infectious diseases from an Optimal Control perspective, with particular emphasis on Dengue. A tropical and subtropical mosquito-borne disease, Dengue affects about 100 million people per year and is considered by the World Health Organization a major public health concern. The mathematical models developed and tested in this work are based on ordinary differential equations that describe the dynamics underlying the disease, namely the interaction between humans and mosquitoes. An analytical study of these models is carried out with respect to equilibrium points, their stability, and the basic reproduction number. The spread of Dengue can be attenuated through measures that control the transmitting vector, such as the use of specific insecticides and educational campaigns. Since the development of a potential vaccine has recently been a worldwide priority, models based on the simulation of a hypothetical vaccination process in a population are proposed. Based on Optimal Control theory, the optimal strategies for the use of these controls, and their respective repercussions on the reduction or eradication of the disease during an outbreak in the population, are analysed under a bioeconomic approach. The formulated problems are solved numerically using direct and indirect methods. The former discretize the problem, reformulating it as a nonlinear optimization problem. Indirect methods use Pontryagin's Maximum Principle as a necessary condition to find the optimal curve for the respective control. Several numerical software packages are used in these two strategies. Throughout this work there was always a compromise between the realism of the epidemiological models and their mathematical tractability.
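As a minimal sketch of the kind of ODE host-vector model the thesis builds on (a generic SIR-type system, not the thesis's exact model), the following Python code simulates human-mosquito transmission with a constant insecticide control; all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy parameter values, chosen only for illustration.
beta_hm = 0.30   # mosquito -> human transmission rate
beta_mh = 0.25   # human -> mosquito transmission rate
gamma   = 0.10   # human recovery rate
mu_m    = 0.05   # natural mosquito mortality (births balance it below)
c       = 0.04   # extra mosquito mortality from insecticide (the control)

def dengue(t, y):
    Sh, Ih, Rh, Sm, Im = y
    Nh = Sh + Ih + Rh
    dSh = -beta_hm * Sh * Im / Nh
    dIh =  beta_hm * Sh * Im / Nh - gamma * Ih
    dRh =  gamma * Ih
    dSm = mu_m * (Sm + Im) - beta_mh * Sm * Ih / Nh - (mu_m + c) * Sm
    dIm =  beta_mh * Sm * Ih / Nh - (mu_m + c) * Im
    return [dSh, dIh, dRh, dSm, dIm]

y0 = [9990, 10, 0, 30000, 100]      # initial human and mosquito pools
sol = solve_ivp(dengue, (0, 365), y0, dense_output=True)
print(f"peak infected humans: {sol.y[1].max():.0f}")
```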

Relevance: 30.00%

Abstract:

Access control is a software engineering challenge in database applications. Currently, there is no satisfactory solution for dynamically implementing evolving fine-grained access control mechanisms (FGACM) on the business tiers of relational database applications. To tackle this access control gap, we propose an architecture, herein referred to as the Dynamic Access Control Architecture (DACA). DACA allows FGACM to be dynamically built and updated at runtime in accordance with the established fine-grained access control policies (FGACP). DACA explores and makes use of Call Level Interface (CLI) features to implement FGACM on business tiers. Among these features, we emphasize their performance and their multiple modes of access to data residing in relational databases. The different access modes of CLI are wrapped by typed objects driven by FGACM, which are built and updated at runtime. Programmers set aside the traditional access modes of CLI and instead use the ones dynamically implemented and updated. DACA comprises three main components: the Policy Server (a repository of metadata for FGACM), the Dynamic Access Control Component (DACC) (the business tier component responsible for implementing FGACM) and the Policy Manager (a broker between the DACC and the Policy Server). Unlike current approaches, DACA does not depend on any particular access control model or policy, thereby promoting its applicability to a wide range of different situations. In order to validate DACA, a solution based on Java, Java Database Connectivity (JDBC) and SQL Server was devised and implemented. Two evaluations were carried out: the first evaluates DACA's capability to implement and update FGACM dynamically, at runtime, and the second assesses DACA's performance against a standard use of JDBC without any FGACM. The collected results show that DACA is an effective approach for implementing evolving FGACM on business tiers based on Call Level Interfaces, in this case JDBC.
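The paper's implementation is based on Java, JDBC and SQL Server; as a language-agnostic illustration of the idea of wrapping a Call Level Interface in policy-driven access modes, here is a minimal Python sketch using the standard DB-API (sqlite3). The `POLICIES` table, the roles, and the schema are hypothetical.

```python
import sqlite3

# Hypothetical fine-grained policy: per role, which columns of which table
# may be read, and whether updates are allowed.
POLICIES = {
    "clerk":   {"employees": {"read": ["name", "dept"], "update": False}},
    "manager": {"employees": {"read": ["name", "dept", "salary"], "update": True}},
}

class AccessControlledConnection:
    """Wraps a DB-API connection; every access mode is driven by the policy."""

    def __init__(self, conn, role):
        self.conn, self.policy = conn, POLICIES[role]

    def read(self, table, columns):
        allowed = self.policy[table]["read"]
        denied = set(columns) - set(allowed)
        if denied:
            raise PermissionError(f"columns not readable for this role: {denied}")
        cur = self.conn.execute(f"SELECT {', '.join(columns)} FROM {table}")
        return cur.fetchall()

    def update(self, table, column, value, key):
        if not self.policy[table]["update"]:
            raise PermissionError("updates not allowed for this role")
        self.conn.execute(f"UPDATE {table} SET {column} = ? WHERE id = ?",
                          (value, key))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER, name TEXT, dept TEXT, salary REAL)")
conn.execute("INSERT INTO employees VALUES (1, 'Ana', 'R&D', 50000)")
db = AccessControlledConnection(conn, "clerk")
print(db.read("employees", ["name", "dept"]))   # allowed for this role
# db.read("employees", ["salary"]) would raise PermissionError
```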

Relevance: 30.00%

Abstract:

In this paper a parallel implementation of an Adaptive Generalized Predictive Control (AGPC) algorithm is presented. Since the AGPC algorithm needs to be fed with knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and a GPC predictor is discussed here.
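As a sketch of the estimator being parallelized (the serial textbook form, not the paper's parallel implementation), the following Python code runs standard RLS with a forgetting factor on a toy first-order plant; all names and parameter values are illustrative.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One step of recursive least squares with forgetting factor lam.

    theta : current parameter estimate
    P     : current inverse-correlation matrix
    phi   : regressor vector (past outputs and inputs)
    y     : newly measured plant output
    """
    k = P @ phi / (lam + phi @ P @ phi)       # gain vector
    theta = theta + k * (y - phi @ theta)     # correct by the prediction error
    P = (P - np.outer(k, phi @ P)) / lam      # covariance update
    return theta, P

# Identify y[t] = a*y[t-1] + b*u[t-1] + noise (toy first-order plant).
rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
theta, P = np.zeros(2), 1000.0 * np.eye(2)
y_prev, u = 0.0, rng.normal(size=500)
for t in range(1, 500):
    y = a_true * y_prev + b_true * u[t - 1] + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, np.array([y_prev, u[t - 1]]), y)
    y_prev = y
print(f"estimated (a, b) ~ ({theta[0]:.3f}, {theta[1]:.3f})")
```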

Relevance: 30.00%

Abstract:

The Adaptive Generalized Predictive Control (AGPC) algorithm can be sped up using parallel processing. Since the AGPC algorithm needs to be fed with the knowledge of the plant transfer function, the parallelization of a standard Recursive Least Squares (RLS) estimator and a GPC predictor is discussed here.
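As a companion sketch of the other half of the AGPC pipeline (the textbook unconstrained GPC law, not the paper's parallel implementation), the Python code below builds the step-response prediction matrix and computes the control moves; the horizons, weighting, and model parameters are assumptions made for illustration.

```python
import numpy as np

Np, Nu, lam = 10, 3, 0.1            # prediction/control horizons, move weight
a, b = 0.8, 0.5                     # identified first-order model parameters
# Step-response coefficients g_1..g_Np of y[t] = a*y[t-1] + b*u[t-1].
g = np.array([b * (1 - a**(i + 1)) / (1 - a) for i in range(Np)])

# Dynamic (lower-triangular Toeplitz) matrix G mapping future control
# moves to predicted outputs: y = G*du + f.
G = np.zeros((Np, Nu))
for i in range(Np):
    for j in range(min(i + 1, Nu)):
        G[i, j] = g[i - j]

w = np.ones(Np)                     # setpoint trajectory
f = np.zeros(Np)                    # free response (zero initial state here)
# Unconstrained GPC law: minimize ||w - G*du - f||^2 + lam*||du||^2.
du = np.linalg.solve(G.T @ G + lam * np.eye(Nu), G.T @ (w - f))
print(f"first control move: {du[0]:.3f}")
```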

Relevance: 30.00%

Abstract:

A speed control system for a cost-effective, high-precision drive concept is presented. The drive concept consists of two drives working in parallel and is an alternative to direct drives. One big advantage is the use of standard gear boxes with economical components. This paper deals with the control of the drive system, which consists of two parts: one drive produces the power for the machine, while the other drive makes the motion precise and dynamic. Both drives are combined into one double drive by a control system. The drive system is useful for printing machines and other machines with high power consumption at nearly constant speed and with high accuracy requirements. A calculation for a 37 kW drive system shows that the control drive has to supply only about 20% of the total torque and power to compensate the errors of the power drive. The stability of the system is shown by a simulation of the double drive.
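As a hedged illustration of the double-drive idea (illustrative dynamics and numbers, not the paper's 37 kW design), the sketch below simulates a large, slow power drive that builds up the load torque while a small, fast control drive corrects the residual speed error.

```python
import numpy as np

J, dt, T_end = 2.0, 1e-3, 2.0       # inertia [kg m^2], time step, duration [s]
w_ref, T_load = 100.0, 50.0         # speed setpoint [rad/s], load torque [N m]

w = 0.0                             # shaft speed
tau_power = 0.0                     # torque of the slow power drive
control_torques = []
for _ in range(int(T_end / dt)):
    err = w_ref - w
    # Power drive: slow integral action with a rate-limited torque build-up.
    tau_power += np.clip(5.0 * err * dt, -0.05, 0.05)
    # Control drive: fast proportional action on the residual error.
    tau_control = 20.0 * err
    w += dt * (tau_power + tau_control - T_load) / J
    control_torques.append(abs(tau_control))

print(f"final speed: {w:.1f} rad/s, "
      f"final control-drive torque: {control_torques[-1]:.2f} N m")
```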
