905 results for end-user programming, component, message-based
Abstract:
One of the most widely used instruments for measuring shoulder function is the Constant Score, in which one quarter of the total value of the functional assessment of the shoulder corresponds to the strength of this anatomical region. The original design of the Constant Score did not take into account possible variations in normal strength values as a function of age or sex, and this variation has never been properly established. This study aims to establish the existence of differences in shoulder strength between different age groups and between sexes.
Abstract:
The evaluation of large projects raises well-known difficulties because, by definition, they modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based on integrating projects into optimization models or, alternatively, on iterative procedures with information exchange between two organizational levels. The methodologies applied afterwards are based on variational inequalities, bilevel programming, and linear or nonlinear complementarity. Their foundations and their different applications to project evaluation are explored. These new tools are in fact closely related to one another and can handle more complex cases involving, for example, the reaction of agents to policies or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
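Complementarity formulations like those mentioned above can be illustrated with a small linear complementarity problem (LCP). The sketch below, with hypothetical data, solves the LCP w = Mz + q, z >= 0, w >= 0, z.w = 0 by enumerating active sets; it illustrates the tool itself, not any specific model in the paper.

```python
import itertools
import numpy as np

def solve_lcp(M, q):
    """Solve w = M z + q, z >= 0, w >= 0, z.w = 0 by active-set enumeration.

    Brute force over which components of z are allowed to be nonzero;
    practical only for small problems, but enough to show the structure.
    """
    n = len(q)
    for active in itertools.chain.from_iterable(
            itertools.combinations(range(n), k) for k in range(n + 1)):
        z = np.zeros(n)
        a = list(active)
        if a:
            try:
                z[a] = np.linalg.solve(M[np.ix_(a, a)], -q[a])
            except np.linalg.LinAlgError:
                continue
        w = M @ z + q
        if (z >= -1e-9).all() and (w >= -1e-9).all():
            return z, w
    return None

# Hypothetical 2x2 problem
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, -1.0])
z, w = solve_lcp(M, q)
```

For this toy instance the complementary solution is z = (1/3, 1/3) with w = 0; larger instances would call for a dedicated solver (e.g. Lemke's method) rather than enumeration.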
Abstract:
Correct positioning of the tibial component in total knee arthroplasty (TKA) must take into account both optimal bone coverage (defined by maximal cortical bearing with posteromedial and anterolateral support) and satisfactory patellofemoral tracking. Consequently, the surgeon must find a compromise position during the operation that meets these two requirements simultaneously. Moreover, tibial tray positioning depends upon tibial torsion, which has been shown to act mainly in the proximal quarter of the tibia. Therefore, correct application of the tibial tray is also theoretically related to the level of bone resection. In this study, we first quantified the torsional profile given by optimal bone coverage for a symmetrical tibial tray design and for an asymmetrical one. Then, for the two types of tibial trays, we measured the angular difference between optimal bone coverage and alignment on the middle of the tibial tubercle. Results showed that the values of the torsional profile given by the symmetrical tray were more scattered than those from the asymmetrical one. However, determination of the mean differential angle between the position providing optimal bone coverage and the one providing the best patellofemoral tracking indicated that the symmetrical prosthetic tray offered the best compromise between these two requirements. Although the tibiofemoral joint is known to be asymmetric in both shape and dimension, the asymmetrical tray chosen in this study fulfilled this compromise with more difficulty.
Abstract:
Design and implementation of an application to add new functionality to the management of rental properties. In recent years, the volume of apartment and house rental bookings made over the Internet has grown, and for this reason many companies in the sector have devoted considerable effort to strengthening this channel and handling all transactions online. In the winter of 2009, the company this project is aimed at decided to take this leap and establish a presence on the Internet. At that time, several options for property management were considered: building a completely new, custom application; buying a ready-made application from specialized companies; or adapting a free application to the company's needs. After analyzing the three cases in detail, the third option was chosen, since it met the company's needs and, with modifications and adaptations, would achieve the expected result. The free application is "com-property", and it was put into operation within two months.
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, for the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
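As an illustration of ICA-derived features (not the authors' exact pipeline), the following sketch unmixes hypothetical multichannel EEG epochs with FastICA and summarizes each independent component with simple amplitude statistics that a downstream classifier could consume.

```python
# A minimal sketch, assuming multichannel epochs stored as a NumPy array;
# the data, epoch shape, and feature choices are all hypothetical.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
# Hypothetical data: 100 epochs x 8 channels x 128 samples
epochs = rng.standard_normal((100, 8, 128))

def ica_features(epoch, n_components=4):
    """Unmix one epoch into independent components and summarize each
    component by peak-to-peak amplitude and standard deviation."""
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(epoch.T)          # (samples, components)
    return np.concatenate([sources.max(axis=0) - sources.min(axis=0),
                           sources.std(axis=0)])

# Feature matrix: one row of ICA-based features per epoch
X = np.array([ica_features(e) for e in epochs])
```

In a real pipeline these features would be paired with expert labels and fed to a classifier, with cross-validation as described above.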
Abstract:
Business process designers take into account the resources that their processes will need, but, due to the variable cost of certain parameters (such as energy) or other circumstances, this scheduling must be performed at business process enactment time. In this report we formalize the energy-aware resource cost, including time- and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to solving this problem, together with a comparison of the two approaches and of the algorithms proposed to solve them.
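The idea of a time-dependent energy rate can be sketched with a toy scheduling problem: assign tasks to hour slots so that total energy cost is minimized. The tariff, tasks, and exhaustive search below are hypothetical illustrations, not the report's formalization or algorithms.

```python
# A minimal sketch: tasks with known energy use scheduled into hour slots
# with a time-dependent tariff, solved by brute force over assignments.
from itertools import permutations

tariff = [0.30, 0.25, 0.10, 0.10, 0.20]   # hypothetical cost per kWh per hour slot
tasks = {"A": 2.0, "B": 1.0, "C": 3.0}    # hypothetical energy use (kWh) per task

def cost(assignment):
    """Total energy cost of assigning each task to a distinct hour slot."""
    return sum(tariff[slot] * tasks[t] for t, slot in assignment.items())

# Enumerate every assignment of the three tasks to distinct slots
best = min(
    (dict(zip(tasks, slots)) for slots in permutations(range(len(tariff)), len(tasks))),
    key=cost,
)
```

The optimum places the heavy consumers in the cheapest slots; a constraint programming solver would reach the same schedule without enumerating all assignments.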
Abstract:
Report for the scientific sojourn carried out at the Institute for Computational Molecular Science of Temple University, United States, from 2010 to 2012. Two-component systems (TCS) are used by pathogenic bacteria to sense the environment within a host and to activate mechanisms related to virulence and antimicrobial resistance. A prototypical example is the PhoQ/PhoP system, which is the major regulator of virulence in Salmonella. Hence, PhoQ is an attractive target for the design of new antibiotics against foodborne diseases. Inhibition of PhoQ-mediated bacterial virulence does not result in growth inhibition, exerting less selective pressure for the generation of antibiotic resistance. Moreover, PhoQ is a histidine kinase (HK), and HKs are absent in animals. Nevertheless, the design of satisfactory HK inhibitors has proven to be a challenge. To compete with intracellular ATP concentrations, the affinity of an HK inhibitor must be in the micromolar-to-nanomolar range, whereas the current lead compounds have at best millimolar affinities. Moreover, drug selectivity depends on the conformation of a highly variable loop, referred to as the "ATP-lid", which is difficult to study by X-ray crystallography due to its flexibility. I have investigated the binding of different HK inhibitors to PhoQ. In particular, all-atom molecular dynamics simulations have been combined with enhanced sampling techniques in order to provide structural and dynamic information on the conformation of the ATP-lid. Transient interactions between these drugs and the ATP-lid have been identified, and the free energy of the different binding modes has been estimated. The results obtained pinpoint the importance of protein flexibility in HK-inhibitor binding and constitute a first step towards developing more potent and selective drugs.
The computational resources of the hosting institution as well as the experience of the members of the group in drug binding and free energy methods have been crucial to carry out this work.
Abstract:
The cDNA encoding the NH2-terminal 589 amino acids of the extracellular domain of the human polymeric immunoglobulin receptor was inserted into transfer vectors to generate recombinant baculo- and vaccinia viruses. Following infection of insect and mammalian cells, respectively, the resulting truncated protein corresponding to human secretory component (hSC) was secreted with high efficiency into serum-free culture medium. The Sf9 insect cell/baculovirus system yielded as much as 50 mg of hSC/liter of culture, while the mammalian cells/vaccinia virus system produced up to 10 mg of protein/liter. The M(r) of recombinant hSC varied depending on the cell line in which it was expressed (70,000 in Sf9 cells and 85-95,000 in CV-1, TK- 143B and HeLa). These variations in M(r) resulted from different glycosylation patterns, as evidenced by endoglycosidase digestion. Efficient single-step purification of the recombinant protein was achieved either by concanavalin A affinity chromatography or by Ni(2+)-chelate affinity chromatography, when a 6xHis tag was engineered to the carboxyl terminus of hSC. Recombinant hSC retained the capacity to specifically reassociate with dimeric IgA purified from hybridoma cells.
Abstract:
In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from this pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. For additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. The quadratic algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm presented here relies on the simple fact that this highest scoring gene can be stored and updated; this requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Instead, the definition of valid gene structures is given externally in the so-called Gene Model, which simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem; in particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
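The linear-scan idea can be sketched as follows. Exons, reduced here to hypothetical (acceptor, donor, score) triples with frames and the Gene Model ignored, are scanned simultaneously by increasing acceptor and by increasing donor position, while the best score of any gene that has already ended is stored and updated.

```python
# A minimal sketch of the single-pass assembly: compatibility is simply
# "previous exon ends before this exon starts"; the exon data is made up.
exons = [(10, 50, 2.0), (60, 90, 1.5), (30, 120, 4.0), (130, 200, 3.0)]

by_acceptor = sorted(exons, key=lambda e: e[0])   # increasing start (acceptor)
by_donor = sorted(exons, key=lambda e: e[1])      # increasing end (donor)

best_ending_before = 0.0   # best gene score over exons already passed by donor
best = {}                  # best gene score ending in each exon
i = 0
for start, end, score in by_acceptor:
    # advance the donor scan past every exon that ends before this acceptor,
    # updating the stored best score as we go
    while i < len(by_donor) and by_donor[i][1] < start:
        best_ending_before = max(best_ending_before, best[by_donor[i]])
        i += 1
    best[(start, end, score)] = score + best_ending_before

top = max(best.values())   # score of the highest scoring gene
```

Each exon is visited once in each sorted order, so after sorting the assembly itself is linear in the number of exons, as opposed to comparing every exon against every compatible predecessor.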
Abstract:
To date, state-of-the-art seismic material parameter estimates from multi-component sea-bed seismic data are based on the assumption that the sea-bed consists of a fully elastic half-space. In reality, however, the shallow sea-bed generally consists of soft, unconsolidated sediments that are characterized by strong to very strong seismic attenuation. To explore the potential implications, we apply a state-of-the-art elastic decomposition algorithm to synthetic data for a range of canonical sea-bed models consisting of a viscoelastic half-space of varying attenuation. We find that in the presence of strong seismic attenuation, as quantified by Q-values of 10 or less, significant errors arise in the conventional elastic estimation of seismic properties. Tests on synthetic data indicate that these errors can be largely avoided by accounting for the inherent attenuation of the seafloor when estimating the seismic parameters. This can be achieved by replacing the real-valued expressions for the elastic moduli in the governing equations of the parameter estimation with their complex-valued viscoelastic equivalents. The practical application of our parameter estimation procedure yields realistic estimates of the elastic seismic material properties of the shallow sea-bed, while the corresponding Q-estimates seem to be biased towards values that are too low, particularly for S-waves. Given that the estimation of inelastic material parameters is notoriously difficult, particularly in the immediate vicinity of the sea-bed, this is expected to be of interest and importance for civil and ocean engineering purposes.
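The substitution of real-valued moduli by complex-valued viscoelastic equivalents can be sketched with the common low-loss approximation M* = M(1 + i/Q); the modulus, density, and Q-value below are hypothetical, and the paper's governing equations are not reproduced here.

```python
# A minimal sketch: a complex shear modulus for a medium with quality
# factor Q, and the complex wave velocity that follows from it.
import cmath

def viscoelastic_modulus(m_elastic, q_factor):
    """Complex modulus under the low-loss approximation M* = M (1 + i/Q)."""
    return m_elastic * (1 + 1j / q_factor)

def complex_velocity(modulus, density):
    """Complex wave velocity from a (possibly complex) modulus."""
    return cmath.sqrt(modulus / density)

mu = viscoelastic_modulus(1.0e8, 10)   # hypothetical: 100 MPa shear modulus, Q_s = 10
v_s = complex_velocity(mu, 1800.0)     # hypothetical density 1800 kg/m^3
```

Even at Q = 10 the real part of the velocity stays within about one percent of the purely elastic value, while the nonzero imaginary part carries the attenuation that the elastic decomposition ignores.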
Abstract:
Studies of large sets of SNP data have proven to be a powerful tool in the analysis of the genetic structure of human populations. In this work, we analyze genotyping data for 2,841 SNPs in 12 Sub-Saharan African populations, including a previously unsampled region of south-eastern Africa (Mozambique). We show that robust results in a world-wide perspective can be obtained when analyzing only 1,000 SNPs. Our main results both confirm the findings of previous studies and reveal new and interesting features of Sub-Saharan African genetic complexity. There is a strong differentiation of Nilo-Saharans, well beyond what would be expected from geography. Hunter-gatherer populations (Khoisan and Pygmies) show a clear distinctiveness, with markedly intrinsic Pygmy (and not only Khoisan) genetic features. Populations of West Africa present an unexpected similarity to one another, possibly the result of a population expansion. Finally, we find a strong differentiation of the south-eastern Bantu population from Mozambique, which suggests the assimilation of a pre-Bantu substrate by Bantu speakers in the region.
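One standard way to summarize population structure from a genotype matrix is principal component analysis. The sketch below, using random hypothetical genotypes rather than the authors' data or exact analysis, projects individuals onto the leading components.

```python
# A minimal sketch: PCA of a genotype matrix via SVD; the matrix is random
# and purely illustrative (individuals x SNPs, coded 0/1/2 allele counts).
import numpy as np

rng = np.random.default_rng(1)
genotypes = rng.integers(0, 3, size=(60, 1000)).astype(float)

# Center each SNP, then project individuals onto the top two components
centered = genotypes - genotypes.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
pcs = u[:, :2] * s[:2]   # coordinates of each individual on PC1 and PC2
```

With real data, individuals from differentiated populations separate into clusters along the leading components; with this random matrix the projection shows no structure, which is the expected null behaviour.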
Abstract:
We present a new technique for audio signal comparison based on tonal subsequence alignment and its application to detecting cover versions (i.e., different performances of the same underlying musical piece). Cover song identification is a task whose popularity has increased in the Music Information Retrieval (MIR) community over the past few years, as it provides a direct and objective way to evaluate music similarity algorithms. This article first presents a series of experiments carried out with two state-of-the-art methods for cover song identification. We have studied several components of these methods (such as chroma resolution and similarity, transposition, beat tracking, and Dynamic Time Warping constraints) in order to discover which characteristics are desirable for a competitive cover song identifier. After analyzing many cross-validated results, the importance of these characteristics is discussed, and the best-performing ones are applied to the newly proposed method. Multiple evaluations of this method confirm a large increase in identification accuracy compared with alternative state-of-the-art approaches.
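One of the components studied above, Dynamic Time Warping, can be sketched as follows; the toy one-dimensional sequences are hypothetical stand-ins for per-frame chroma vectors, and no constraint bands are applied.

```python
# A minimal sketch of classic DTW between two feature sequences, each a
# (frames, features) array; cost is the Euclidean distance between frames.
import numpy as np

def dtw_distance(a, b):
    """Accumulated cost of the best warping path with unit steps."""
    n, m = len(a), len(b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]

x = np.array([[0.0], [1.0], [2.0], [1.0]])
y = np.array([[0.0], [1.0], [1.0], [2.0], [1.0]])  # same contour, stretched
```

Because DTW may repeat or skip frames, the stretched sequence aligns to the original at zero cost, which is why it tolerates the tempo differences between cover versions; real systems add path constraints to keep the alignment plausible and the computation tractable.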