1000 results for "Applications for positions"


Relevance:

60.00%

Publisher:

Abstract:

Text in Spanish.

Relevance:

60.00%

Publisher:

Abstract:

Shipping list no.: 91-442-P.

Relevance:

60.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

60.00%

Publisher:

Abstract:

"Eleventh printing, September, 1948."

Relevance:

60.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

60.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

60.00%

Publisher:

Abstract:

"First impression, October, 1935."

Relevance:

60.00%

Publisher:

Abstract:

Cover title.

Relevance:

60.00%

Publisher:

Abstract:

Shipping list no.: 89-124-P.

Relevance:

40.00%

Publisher:

Abstract:

Ship tracking systems allow maritime organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, the geographical coverage of ship tracking platforms has increased significantly in recent years, from radar-based near-shore traffic monitoring to a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow ship position data to be stored over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for estimating the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a genetic algorithm to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a maritime safety expert.
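The genetic-algorithm idea behind the prototype can be sketched as follows. This is a minimal illustration, not EMSA's or the thesis's actual implementation: routes are fixed-length waypoint lists evolved against a toy fitness that rewards short paths whose waypoints stay close to historical ship positions; all parameter values are assumptions.

```python
import math
import random

def route_length(route):
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

def fitness(route, positions, traffic_weight=1.0):
    # Lower is better: short routes whose intermediate waypoints stay
    # close to historically observed ship positions are preferred.
    off_traffic = sum(min(math.dist(w, p) for p in positions)
                      for w in route[1:-1])
    return route_length(route) + traffic_weight * off_traffic

def crossover(a, b):
    # One-point crossover on waypoint lists; endpoints are preserved.
    cut = random.randrange(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(route, sigma=0.5):
    # Jitter one intermediate waypoint; start and end stay fixed.
    r = list(route)
    i = random.randrange(1, len(r) - 1)
    x, y = r[i]
    r[i] = (x + random.gauss(0, sigma), y + random.gauss(0, sigma))
    return r

def estimate_route(start, end, positions, n_waypoints=4,
                   pop_size=40, generations=150, seed=0):
    random.seed(seed)
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    def random_route():
        mid = [(random.uniform(min(xs), max(xs)),
                random.uniform(min(ys), max(ys)))
               for _ in range(n_waypoints)]
        return [start] + mid + [end]
    pop = [random_route() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: fitness(r, positions))
        parents = pop[:pop_size // 4]          # elitist selection
        next_pop = list(parents)
        while len(next_pop) < pop_size:
            a, b = random.sample(parents, 2)
            next_pop.append(mutate(crossover(a, b)))
        pop = next_pop
    return min(pop, key=lambda r: fitness(r, positions))
```

A real system would of course work on great-circle distances and a gridded density of AIS/LRIT positions rather than raw point lists.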

Relevance:

30.00%

Publisher:

Abstract:

This article examines the determinants of positional incongruence between pre-election statements and post-election behaviour in the Swiss parliament between 2003 and 2009. The question is examined at the individual MP level, which is appropriate for dispersion-of-powers systems like Switzerland. While the overall rate of political congruence reaches about 85%, a multilevel logit analysis detects the underlying factors which push or curb a candidate's propensity to change his or her mind once elected. The results show that positional changes are more likely when (1) MPs are freshmen, (2) individual voting behaviour is invisible to the public, (3) the electoral district magnitude is not small, (4) the vote is not about a party's core issue, (5) the MP belongs to a party which is located in the political centre, and (6) if the pre-election statement dissents from the majority position of the legislative party group. Of these factors, the last one is paramount.
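The kind of model used in the article can be illustrated with a toy single-level logistic regression on synthetic data; the coefficients, sample size, and the two predictors (a "freshman" flag and an "invisible vote" flag, mirroring factors (1) and (2)) are hypothetical, and no party-level random effects are modelled, so this is only a stand-in for the multilevel logit.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logit(X, y, lr=0.1, epochs=200):
    # Logistic regression fitted by stochastic gradient ascent on the
    # log-likelihood. w[0] is the intercept; w[1:] are the slopes for
    # the binary predictors.
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = yi - p                      # gradient of log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w
```

Fitting this to data simulated with positive true effects recovers positive coefficient estimates, i.e. both flags raise the estimated propensity of a positional change.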

Relevance:

30.00%

Publisher:

Abstract:

To what extent do Voting Advice Applications (VAAs) influence voting behaviour, and to what extent should providers be held accountable for such tools? This paper puts forward some empirical evidence from the Swiss VAA smartvote. The enormous popularity of smartvote in the last national elections in 2007 and the feedback of users and candidates lead us to conclude that smartvote is more than a toy and is likely to influence voting decisions. Since Swiss citizens vote not only for parties but also for candidates, and the voting recommendation of smartvote is based on the political positions of the candidates, smartvote turns out to be particularly helpful. Political scientists must not keep their hands off such tools: scientific research is needed to understand how they function and how they might be used to manipulate elections. On the basis of a legal study, we conclude that a science-driven way of setting up such tools is essential for their legitimacy. However, we do not believe that there is a single best way of setting up such a tool; we rather support a market-like solution with different competing tools, provided they meet minimal standards such as transparency and equal access for all parties and candidates. Once the process of selecting candidates and parties is directly linked to the act of voting, all these questions will become even more salient.
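The core mechanic of a candidate-based VAA, matching a voter's answers against each candidate's answers, can be sketched as follows. The 0-100 answer coding and the city-block scoring rule are illustrative assumptions, not smartvote's published formula.

```python
def match_score(voter, candidate):
    # Percentage agreement between two answer vectors coded on a
    # 0-100 scale (0 = clear "no", 100 = clear "yes"). Illustrative
    # scoring rule, not smartvote's actual method.
    assert len(voter) == len(candidate)
    max_dist = 100 * len(voter)
    dist = sum(abs(v - c) for v, c in zip(voter, candidate))
    return 100 * (1 - dist / max_dist)

def rank_candidates(voter, candidates):
    # Return candidate names sorted by descending match score.
    return sorted(candidates,
                  key=lambda name: match_score(voter, candidates[name]),
                  reverse=True)
```

Even this toy version makes the paper's accountability point concrete: the ranking depends entirely on design choices (question selection, answer scale, distance metric) made by the provider.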

Relevance:

30.00%

Publisher:

Abstract:

Dual-energy X-ray absorptiometry (DXA) is commonly used in the care of patients for diagnostic classification of osteoporosis, low bone mass (osteopenia), or normal bone density; assessment of fracture risk; and monitoring changes in bone density over time. The development of other technologies for the evaluation of skeletal health has been associated with uncertainties regarding their applications in clinical practice. Quantitative ultrasound (QUS), a technology for measuring properties of bone at peripheral skeletal sites, is more portable and less expensive than DXA, without the use of ionizing radiation. The proliferation of QUS devices that are technologically diverse, measuring and reporting variable bone parameters in different ways, examining different skeletal sites, and having differing levels of validating data for association with DXA-measured bone density and fracture risk, has created many challenges in applying QUS for use in clinical practice. The International Society for Clinical Densitometry (ISCD) 2007 Position Development Conference (PDC) addressed clinical applications of QUS for fracture risk assessment, diagnosis of osteoporosis, treatment initiation, monitoring of treatment, and quality assurance/quality control. The ISCD Official Positions on QUS resulting from this PDC, the rationale for their establishment, and recommendations for further study are presented here.
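The diagnostic classification mentioned above rests on DXA T-scores and the WHO cutoffs; a minimal sketch of that calculation is below. Note that, consistent with the ISCD positions, these cutoffs were developed for DXA and should not be applied directly to QUS-derived values.

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    # T-score: number of standard deviations the measured bone mineral
    # density lies from the young-adult reference mean.
    return (bmd - young_adult_mean) / young_adult_sd

def who_classification(t):
    # WHO diagnostic categories for DXA T-scores:
    #   T <= -2.5          -> osteoporosis
    #   -2.5 < T < -1.0    -> low bone mass (osteopenia)
    #   T >= -1.0          -> normal
    if t <= -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "low bone mass (osteopenia)"
    return "normal"
```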

Relevance:

30.00%

Publisher:

Abstract:

3 Summary

3.1 English

The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches that aim to rationalize the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking program aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, which led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.

3.2 French

The recent difficulties of the pharmaceutical industry seem resolvable only through the optimization of its drug development process. This increasingly involves so-called high-throughput techniques, which are particularly effective when coupled with computational tools able to manage the mass of data produced.
In silico approaches such as virtual screening or the rational design of new molecules are now used routinely. Both rely on the ability to predict the details of the molecular interaction between a drug-like molecule and a target protein of therapeutic interest. Benchmarks of the software addressing this prediction are flattering, but several problems remain. The recent literature tends to question their reliability, asserting an emerging need for more accurate approaches to the binding mode. This accuracy is essential for computing the binding free energy, which is directly linked to the affinity of the potential drug for the target protein and indirectly linked to its biological activity. An accurate prediction is of particular importance for the discovery and optimization of new active molecules. This thesis presents a new program, EADock, built around such accuracy. This hybrid evolutionary algorithm uses two selection pressures, combined with sophisticated diversity management. EADock relies on CHARMM for energy calculations and the handling of atomic coordinates. Its validation was carried out on 37 crystallized protein-ligand complexes, including 11 different proteins. The search space was extended to a sphere of 15 Å radius around the center of mass of the crystallized ligand, and contrary to the usual benchmarks, the algorithm started from optimized solutions with an RMSD of up to 10 Å from the crystal structure. This validation demonstrated the efficiency of our search heuristic, as binding modes with an RMSD below 2 Å from the crystal structure were ranked first for 68% of the complexes.
When the five best solutions are taken into account, the success rate climbs to 78%, and to 92% when the whole last generation is considered. Most prediction errors are attributable to the presence of crystal contacts. Since then, EADock has been used to understand the molecular mechanisms involved in the regulation of the Na,K-ATPase and in the activation of the peroxisome proliferator-activated receptor α (PPARα). It also made it possible to describe the interaction of common pollutants with PPARγ, as well as the influence of the metabolization of Imatinib (an anticancer drug) on its binding to the Bcr-Abl kinase. An approach based on predicting the interactions of molecular fragments with a target protein is also proposed. It led to the discovery of new peptidic ligands for PPARα and for the α5β1 integrin. In both cases, the activity of these new peptides is comparable to that of well-established ligands, such as Wy14,643 for the former and Cilengitide (an anticancer drug) for the latter.
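The 2 Å success criterion used in the validation can be sketched as follows: an RMSD between a predicted pose and the crystal pose (both in the same reference frame, so no superposition is performed), plus the resulting success rate over a benchmark. This is a generic illustration of the metric, not EADock's code.

```python
import math

def rmsd(coords_a, coords_b):
    # Root mean square deviation between two equally sized sets of
    # 3D atomic coordinates, in the same reference frame.
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

def success_rate(poses, crystal, cutoff=2.0):
    # Fraction of predicted poses within `cutoff` angstroms (RMSD)
    # of the crystal pose -- the standard docking success criterion.
    hits = sum(1 for p in poses if rmsd(p, crystal) < cutoff)
    return hits / len(poses)
```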

Relevance:

30.00%

Publisher:

Abstract:

The main subject of this thesis is risk measures. The general objective is to investigate certain aspects of risk measures in financial applications. The theoretical framework of this work is that of coherent risk measures as defined in Artzner et al. (1999), but this is not the only class of risk measures we study. For example, we also study some aspects of natural risk statistics (Kou et al. (2006)) and of convex risk measures (Föllmer and Schied (2002)). The main contributions of this thesis can be grouped along three axes: capital allocation, risk assessment, and required capital and solvency. In Chapter 2 we characterize risk measures with the Lebesgue property on the set of bounded càdlàg (right-continuous with left limits) processes. This characterization allows us to present two applications in risk assessment and capital allocation. In Chapter 3, we extend the notion of natural risk statistics to the space of infinite sequences. This generalization allows us to construct, in a consistent way, risk measures for data sets of any size. In Chapter 4, we discuss the concept of Good Deals, in particular to characterize the market situations in which these pathological positions are present. Finally, in Chapter 5, we attempt to link the three chapters by extending the definition of Good Deals to a broader framework encompassing the risk measures analyzed in Chapters 2 and 3.
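As a concrete illustration of a coherent risk measure in the sense of Artzner et al. (1999), here is a minimal empirical Expected Shortfall, a standard member of that class. The discretization (equally likely loss scenarios, tail size rounded to whole scenarios) is a simplifying assumption of this sketch, not something taken from the thesis.

```python
def expected_shortfall(losses, alpha=0.95):
    # Average of the worst (1 - alpha) fraction of losses, assuming
    # equally likely scenarios. Expected Shortfall is coherent:
    # monotone, translation-invariant, positively homogeneous, and
    # subadditive.
    s = sorted(losses, reverse=True)           # largest losses first
    k = max(1, int(round((1 - alpha) * len(s))))
    return sum(s[:k]) / k
```

For 100 equally likely losses of 1, 2, ..., 100 and alpha = 0.95, the measure averages the five worst outcomes (96 through 100), giving 98.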