912 results for Numerical Algorithms and Problems
Abstract:
Beyond the inherent technical challenges, current research into the three-dimensional surface correspondence problem is hampered by a lack of uniform terminology, an abundance of application-specific algorithms, and the absence of a consistent model for comparing existing approaches and developing new ones. This paper addresses these challenges by presenting a framework for analysing, comparing, developing, and implementing surface correspondence algorithms. The framework uses five distinct stages to establish correspondence between surfaces. It is general, encompassing a wide variety of existing techniques, and flexible, facilitating the synthesis of new correspondence algorithms. This paper presents a review of existing surface correspondence algorithms and shows how they fit into the correspondence framework. It also shows how the framework can be used to analyse and compare existing algorithms and to develop new algorithms using the framework's modular structure. Six algorithms, four existing and two new, are implemented using the framework. Each implemented algorithm is used to match a number of surface pairs. Results demonstrate that the correspondence framework implementations are faithful implementations of existing algorithms, and that powerful new surface correspondence algorithms can be created. (C) 2004 Elsevier Inc. All rights reserved.
Abstract:
We examine alcohol use in conjunction with ecstasy use and risk-taking behaviors among regular ecstasy users in every capital city in Australia. Data on drug use and risks were collected in 2004 from a national sample of 852 regular ecstasy users (persons who had used ecstasy at least monthly in the preceding 6 months). Users were grouped according to their typical alcohol use when using ecstasy: no use, consumption of between one and five standard drinks, and consumption of more than five drinks (binge alcohol use). The sample was young, well educated, and mainly working or studying. Approximately two thirds (65%) of the regular ecstasy users reported drinking alcohol when taking ecstasy. Of these, 69% reported usually consuming more than five standard drinks. Those who did not drink alcohol were more disadvantaged, with greater levels of unemployment, less education, higher rates of drug user treatment, and prison history. They were also more likely than those who drank alcohol when using ecstasy to be drug injectors and to be hepatitis C positive. Excluding alcohol, drug use patterns were similar between groups, although the no alcohol group used cannabis and methamphetamine more frequently. Binge drinkers were more likely to report having had three or more sexual partners in the past 6 months and were less likely to report having safe sex with casual partners while under the influence of drugs. Despite some evidence that the no alcohol group were more entrenched drug users, those who typically drank alcohol when taking ecstasy were as likely to report risks and problems associated with their drug use. It appears that regular ecstasy users who binge drink are placing themselves at increased sexual risk when under the influence of drugs. Safe sex messages should address the sexual risk associated with substance use and should be tailored to reducing alcohol consumption, particularly targeting heavy alcohol users. The study's limitations are noted.
Abstract:
Successful hearing aid fitting occurs when the person fitted wears the aid/s on a regular basis and reports benefit when the aid/s is used. A significant number of people fitted with unilateral or bilateral hearing aids for the first time do not continue to use one or both aids in the long term. In this paper, factors consistently found in previous research to be associated with unsuccessful fitting are explored; in particular, the negative attitudes of some clients towards hearing aids, their lack of motivation for seeking help, inability to identify goals for rehabilitation, and problems with the management of the devices. It is argued here that success in hearing aid fitting involves the same dynamics as found with other assistive technologies (e.g., wheelchairs, walking frames), and is dependent on a match between the characteristics of a prospective user, the technology itself, and the environments of use (Scherer, 2002). It is recommended that for clients who identify concerns about hearing aids, or who are unsure about when they would use them, and/or are likely to have problems with aid management, only one aid be fitted in the first instance, if hearing aid fitting is to proceed at all. Rehabilitation approaches to promote successful fitting are discussed in light of results obtained from a survey of clients who experienced both successful and unsuccessful aid fitting.
Abstract:
In this thesis we develop a new generative model of social networks belonging to the family of Time-Varying Networks. Correctly modelling the mechanisms that shape the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks, it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign, or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality, time-resolved datasets. This wealth of digital data has allowed us to deepen our understanding of the structure and properties of many real-world networks. Moreover, the empirical evidence of a temporal dimension in networks prompted the switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modelling tool belonging to the family of Time-Varying Networks) to develop a general dynamical model that encodes two fundamental mechanisms shaping the topology and temporal structure of social networks: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random, but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We empirically measure the properties of these two mechanisms in seven real-world datasets, develop a data-driven model, and solve it analytically. We then check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two.
Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of Time-Varying Networks. This model is inspired by Kauffman's theory of the adjacent possible and is based on a generalized version of Pólya's urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
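The social-capital mechanism described in this abstract can be illustrated with a minimal activity-driven sketch. This is a toy illustration, not the thesis's actual model: the reinforcement probability k/(k+1) (contacting an already known node with probability growing in the number k of known ties), the activity values, and the number m of contacts per activation are all assumptions made here for the example.

```python
import random

def activity_driven_step(activities, contacts, m=2, rng=random):
    """One time step of a toy activity-driven network with memory
    (social capital allocation): an active node i contacts an
    already-known node with probability k/(k+1), where k is the
    number of distinct nodes it has met so far, and a uniformly
    random new node otherwise."""
    n = len(activities)
    edges = []
    for i, a in enumerate(activities):
        if rng.random() < a:                    # node i activates
            for _ in range(m):
                known = contacts[i]
                k = len(known)
                if k > 0 and rng.random() < k / (k + 1):
                    j = rng.choice(sorted(known))   # reinforce an old tie
                else:
                    j = rng.randrange(n)            # explore a new tie
                    while j == i:
                        j = rng.randrange(n)
                contacts[i].add(j)
                contacts[j].add(i)
                edges.append((i, j))
    return edges
```

Iterating this step yields a temporal sequence of edge lists; the memory term slows the growth of each node's contact set, which is the qualitative signature of social capital allocation.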
Abstract:
We derive a mean field algorithm for binary classification with Gaussian processes which is based on the TAP approach originally proposed in the statistical physics of disordered systems. The theory also yields an approximate leave-one-out estimator for the generalization error which is computed at no extra computational cost. We show that from the TAP approach it is possible to derive both a simpler 'naive' mean field theory and support vector machines (SVMs) as limiting cases. For both mean field algorithms and support vector machines, simulation results for three small benchmark data sets are presented. They show (1) that one may get state-of-the-art performance by using the leave-one-out estimator for model selection, and (2) that the built-in leave-one-out estimators are extremely precise when compared to the exact leave-one-out estimate. The latter result is taken as strong support for the internal consistency of the mean field approach.
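The idea of a built-in leave-one-out estimator at no extra cost can be illustrated outside the TAP setting with a simpler model for which an exact shortcut exists: kernel ridge regression on ±1 labels, where the leave-one-out residual is (y_i − ŷ_i)/(1 − H_ii) for the hat matrix H = K(K + λI)⁻¹. This is a sketch of the general idea only, not the paper's mean field estimator; the RBF kernel and the values of gamma and lam are arbitrary choices for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between two sets of row vectors."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_predictions(K, y, lam=0.1):
    """Leave-one-out predictions for kernel ridge regression on +-1
    labels, from a single fit, via the hat-matrix identity
    (y_i - yhat_loo_i) = (y_i - yhat_i) / (1 - H_ii)."""
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    residuals_loo = (y - H @ y) / (1.0 - np.diag(H))
    return y - residuals_loo

def loo_brute_force(X, y, gamma=1.0, lam=0.1):
    """The same quantity by n explicit refits, for verification."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        m = np.arange(n) != i
        alpha = np.linalg.solve(
            rbf_kernel(X[m], X[m], gamma) + lam * np.eye(n - 1), y[m])
        preds[i] = (rbf_kernel(X[i:i + 1], X[m], gamma) @ alpha)[0]
    return preds
```

Classifying by the sign of the leave-one-out prediction then gives a generalization-error estimate at roughly the cost of one fit, which is the spirit of the estimator described in the abstract.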
Abstract:
Phospholipids are complex and varied biomolecules that are susceptible to lipid peroxidation after attack by free radicals or electrophilic oxidants and can yield a large number of different oxidation products. There are many available methods for detecting phospholipid oxidation products, but also various limitations and problems. Electrospray ionization mass spectrometry allows the simultaneous but specific analysis of multiple species with good sensitivity and has a further advantage that it can be coupled to liquid chromatography for separation of oxidation products. Here, we explain the principles of oxidized phospholipid analysis by electrospray mass spectrometry and describe fragmentation routines for surveying the structural properties of the analytes, in particular precursor ion and neutral loss scanning. These allow targeted detection of phospholipid headgroups and identification of phospholipids containing hydroperoxides and chlorine, as well as the detection of some individual oxidation products by their specific fragmentation patterns. We describe instrument protocols for carrying out these survey routines on a QTrap5500 mass spectrometer and also for interfacing with reverse-phase liquid chromatography. The article highlights critical aspects of the analysis as well as some limitations of the methodology.
Abstract:
This paper reports on a new study conducted within a leading UK-based (and US-owned) car manufacturing company, looking at the satisfaction between parties within a newly formed third party logistics (3PL) relationship. The study contains a two-way assessment of the relationship (i.e. the vehicle manufacturer's Parts Supply and Logistics Operation's assessment of the 3PL's service, and the 3PL's assessment of the vehicle manufacturer's relationship management ability). The study principally used an online SERVQUAL survey (backed up with an online questionnaire and face-to-face interviews) for data collection. The paper discusses the background and problems that have arisen in the relationship, and analyses how each of the parties sees the other in terms of the service provided. The findings and recommendations presented to management are also outlined: these include factors such as the need for information sharing, reliability, flexibility, role specificity, trust and effective requirements management.
Abstract:
We propose and analyze two different Bayesian online algorithms for learning in discrete Hidden Markov Models and compare their performance with the already known Baldi-Chauvin Algorithm. Using the Kullback-Leibler divergence as a measure of generalization we draw learning curves in simplified situations for these algorithms and compare their performances.
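The use of the Kullback-Leibler divergence as a generalization measure can be illustrated with a toy Bayesian online learner. This sketch is not the paper's algorithm (which operates on full hidden Markov models): here a plain categorical distribution is learned online under a Dirichlet(1, ..., 1) prior, an assumption made for the example, and the KL divergence from the true distribution is recorded at checkpoints to form a learning curve.

```python
import math
import random

def kl_divergence(p, q, eps=1e-12):
    """D(p || q) for discrete distributions given as probability lists."""
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q) if pi > 0)

def online_learning_curve(true_p, stream, checkpoints):
    """Update Dirichlet pseudo-counts one observation at a time and
    record KL(true || posterior-mean estimate) at each checkpoint."""
    counts = [1.0] * len(true_p)          # Dirichlet(1, ..., 1) prior
    curve = []
    for t, x in enumerate(stream, start=1):
        counts[x] += 1.0
        if t in checkpoints:
            total = sum(counts)
            curve.append(kl_divergence(true_p, [c / total for c in counts]))
    return curve
```

As more data arrive, the posterior-mean estimate approaches the truth and the KL divergence decays toward zero; the paper draws analogous curves for its HMM learning algorithms.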
Abstract:
This thesis investigates the physical behaviour of solitons in wavelength division multiplexed (WDM) systems with dispersion management in a wide range of dispersion regimes. Background material is presented to show how solitons propagate in optical fibres, and key problems associated with real systems are outlined. Problems due to collision-induced frequency shifts are calculated using numerical simulation, and these results are compared with analytical techniques where possible. Different two-step dispersion regimes, as well as the special cases of uniform and exponentially profiled systems, are identified and investigated. In a shallow profile, the constituent second-order dispersions in the system are always close to the average soliton value. It is shown that collision-induced frequency shifts in WDM soliton transmission systems are reduced with increasing dispersion management. New resonances in the collision dynamics are illustrated, due to the relative motion induced by the dispersion map. Third-order dispersion is shown to modify the effects of collision-induced timing jitter, and third-order compensation is investigated. In all cases pseudo-phase-matched four-wave mixing was found to be insignificant compared to the collision-induced frequency shift in causing deterioration of data. It is also demonstrated that all these effects are additive with that of Gordon-Haus jitter.
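Numerical simulations of soliton propagation of the kind described above are typically built on the split-step Fourier method for the nonlinear Schrödinger equation. The thesis does not state its exact scheme, so the following is a generic sketch in normalized soliton units, i u_z + u_tt/2 + |u|² u = 0, in which a fundamental sech-shaped soliton should propagate without changing shape; the grid and step sizes are arbitrary choices for the example.

```python
import numpy as np

def split_step_nlse(u0, n_steps, dz, dt):
    """Propagate an envelope u(t) over n_steps * dz of normalized
    fibre with the symmetrized split-step Fourier method: half a
    dispersion step, a full Kerr-nonlinearity step, half a
    dispersion step."""
    u = np.asarray(u0, dtype=complex)
    w = 2.0 * np.pi * np.fft.fftfreq(len(u), d=dt)  # angular frequencies
    half_disp = np.exp(-0.5j * w**2 * dz / 2.0)     # exp(-i w^2 dz / 4)
    for _ in range(n_steps):
        u = np.fft.ifft(half_disp * np.fft.fft(u))
        u = u * np.exp(1j * np.abs(u)**2 * dz)      # Kerr phase rotation
        u = np.fft.ifft(half_disp * np.fft.fft(u))
    return u
```

Collision-induced frequency shifts between WDM channels can then be studied by launching two solitons at different centre frequencies and tracking their spectral centroids as they propagate.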
Abstract:
The following thesis presents results obtained from both numerical simulation and laboratory experimentation (both of which were carried out by the author). When data is propagated along an optical transmission line, timing irregularities such as timing jitter and phase wander can occur. Traditionally these timing problems would have been corrected by converting the optical signal into the electrical domain and then compensating for the timing irregularity before converting the signal back into the optical domain. However, this thesis proposes a potential solution that remains completely in the optical domain, eliminating the need for electronics. This is desirable as optical processing not only reduces the latency that its electronic counterpart incurs, it also holds the possibility of an increase in overall speed. A scheme was proposed which utilises the principle of wavelength conversion to dynamically convert timing irregularities (timing jitter and phase wander) into a change in wavelength (this occurs on a bit-by-bit level, so timing jitter and phase wander can be compensated for simultaneously). This was achieved by optically sampling a linearly chirped, locally generated clock source (the sampling function was achieved using a nonlinear optical loop mirror). The data, now with each bit or code word having a unique wavelength, is then propagated through a dispersion compensation module. The dispersion compensation effectively re-aligns the data in time, and so the timing irregularities are removed. The principle of operation was tested using computer simulation before being re-tested in a laboratory environment. A second stage was added to the device to create 3R regeneration. The second stage simply converts the timing-suppressed data back into a single wavelength. By controlling the relative timing displacement between stage one and stage two, the wavelength that is finally produced can be controlled.
Abstract:
The objective of this article is to give an overview of the history of the development and problems of gene therapy, while also considering the ethical and moral issues surrounding the application of the technology.
Abstract:
The aim of this thesis was to investigate the impact of changing values and attitudes toward work and the workplace in Britain, West Germany, France and Japan. A cross-national approach was adopted in order to gain a better understanding of differences and similarities in behaviour and to identify aspects specific to each society. Although the relationship between work and leisure has been thoroughly examined and there is a growing body of literature on changes in the values associated with these two phenomena, little research has been carried out into leisure at work. Studies of work time have tended to consider it as a homogeneous block, whereas recent research suggests that more attention should be devoted to unravelling the multiple uses of time at the workplace. The present study sought to review and analyse this new approach to the study of work time, and special attention is devoted to an examination of definitions of leisure, recreation, free time and work within the context of the workplace. The cross-cultural comparative approach gave rise to several problems due to the number of countries involved and the unusual combination of factors being investigated. The main difficulties were differences in the amount and quality of literature available, the non-comparability of existing data, definitions of concepts and socio-linguistic terms, and problems over access to organizations for fieldwork. Much of the literature generalizes about patterns of behaviour, and few authors isolate factors specific to particular societies. In this thesis new empirical work is therefore used to ascertain the extent to which generalizations can be made from the literature, and characteristics peculiar to each of the four countries are identified. White-collar employees in large, broadly comparable companies were studied using identical questionnaires in the appropriate language. Respondents selected were men and women, aged between 20 and 65 years, and either managers or non-managers.
Patterns of leisure at work were found to be broadly similar in the national contexts, but with the Japanese and the West Germans experiencing the least leisure at work, and the British and the French perceiving the most. The general trend seems to be toward convergence of attitudes regarding leisure at work in the four countries. Explanations for variations in practice were sought within the wider societal contexts of each country.
Abstract:
With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
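The trade-off the thesis studies, static placement versus dynamic balancing with process migration, can be caricatured in a few lines. This is a toy discrete-time model, not the thesis's UNIX-based simulator: the one-migration-per-step policy, the imbalance threshold, and the one-unit migration cost are assumptions made for the illustration.

```python
def makespan(num_procs, initial_tasks, migrate=False, threshold=1):
    """Run a toy multiprocessor to completion and return the number of
    time steps taken.  initial_tasks[p] is a list of task service
    times queued on processor p.  With migrate=True, at most one task
    per step moves from the most to the least loaded queue (never the
    head task, which may be partially served), paying one extra unit
    of work as migration cost."""
    queues = [list(q) for q in initial_tasks]
    assert len(queues) == num_procs
    t = 0
    while any(queues):
        if migrate:
            loads = [sum(q) for q in queues]
            hi = loads.index(max(loads))
            lo = loads.index(min(loads))
            if loads[hi] - loads[lo] > threshold and len(queues[hi]) > 1:
                task = queues[hi].pop()          # migrate the newest task
                queues[lo].append(task + 1)      # +1 unit migration cost
        for q in queues:                         # each processor serves one unit
            if q:
                q[0] -= 1
                if q[0] == 0:
                    q.pop(0)
        t += 1
    return t
```

On a workload where all tasks arrive at one processor, migration sharply reduces the makespan; the explicit migration-cost term is what lets even this caricature express the overhead side of the trade-off that the simulated system is designed to expose.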
Abstract:
This study aims to investigate to what extent the views of the managers of the enterprises to be privatized are a barrier to smooth implementation of privatization, as opposed to other problems. Accordingly, the research tackles two main issues: identification and analysis of the major problems encountered in the implementation of the Egyptian privatization programme, and at which level these problems exist, while proposing different approaches to tackle them; and the views of public sector top and middle-level managers regarding the main issues of privatization. The study relies upon a literature survey, interviews with stakeholders, a survey of managers' attitudes and several illustrative case studies. A model of "good practice" for the smooth and effective implementation of privatization has been designed. Practice in Egypt has then been studied and compared with the "good practice" model. Lack of strictness and firmness in implementing the announced privatization programme has been found to be a characteristic of Egyptian practice. This is partly attributable to the inadequacy of the programme and partly to the different obstacles to implementation. The main obstacles are the doubtful desirability of privatization on the part of the managers at different implementation levels, resistance of stakeholders, inadequacy of the legal framework governing privatization, redundant labour, lack of an efficient monitoring system allowing for accountability, inefficient marketing of privatization, ineffective communication, insufficient information at different levels, and problems related to valuation and selling procedures. A large part of the thesis is concerned with SOE (State Owned Enterprise) managers' attitudes toward and understanding of privatization (appraised through surveys). Although most managers have stated their acceptance of privatization, many of their responses show that they do not accept selling SOEs.
They understand privatization to include enterprise reform and restructuring, changing procedures and giving more authority to company executives, but not necessarily as selling SOEs. The majority of managers still see many issues that have to be addressed for smooth implementation of privatization, e.g. insufficiency of information, incompleteness of the legal framework, restructuring and labour problems. The main contribution to knowledge of this thesis is the study of the problems of implementing privatization in developing countries, especially managers' resistance to privatization as a major change, partly because of the threat it poses and partly because of a lack of understanding of privatization and of the implications of operating private businesses. A programme of persuading managers and offsetting the unfavourable effects is recommended as an outcome of the study. Five different phrases and words for the national Index to Theses are: Egypt, privatization, implementation of privatization, problems of implementing privatization and managers' attitudes towards privatization.
Abstract:
Alginate is widely used as a viscosity enhancer in many different pharmaceutical formulations. The aim of this thesis is to quantitatively describe the functions of this polyelectrolyte in pharmaceutical systems. To do this the techniques used were Viscometry, Light Scattering, Continuous and Oscillatory Shear Rheometry, Numerical Analysis and Diffusion. Molecular characterization of the Alginate was carried out using Viscometry and Light Scattering to determine the molecular weight, the radius of gyration, the second virial coefficient and the Kuhn statistical segment length. The results showed good agreement with similar parameters obtained in previous studies. By blending Alginate with other polyelectrolytes, Xanthan Gum and 'Carbopol', in various proportions and with various methods of low and high shear preparation, a very wide range of dynamic rheological properties was found. Using oscillatory testing, the parameters often varied over several decades of magnitude. It was shown that the determination of the viscous and elastic components is particularly useful in describing the rheological 'profiles' of suspending agent blends and provides a step towards the non-empirical formulation of pharmaceutical disperse systems. Using numerical analysis of equations describing planar diffusion, it was shown that the analysis of drug release profiles alone does not provide unambiguous information about the mechanism of rate control. These principles were applied to the diffusion of Ibuprofen in Calcium Alginate gels. For diffusion in such non-Newtonian systems, emphasis was placed on the use of the elastic as well as the viscous component of viscoelasticity. It was found that the diffusion coefficients were relatively unaffected by increases in polymer concentration up to 5 per cent, yet the elasticities measured by oscillatory shear rheometry were increased. This was interpreted in the light of several theories of diffusion in gels.
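The kind of numerical analysis of planar diffusion referred to above can be sketched with an explicit finite-difference (FTCS) solution of Fick's second law, c_t = D c_xx, in a gel slab with a perfect-sink release surface. This is a generic illustration, not the thesis's actual scheme; the geometry (zero-flux wall at x = 0, sink at x = L), the uniform initial loading, and all parameter values are assumptions made for the example.

```python
import numpy as np

def release_profile(D=1.0, L=1.0, nx=51, dt=1e-4, n_steps=5000):
    """FTCS solution of c_t = D * c_xx on a slab [0, L]: zero-flux
    wall at x = 0, perfect sink at x = L, uniform initial loading.
    Returns the cumulative fraction of drug released after each
    time step."""
    dx = L / (nx - 1)
    r = D * dt / dx**2
    assert r <= 0.5, "FTCS stability requires D*dt/dx^2 <= 1/2"
    c = np.ones(nx)
    c[-1] = 0.0                                   # sink boundary

    def mass(c):                                  # trapezoidal rule
        return dx * (0.5 * c[0] + c[1:-1].sum() + 0.5 * c[-1])

    m0 = mass(c)
    released = []
    for _ in range(n_steps):
        interior = c[1:-1] + r * (c[2:] - 2.0 * c[1:-1] + c[:-2])
        left = c[0] + 2.0 * r * (c[1] - c[0])     # mirror node: zero flux
        c = np.concatenate(([left], interior, [0.0]))
        released.append(1.0 - mass(c) / m0)
    return released
```

The resulting cumulative-release curve grows like the square root of time at early times, which is exactly the kind of profile whose ambiguity with respect to the rate-controlling mechanism the thesis discusses.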