952 results for Numerical example


Relevance:

20.00%

Publisher:

Abstract:

We study a class of models of correlated random networks in which vertices are characterized by hidden variables controlling the establishment of edges between pairs of vertices. We find analytical expressions for the main topological properties of these models as a function of the distribution of hidden variables and the probability of connecting vertices. The expressions obtained are checked by means of numerical simulations in a particular example. The general model is extended to describe a practical algorithm to generate random networks with an a priori specified correlation structure. We also present an extension of the class, to map nonequilibrium growing networks to networks with hidden variables that represent the time at which each vertex was introduced in the system.
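The generic recipe described above can be sketched in a few lines: draw a hidden variable for each vertex, then connect each pair independently with a probability that depends on both variables. The exponential distribution and multiplicative kernel used below are illustrative choices for this sketch, not the specific ingredients of the study.

```python
import random

def hidden_variable_graph(n, sample_h, connect_prob, seed=0):
    """Generate a random graph in which each vertex i carries a hidden
    variable h_i ~ sample_h(rng), and each pair (i, j) is linked
    independently with probability connect_prob(h_i, h_j)."""
    rng = random.Random(seed)
    h = [sample_h(rng) for _ in range(n)]
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < connect_prob(h[i], h[j]):
                edges.append((i, j))
    return h, edges

# Illustrative choices: exponential hidden variables and a
# multiplicative linking kernel capped at probability 1.
h, edges = hidden_variable_graph(
    n=200,
    sample_h=lambda rng: rng.expovariate(1.0),
    connect_prob=lambda a, b: min(1.0, a * b / 200),
)
```

Because the degree of a vertex is controlled entirely by its hidden variable and the kernel, choosing both appropriately yields a network with a prescribed correlation structure, which is the practical algorithm the abstract refers to.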

Relevance:

20.00%

Publisher:

Abstract:

Traffic safety engineers are among the early adopters of Bayesian statistical tools for analyzing crash data. As in many other areas of application, empirical Bayes methods were their first choice, perhaps because they represent an intuitively appealing yet relatively easy-to-implement alternative to purely classical approaches. With the enormous progress in numerical methods made in recent years and with the availability of free, easy-to-use software that permits implementing a fully Bayesian approach, however, there is now ample justification to progress towards fully Bayesian analyses of crash data. The fully Bayesian approach, in particular as implemented via multi-level hierarchical models, has many advantages over the empirical Bayes approach. In a full Bayesian analysis, prior information and all available data are seamlessly integrated into posterior distributions on which practitioners can base their inferences. All uncertainties are thus accounted for in the analyses and there is no need to pre-process data to obtain Safety Performance Functions and other such prior estimates of the effect of covariates on the outcome of interest. In this light, fully Bayesian methods may well be less costly to implement and may result in safety estimates with more realistic standard errors. In this manuscript, we present the full Bayesian approach to analyzing traffic safety data and focus on highlighting the differences between the empirical Bayes and the full Bayes approaches. We use an illustrative example to discuss a step-by-step Bayesian analysis of the data and to show some of the types of inferences that are possible within the full Bayesian framework.
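To make the contrast concrete, here is a minimal conjugate sketch (an assumed toy model, not the manuscript's hierarchical analysis): yearly crash counts at one site modelled as Poisson with a Gamma prior on the underlying rate. The full posterior carries forward the uncertainty that an empirical point estimate would discard; the prior parameters a0 and b0 are hypothetical.

```python
import math

def gamma_poisson_posterior(y, exposure, a0, b0):
    """Conjugate full-Bayes update for a crash-rate model:
    each count y_i ~ Poisson(theta * exposure), theta ~ Gamma(a0, b0).
    Returns the Gamma(a, b) posterior over theta together with its
    mean and standard deviation, so the uncertainty is retained
    rather than collapsed to a point estimate."""
    a = a0 + sum(y)
    b = b0 + exposure * len(y)
    mean = a / b
    sd = math.sqrt(a) / b
    return a, b, mean, sd

# Three years of counts at one hypothetical site, unit exposure,
# and a weakly informative Gamma(1, 1) prior.
a, b, mean, sd = gamma_poisson_posterior([3, 5, 4], 1.0, 1.0, 1.0)
```

A fully Bayesian analysis would additionally place priors on a0 and b0 themselves (a hierarchical model) instead of estimating them from the data, which is exactly the empirical-versus-full distinction the abstract highlights.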

Relevance:

20.00%

Publisher:

Abstract:

We present an analytic and numerical study of the effects of external fluctuations in active media. Our analytical methodology transforms the initial stochastic partial differential equations into an effective set of deterministic reaction-diffusion equations. As a result we are able to explain and make quantitative predictions on the systematic and constructive effects of the noise, for example, target patterns created out of noise and traveling or spiral waves sustained by noise. Our study includes the case of realistic noises with temporal and spatial structures.
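A numerical study of this kind typically starts from an Euler-Maruyama discretization of the stochastic partial differential equation. The sketch below integrates a one-dimensional reaction-diffusion equation with additive space-time Gaussian noise; the bistable cubic reaction term u - u^3 is an illustrative stand-in, not the active-medium model of the paper.

```python
import random

def simulate_stochastic_rd(n=100, steps=500, dt=0.01, dx=1.0,
                           D=1.0, sigma=0.1, seed=0):
    """Euler-Maruyama integration of the 1-D stochastic
    reaction-diffusion equation
        du/dt = u - u**3 + D * d2u/dx2 + sigma * xi(x, t)
    on a periodic lattice, where xi is Gaussian white noise
    in space and time."""
    rng = random.Random(seed)
    u = [0.0] * n
    sqrt_dt = dt ** 0.5
    for _ in range(steps):
        # Discrete Laplacian with periodic boundary conditions.
        lap = [(u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n]) / dx**2
               for i in range(n)]
        # Deterministic drift plus sqrt(dt)-scaled noise increment.
        u = [u[i] + dt * (u[i] - u[i]**3 + D * lap[i])
             + sigma * sqrt_dt * rng.gauss(0.0, 1.0)
             for i in range(n)]
    return u
```

With sigma = 0 the homogeneous rest state u = 0 persists forever; switching the noise on kicks the field away from it, which is the simplest caricature of the noise-sustained structures the abstract describes. Structured (colored) noise would replace the independent Gaussian increments with spatially or temporally correlated ones.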

Relevance:

20.00%

Publisher:

Abstract:

Metabolic syndrome represents a grouping of risk factors closely linked to cardiovascular diseases and diabetes. At first sight, nuclear medicine has no direct application in cardiology at the level of primary prevention, but positron emission tomography is a non-invasive imaging technique that can assess myocardial perfusion as well as endothelium-dependent coronary vasomotion, a surrogate marker of the cardiovascular event rate, thus finding an application in the study of coronary pathophysiology. As the prevalence of the metabolic syndrome is still unknown in Switzerland, we will estimate it from data available in the framework of a health promotion program. Based on the deleterious effect on the endothelium already observed with two components, we will estimate the number of persons at risk in Switzerland.

Relevance:

20.00%

Publisher:

Abstract:

INTRODUCTION: The importance of micromovements in the mechanism of aseptic loosening is clinically difficult to evaluate. To complete the analysis of a series of total knee arthroplasties (TKA), we used a three-dimensional numerical model to study the micromovements of the tibial implant. MATERIAL AND METHODS: Fifty-one patients (with 57 cemented Porous Coated Anatomic TKAs) were reviewed (mean follow-up 4.5 years). Radiolucency at the tibial bone-cement interface was sought on the AP radiographs and divided into 7 areas. The distribution of the radiolucency was then correlated with the axis of the lower limb as measured on the orthoradiograms. The three-dimensional numerical model is based on the finite element method. It allowed the measurement of the cemented prosthetic tibial implant's displacements and the micromovements generated at the bone-cement interface. A total load (2000 Newtons) was applied at first vertically and asymmetrically on the tibial plateau, thereby simulating an axial deviation of the lower limbs. The vector's posterior inclination then permitted the addition of a tangential component to the axial load. This type of effort is generated by complex biomechanical phenomena such as knee flexion. RESULTS: 81 per cent of the 57 knees had a radiolucent line of at least 1 mm at one or more of the tibial cement-epiphysis junctional areas. The distribution of these lucent lines showed that they occurred more frequently at the periphery of the implant. The lucent lines appeared most often under the unloaded margin of the tibial plateau when axial deviation of the lower limbs was present. Numerical simulations showed that asymmetrical loading on the tibial plateau induced a subsidence of the loaded margin (0-100 microns) and lifting off at the opposite border (0-70 microns). The postero-anterior tangential component induced an anterior displacement of the tibial implant (160-220 microns), and horizontal micromovements with a non-homogeneous distribution at the bone-cement interface (28-54 microns). DISCUSSION: Comparison of clinical and numerical results showed a relation between the development of radiolucent lines and the unloading of the tibial implant's margin. The deleterious effect of axial deviation of the lower limbs is thereby demonstrated. The irregular distribution of lucent lines under the tibial plateau was similar to the distribution of micromovements at the bone-cement interface when tangential forces were present. A causal relation between the two phenomena could not, however, be established. Numerical simulation is a truly useful method of study; it permits the calculation of micromovements which are relative, non-homogeneous and of very low amplitude. However, comparative clinical studies remain essential to ensure the credibility of the results.

Relevance:

20.00%

Publisher:

Abstract:

AIM: Specific factors responsible for interindividual variability should be identified, and their contribution quantified, to improve the usefulness of biological monitoring. Among others, age is an easily identifiable determinant, which could have an important impact on biological variability. MATERIALS AND METHODS: A compartmental toxicokinetic model developed in previous studies for a series of metallic and organic compounds was applied to the description of age differences. Young male physiological and metabolic parameters, based on Reference Man information, were taken from preceding studies and were modified to take age into account, based on available information about age differences. RESULTS: Numerical simulation using the kinetic model with the modified parameters indicates in some cases important differences due to age. The expected changes are mostly of the order of 10-20%, but differences up to 50% were observed in some cases. CONCLUSION: These differences appear to depend on the chemical and on the biological entity considered. Further work should be done to improve our estimates of these parameters, by considering, for example, uncertainty and variability in these parameters.
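The effect of age-modified parameters can be illustrated with the simplest compartmental building block, a one-compartment model after an instantaneous dose. This is a deliberately reduced sketch of the approach, not the multi-compartment model of the study; all parameter values below are hypothetical.

```python
import math

def blood_concentration(dose, volume, clearance, t):
    """One-compartment kinetics after an instantaneous dose:
    C(t) = (dose / volume) * exp(-(clearance / volume) * t)."""
    return (dose / volume) * math.exp(-(clearance / volume) * t)

# Hypothetical age adjustment: an older subject modelled with
# reduced clearance and a slightly smaller distribution volume,
# evaluated at the same sampling time t = 8 h after a 100 mg dose.
young = blood_concentration(dose=100.0, volume=42.0, clearance=5.0, t=8.0)
old = blood_concentration(dose=100.0, volume=40.0, clearance=3.5, t=8.0)
ratio = old / young  # relative difference attributable to age
```

Even this toy model shows how modest, physiologically plausible shifts in clearance and volume propagate into tens of percent of difference in the biomarker level at sampling time, the magnitude range the abstract reports.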

Relevance:

20.00%

Publisher:

Abstract:

Small or decreasing populations call for emergency actions such as captive breeding programs. Such programs aim at rapidly increasing population sizes in order to reduce the loss of genetic variability and to avoid possible Allee effects. The Lesser Kestrel Falco naumanni is one of the species that is currently supported in several captive breeding programs at various locations. Here, we model the demographic and genetic consequences of potential management strategies that are based on offspring sex ratio manipulation. Increased population growth could be achieved by manipulating female condition and/or male attractiveness in the captive breeders and consequently shifting the offspring sex ratio towards more female offspring, which are then used for reintroduction. Fragmenting populations into wild-breeding and captive-breeding demes and manipulating the population sex ratio both immediately increase the inbreeding coefficient in the next generation (i.e. decrease Ne) but may, in the long term, reduce the loss of genetic variability if population growth is restricted by the number of females. We use the Lesser Kestrel and the wealth of information that is available on this species to predict the long-term consequences of various kinds of sex-ratio manipulation. We find that, in our example and possibly in many other cases, a sex-ratio manipulation that seems realistic could have a beneficial effect on the captive breeding program. However, the possible long-term costs and benefits of such measures need to be carefully evaluated.
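The tension the abstract describes, where more female offspring boost growth but a skewed sex ratio depresses Ne, is visible already in Wright's classical formula for effective population size under unequal sex ratio, a standard ingredient of such demographic calculations (the full model of the study is of course richer):

```python
def effective_population_size(n_males, n_females):
    """Wright's formula for the effective size of a population
    with unequal numbers of breeding males and females:
        N_e = 4 * N_m * N_f / (N_m + N_f)."""
    return 4 * n_males * n_females / (n_males + n_females)

# For a fixed total of 100 breeders, skewing toward females
# immediately reduces N_e even though it raises growth potential.
balanced = effective_population_size(50, 50)  # 100.0
skewed = effective_population_size(20, 80)    # 64.0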

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present an efficient numerical scheme for the recently introduced geodesic active fields (GAF) framework for geometric image registration. This framework considers the registration task as a weighted minimal surface problem. Hence, the data term and the regularization term are combined through multiplication in a single, parametrization-invariant and geometric cost functional. The multiplicative coupling provides an intrinsic, spatially varying and data-dependent tuning of the regularization strength, and the parametrization invariance allows working with images of nonflat geometry, generally defined on any smoothly parametrizable manifold. The resulting energy-minimizing flow, however, has poor numerical properties. Here, we provide an efficient numerical scheme that uses a splitting approach; data and regularity terms are optimized over two distinct deformation fields that are constrained to be equal via an augmented Lagrangian approach. Our approach is more flexible than standard Gaussian regularization, since one can interpolate freely between isotropic Gaussian and anisotropic TV-like smoothing. In this paper, we compare the geodesic active fields method with the popular Demons method and three more recent state-of-the-art algorithms: NL-optical flow, MRF image registration, and landmark-enhanced large displacement optical flow. Thus, we can show the advantages of the proposed FastGAF method. It compares favorably against Demons, both in terms of registration speed and quality. Over the range of example applications, it also consistently produces results not far from more dedicated state-of-the-art methods, illustrating the flexibility of the proposed framework.
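The splitting idea, optimizing two copies of the unknown that are constrained to agree through an augmented Lagrangian, can be caricatured in one dimension. This is a toy scalar sketch of the general technique, not the actual FastGAF scheme; f and g merely stand in for the data and regularity terms.

```python
def admm_split(grad_f, grad_g, rho=1.0, lr=0.1, iters=200):
    """Toy augmented-Lagrangian splitting for min f(u) + g(v)
    subject to u = v: alternate inner gradient descent on u and v,
    then update the scalar multiplier lam (dual ascent)."""
    u = v = lam = 0.0
    for _ in range(iters):
        for _ in range(20):  # minimize over u with v, lam fixed
            u -= lr * (grad_f(u) + lam + rho * (u - v))
        for _ in range(20):  # minimize over v with u, lam fixed
            v -= lr * (grad_g(v) - lam - rho * (u - v))
        lam += rho * (u - v)  # penalize residual disagreement
    return u, v

# f(u) = (u - 2)^2 plays the data term, g(v) = (v - 4)^2 the
# regularity term; the constrained minimizer is u = v = 3.
u, v = admm_split(lambda u: 2 * (u - 2), lambda v: 2 * (v - 4))
```

In the registration setting, u and v become full deformation fields and each inner minimization exploits the structure of its own term, which is what makes the split scheme efficient compared with descending the coupled flow directly.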

Relevance:

20.00%

Publisher:

Abstract:

This thesis examines the interplay between state regulation and the way organisations define performance. Performance is generally understood to be a multidimensional concept, but the extent to which its different facets are shaped by regulation remains an understudied question. This thesis aims to address this question and provide at least a partial answer to it. To do so, it examines whether the level of regulation amplifies or abates the multidimensionality of regulated entities' performance definition, i.e. the way they define the concept of performance. The leading question is whether an organisation's performance definition can be associated with the regulatory intensity its environment confronts it with. Moreover, the study explores whether the type of ownership, public or private, plays a role in regard to how a regulated entity defines performance. In order to undertake this investigation, the thesis focuses on the performance definitions of organisations in six different sport betting and lottery regulatory settings. Qualitative data is gathered from primary and secondary documents as well as through semi-structured interviews with chief executive officers (CEO), members of executive management and gambling experts in each of these countries. The thesis concludes that the performance definitions of the organisations under study are indeed multidimensional, as well as clearly influenced by their respective regulatory environments. However, not all performance dimensions identified in the literature are present, nor can they all be estimated to be part of the performance definition. In addition, the public-private difference in defining performance, as conceptualised in the literature, seems to be abated in a regulated environment. The central role played by regulation in regard to the multidimensionality of the performance definition partially outweighs the effect of the nature of ownership.

Relevance:

20.00%

Publisher:

Abstract:

Regulatory gene networks contain generic modules, like those involving feedback loops, which are essential for the regulation of many biological functions (Guido et al. in Nature 439:856-860, 2006). We consider a class of self-regulated genes which are the building blocks of many regulatory gene networks, and study the steady-state distribution of the associated Gillespie algorithm by providing efficient numerical algorithms. We also study a regulatory gene network of interest in gene therapy, using mean-field models with time delays. Convergence of the related time-nonhomogeneous Markov chain is established for a class of linear catalytic networks with feedback loops.
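The Gillespie algorithm mentioned above simulates exact stochastic trajectories of such a network by drawing exponential waiting times from the total reaction propensity. The sketch below applies it to a single negatively self-regulated gene; the repression propensity k / (1 + x) is an illustrative choice, not the specific network studied in the paper.

```python
import random

def gillespie_self_regulated(k=10.0, gamma=1.0, t_end=100.0, seed=0):
    """Gillespie SSA for a self-regulated gene: protein copy number x
    is produced at rate k / (1 + x) (negative feedback, illustrative)
    and degraded at rate gamma * x. Returns x at time t_end."""
    rng = random.Random(seed)
    t, x = 0.0, 0
    while t < t_end:
        a1 = k / (1 + x)          # production propensity
        a2 = gamma * x            # degradation propensity
        a0 = a1 + a2
        t += rng.expovariate(a0)  # exponential waiting time
        if rng.random() < a1 / a0:
            x += 1                # production event
        else:
            x -= 1                # degradation event
    return x
```

Sampling this routine with many seeds (or recording the trajectory over time) approximates the steady-state distribution whose efficient numerical computation is the subject of the abstract; the mean-field fixed point here solves k / (1 + x) = gamma * x.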

Relevance:

20.00%

Publisher:

Abstract:

With the dramatic increase in the volume of experimental results in every domain of the life sciences, assembling pertinent data and combining information from different fields has become a challenge. Information is dispersed over numerous specialized databases and is presented in many different formats. Rapid access to experiment-based information about well-characterized proteins helps predict the function of uncharacterized proteins identified by large-scale sequencing. In this context, universal knowledgebases play essential roles in providing access to data from complementary types of experiments and serving as hubs with cross-references to many specialized databases. This review outlines how the value of experimental data is optimized by combining high-quality protein sequences with complementary experimental results, including information derived from protein 3D-structures, using as an example the UniProt knowledgebase (UniProtKB) and the tools and links provided on its website (http://www.uniprot.org/). It also discusses precautions that are necessary for successful predictions and extrapolations.