958 results for Standard models


Relevance:

30.00%

Abstract:

We study the dynamics of a growing crystalline facet where the growth mechanism is controlled by the geometry of the local curvature. A continuum model in (2+1) dimensions, developed in analogy with the Kardar-Parisi-Zhang (KPZ) model, is considered for this purpose. Following standard coarse-graining procedures, it is shown that in the large-time, long-distance limit the continuum model predicts a curvature-independent KPZ phase, thereby suppressing all explicit effects of curvature and local pinning in the system in the "perturbative" limit. A direct numerical integration of this growth equation, in 1+1 dimensions, supports this observation below a critical parametric range, above which generic instabilities, in the form of isolated pillared structures, lead to deviations from standard scaling behaviour. Possibilities of controlling this instability by introducing statistically "irrelevant" (in the sense of the renormalisation group) higher-order nonlinearities are also discussed.
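For reference, the standard KPZ equation in (2+1) dimensions to which the coarse-grained model reduces (the curvature-dependent terms of the facet model are not reproduced here) reads

\partial_t h(\mathbf{x},t) = \nu \nabla^2 h + \frac{\lambda}{2}(\nabla h)^2 + \eta(\mathbf{x},t), \qquad \langle \eta(\mathbf{x},t)\,\eta(\mathbf{x}',t') \rangle = 2D\,\delta^2(\mathbf{x}-\mathbf{x}')\,\delta(t-t'),

where h is the height field, \nu the surface tension, \lambda the strength of the nonlinearity and \eta a Gaussian white noise.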

Relevance:

30.00%

Abstract:

The task of constructing smooth and stable decision rules in logical recognition models is considered. Logical regularities of classes are defined as conjunctions of one-place predicates that test the membership of feature values in intervals of the real axis. The conjunctions are true on special non-extendable subsets of the reference objects of some class and are optimal. The standard approach to constructing linear decision rules for given sets of logical regularities consists in the realization of voting schemes, with the weighting coefficients of the voting procedures either chosen heuristically or obtained as solutions of a complex optimization task. Modifications of linear decision rules are proposed that are based on the search for maximal estimates of standard objects for their classes and use approximations of logical regularities by smooth sigmoid functions.
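As a rough illustration of the smoothing step, a hard interval predicate can be replaced by a product of logistic sigmoids. A minimal sketch follows; the function names and the steepness parameter k are illustrative and not taken from the paper:

import numpy as np

def interval_predicate(x, a, b):
    """Hard one-place predicate: 1 if the feature value x lies in [a, b]."""
    return 1.0 if a <= x <= b else 0.0

def smooth_interval(x, a, b, k=10.0):
    """Sigmoid approximation of the interval predicate.
    The product of two logistic functions approaches the hard
    indicator of [a, b] as the steepness k grows."""
    sigma = lambda t: 1.0 / (1.0 + np.exp(-t))
    return sigma(k * (x - a)) * sigma(k * (b - x))

def smooth_regularity(x, intervals, k=10.0):
    """Smoothed conjunction of interval predicates: a product over the
    features occurring in the logical regularity.
    intervals maps a feature index j to its interval (a, b)."""
    return float(np.prod([smooth_interval(x[j], a, b, k)
                          for j, (a, b) in intervals.items()]))

Votes of such smoothed regularities are differentiable in the interval bounds, which is what makes gradient-based optimization of the weighted decision rule feasible.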

Relevance:

30.00%

Abstract:

2000 Mathematics Subject Classification: 62G08, 62P30.

Relevance:

30.00%

Abstract:

Oxygen and carbon isotope measurements were carried out on tests of the planktic foraminifer N. pachyderma (sin.) from eight sediment cores taken from the eastern Arctic Ocean, the Fram Strait, and the Iceland Sea, in order to reconstruct Arctic Ocean and Norwegian-Greenland Sea circulation patterns and ice cover during the last 130,000 years. In addition, the influence of ice, temperature and salinity effects on the isotopic signal was quantified. Isotope measurements on foraminifers from sediment surface samples were used to elucidate the ecology of N. pachyderma (sin.). Changes in the oxygen and carbon isotope composition of N. pachyderma (sin.) from sediment surface samples document the horizontal and vertical changes of water mass boundaries controlled by water temperature and salinity, because N. pachyderma (sin.) shows drastic changes in depth habitat depending on the water mass properties. It could be shown that a regionally and spatially varying apparent increase of the ice effect occurred in the investigated areas, especially during Termination I through the direct advection of meltwater from nearby continents, and during the terminations and in interglacials through the supply of isotopically light water from rivers. A northward-increasing overprint of the 'global' ice effect, from the Norwegian-Greenland Sea to the Arctic Ocean, could not be demonstrated. By means of a model, the influence of temperature and salinity on the global ice volume signal during the last 130,000 years was assessed. In combination with the results of this study, the model was the basis for a reconstruction of the paleoceanographic development of the Arctic Ocean and the Norwegian-Greenland Sea during this time interval. The conception of a relatively thick and permanent sea ice cover in the Nordic Seas during glacial times should be replaced by the model of a seasonally and regionally highly variable ice cover. Only during isotope stage 5e may there have been local deep water formation in the Fram Strait.

Relevance:

30.00%

Abstract:

Diffuse intrinsic pontine glioma (DIPG) is a rare and incurable brain tumor that arises in the brainstem of children, predominantly between the ages of 6 and 8. Its intricate morphology and involvement of normal pons tissue preclude surgical resection, and the standard of care today remains fractionated radiation alone. In the past 30 years, there have been no significant advances in the treatment of DIPG, largely because we lack good models of DIPG and therefore have little biological basis for treatment. In recent years, however, owing to increased biopsy and acquisition of autopsy specimens, research is beginning to unravel the genetic and epigenetic drivers of DIPG. Insight gleaned from these studies has led to improvements in approaches both to model these tumors in the lab and to potentially treat them in the clinic. This review will detail the initial strides toward modeling DIPG in animals, which included allograft and xenograft rodent models using non-DIPG glioma cells. Important advances in the field came with the development of in vitro cell and in vivo xenograft models derived directly from autopsy material of DIPG patients or from human embryonic stem cells. Finally, we will summarize the progress made in the development of genetically engineered mouse models of DIPG. Cooperation among studies incorporating all of these modeling systems, both to investigate the unique mechanisms of gliomagenesis in the brainstem and to test potential novel therapeutic agents in a preclinical setting, will result in improved treatments for DIPG patients.

Relevance:

30.00%

Abstract:

Background: Implementing effective antenatal care models is a key global policy goal. However, the mechanisms of action of these multi-faceted models that would allow widespread implementation are seldom examined and poorly understood. In existing care model analyses there is little distinction between what is done, how it is done, and who does it. A new evidence-informed quality maternal and newborn care (QMNC) framework identifies key characteristics of quality care. This offers the opportunity to identify systematically the characteristics of care delivery that may be generalizable across contexts, thereby enhancing implementation. Our objective was to map the characteristics of antenatal care models tested in Randomised Controlled Trials (RCTs) to a new evidence-based framework for quality maternal and newborn care; thus facilitating the identification of characteristics of effective care.

Methods: A systematic review of RCTs of midwifery-led antenatal care models. Mapping and evaluation of these models’ characteristics to the QMNC framework using data extraction and scoring forms derived from the five framework components. Paired team members independently extracted data and conducted quality assessment using the QMNC framework and standard RCT criteria.

Results: From 13,050 citations initially retrieved we identified 17 RCTs of midwifery-led antenatal care models from Australia (7), the UK (4), China (2), and Sweden, Ireland, Mexico and Canada (1 each). QMNC framework scores ranged from 9 to 25 (possible range 0–32), with most models reporting fewer than half the characteristics associated with quality maternity care. Description of care model characteristics was lacking in many studies, but was better reported for the intervention arms. Organisation of care was the best-described component. Underlying values and philosophy of care were poorly reported.

Conclusions: The QMNC framework facilitates assessment of the characteristics of antenatal care models. It is vital to understand all the characteristics of multi-faceted interventions such as care models: not only what is done but why it is done, by whom, and how it differs from the standard care package. By applying the QMNC framework we have established a foundation for future reports of intervention studies, so that the characteristics of individual models can be evaluated and the impact of any differences appraised.

Relevance:

30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

30.00%

Abstract:

Target space duality is one of the most profound properties of string theory. However, it customarily requires that the background fields satisfy certain invariance conditions in order to be performed consistently; for instance, the vector fields along the directions in which T-duality is performed have to generate isometries. In the present paper we examine in detail the possibility of performing T-duality along non-isometric directions. In particular, based on a recent work of Kotov and Strobl, we study gauged 2D sigma models in which gauge invariance for an extended set of gauge transformations imposes weaker constraints than in the standard case; notably, the corresponding vector fields are not Killing. This formulation enables us to follow a procedure analogous to the derivation of the Buscher rules and to obtain two dual models, by integrating out once the Lagrange multipliers and once the gauge fields. We show that this construction indeed works in non-trivial cases by examining an explicit class of examples based on step 2 nilmanifolds.
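For orientation, the standard Buscher rules that this procedure generalises read, in adapted coordinates where \theta parametrises the isometry direction and i, j label the remaining directions,

\tilde g_{\theta\theta} = \frac{1}{g_{\theta\theta}}, \qquad \tilde g_{\theta i} = \frac{B_{\theta i}}{g_{\theta\theta}}, \qquad \tilde B_{\theta i} = \frac{g_{\theta i}}{g_{\theta\theta}},

\tilde g_{ij} = g_{ij} - \frac{g_{\theta i} g_{\theta j} - B_{\theta i} B_{\theta j}}{g_{\theta\theta}}, \qquad \tilde B_{ij} = B_{ij} - \frac{g_{\theta i} B_{\theta j} - B_{\theta i} g_{\theta j}}{g_{\theta\theta}},

with g the target space metric and B the Kalb-Ramond field; at one loop the dilaton shifts as \tilde\phi = \phi - \frac{1}{2}\ln g_{\theta\theta}.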

Relevance:

30.00%

Abstract:

This dissertation proposes statistical methods to formulate, estimate and apply complex transportation models. Two main problems are addressed in the analyses conducted and presented in this dissertation. The first method solves an econometric problem and is concerned with the joint estimation of models that contain both discrete and continuous decision variables. The use of ordered models along with a regression is proposed, and their effectiveness is evaluated with respect to unordered models. Procedures to calculate and optimize the log-likelihood functions of both discrete-continuous approaches are derived, and the difficulties associated with the estimation of unordered models are explained. Numerical approximation methods based on the Genz algorithm are implemented in order to solve the multidimensional integral associated with the unordered modeling structure. The problems deriving from the lack of smoothness of the probit model around the maximum of the log-likelihood function, which makes the optimization and the calculation of standard deviations very difficult, are carefully analyzed. A methodology to perform out-of-sample validation in the context of a joint model is proposed. Comprehensive numerical experiments have been conducted on both simulated and real data. In particular, the discrete-continuous models are estimated and applied to vehicle ownership and use models on data extracted from the 2009 National Household Travel Survey.

The second part of this work offers a comprehensive statistical analysis of free-flow speed distributions; the method is applied to data collected on a sample of roads in Italy. A linear mixed model that includes speed quantiles in its predictors is estimated. Results show that there is no road effect in the analysis of free-flow speeds, which is particularly important for model transferability. A very general framework to predict random effects with few observations and incomplete access to model covariates is formulated and applied to predict the distribution of free-flow speed quantiles. The speed distribution of most road sections is successfully predicted; jack-knife estimates are calculated and used to explain why some sections are poorly predicted. Altogether, this work contributes to the literature in transportation modeling by proposing econometric model formulations for discrete-continuous variables, more efficient methods for the calculation of multivariate normal probabilities, and random effects models for free-flow speed estimation that take into account the survey design. All methods are rigorously validated on both real and simulated data.
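As a pointer to the kind of computation involved: SciPy's multivariate normal CDF is evaluated by numerical integration in the spirit of Genz's method, so the multidimensional probabilities that appear in unordered probit likelihoods can be sketched as follows. The mean, covariance and evaluation point below are made up for illustration and are not from the dissertation:

import numpy as np
from scipy.stats import multivariate_normal

# Illustrative values only.
mean = np.zeros(3)
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 1.0, 0.3],
                [0.2, 0.3, 1.0]])
upper = np.array([0.5, 1.0, -0.2])

# P(X1 <= 0.5, X2 <= 1.0, X3 <= -0.2) for X ~ N(mean, cov):
# there is no closed form, so the CDF is approximated numerically.
p = multivariate_normal(mean=mean, cov=cov).cdf(upper)
print(f"Trivariate normal probability: {p:.4f}")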

Relevance:

30.00%

Abstract:

Spinal cord injury (SCI) is a devastating condition resulting from trauma to the cord, in which a primary injury response leads to a secondary injury cascade that damages both glial and neuronal cells. Following trauma, the central nervous system (CNS) fails to regenerate due to a plethora of both intrinsic and extrinsic factors. These events lead to loss of both motor and sensory function and to lifelong disability and care for sufferers of SCI. Tremendous advances have been made in our understanding of the mechanisms behind axonal regeneration and remyelination of the damaged cord, and these have provided many promising therapeutic targets. However, very few have made it to clinical application, potentially owing to inadequate understanding of compound mechanisms of action and reliance on poor SCI models. This thesis describes the use of an established neural cell co-culture model of SCI as a medium-throughput screen for compounds with potential therapeutic properties. A number of compounds were screened, and one family of compounds, the modified heparins, was taken forward for more intensive investigation. Modified heparins (mHeps) are made up of the core heparin disaccharide unit with variable sulphation groups on the iduronic acid and glucosamine residues: 2-O-sulphate (C2), 6-O-sulphate (C6) and N-sulphate (N). The 2-O-sulphated (mHep6) and N-sulphated (mHep7) heparin isomers were shown to promote both neurite outgrowth and myelination in the SCI model. Both mHeps decreased oligodendrocyte precursor cell (OPC) proliferation and increased oligodendrocyte (OL) number adjacent to the lesion. However, the direct effects on the OL differ between the mHeps: mHep6 increased myelin internode length, whereas mHep7 increased overall cell size. It was further elucidated that these isoforms interact with and mediate both Wnt and FGF signalling. In OPC monoculture experiments, FGF2-treated OPCs displayed increased proliferation, but this effect was removed when the cells were co-treated with the mHeps, suggesting that the mHeps interact with the ligand and inhibit FGF2 signalling. Additionally, both mHeps could be partially mediating their effects through the Wnt pathway: the mHep effects on both myelination and neurite outgrowth were removed upon co-treatment with a Wnt signalling inhibitor, suggesting cell signalling mediation by ligand immobilisation and signalling activation as a mechanistic action of the mHeps. However, the initial methods employed in this thesis were not sufficient for a more detailed study of the effects the mHeps have on neurite outgrowth. This led to the design and development of a novel microfluidic device (MFD), which provides a platform for the study of axonal injury. The device has three chambers, with two chambers converging onto a central open-access chamber. This design allows axons from two points of origin to enter a chamber that can be subjected to injury, providing a platform in which targeted axonal injury can be inflicted and the regenerative capacity of a compound can be studied. In conclusion, this thesis contributes to and advances the study of SCI in two ways: 1) the identification and investigation of a novel set of compounds with therapeutic potential, i.e. desulphated modified heparins, which have multiple therapeutic properties and could both revolutionise the understanding of the basic pathological mechanisms underlying SCI and become a powerful therapeutic option; and 2) the development of a novel microfluidic device to study axonal biology in greater detail, specifically targeted axonal injury and treatment, providing a more representative model of SCI than standard in vitro models. The MFD could therefore lead to advances in the identification of factors and compounds relating to axonal regeneration.

Relevance:

30.00%

Abstract:

The complex singlet extension of the Standard Model (CxSM) is the simplest extension that provides scenarios for Higgs pair production with different masses. The model has two interesting phases: the dark matter phase, with a Standard Model-like Higgs boson, a new scalar and a dark matter candidate; and the broken phase, in which all three neutral scalars mix. In the latter phase, decays of a Higgs boson into a pair of different Higgs bosons are possible. In this study we analyse Higgs-to-Higgs decays in the framework of singlet extensions of the Standard Model (SM), with focus on the CxSM. After demonstrating that scenarios with large rates for such chain decays are possible, we perform a comparison between the NMSSM and the CxSM. We find that, based on Higgs-to-Higgs decays, the only possibility of distinguishing the two models at the LHC run 2 is through final states with two different scalars. This conclusion builds a strong case for searches for final states with two different scalars at the LHC run 2. Finally, we propose a set of benchmark points for the real and complex singlet extensions to be tested at the LHC run 2. They have been chosen such that the discovery prospects of the involved scalars are maximised and the dark matter constraints are fulfilled. Furthermore, for some of the points the theory is stable up to high energy scales. For the computation of the decay widths and branching ratios we developed the Fortran code sHDECAY, which is based on the implementation of the real and complex singlet extensions of the SM in HDECAY.
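For context, a commonly used form of the tree-level scalar potential in the CxSM literature is shown below; sign and normalisation conventions vary between papers, and this form is not necessarily the one adopted in the study:

V = \frac{m^2}{2} H^\dagger H + \frac{\lambda}{4} (H^\dagger H)^2 + \frac{\delta_2}{2} H^\dagger H |S|^2 + \frac{b_2}{2} |S|^2 + \frac{d_2}{4} |S|^4 + \left( \frac{b_1}{4} S^2 + a_1 S + \text{c.c.} \right),

with H the Higgs doublet and S = (s + i a)/\sqrt{2} the complex singlet; which of the two phases (dark matter or broken) is realised depends on the vacuum expectation values acquired by s and a.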

Relevance:

30.00%

Abstract:

In our research we investigate the output accuracy of discrete event simulation models and agent based simulation models when studying human-centric complex systems. In this paper we focus on human reactive behaviour, as it is possible in both modelling approaches to implement human reactive behaviour in the model using standard methods. As a case study we have chosen the retail sector, and here in particular the operation of the fitting room in the womenswear department of a large UK department store. In our case study we looked at ways of determining the efficiency of implementing new management policies for the fitting room operation by modelling the reactive behaviour of staff and customers of the department. First, we carried out a validation experiment in which we compared the results from our models to the performance of the real system. This experiment also allowed us to establish differences in output accuracy between the two modelling methods. In a second step, a multi-scenario experiment was carried out to study the behaviour of the models when they are used for the purpose of operational improvement. Overall we have found that, for our case study example, both discrete event simulation and agent based simulation have the same potential to support the investigation into the efficiency of implementing new management policies.
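To make "reactive behaviour implemented with standard methods" concrete, here is a minimal process-oriented sketch in Python using SimPy; the balk threshold, service time, arrival rate and number of rooms are hypothetical and not taken from the case study:

import random
import simpy

BALK_THRESHOLD = 4       # hypothetical: customers walk away if the queue is this long
MEAN_FIT_TIME = 5.0      # hypothetical mean minutes spent in a fitting room
MEAN_INTERARRIVAL = 2.0  # hypothetical mean minutes between arrivals

def customer(env, room, stats):
    # Reactive rule: on arrival, balk if the queue looks too long.
    if len(room.queue) >= BALK_THRESHOLD:
        stats["balked"] += 1
        return
    with room.request() as req:
        yield req                                    # wait for a free room
        yield env.timeout(random.expovariate(1.0 / MEAN_FIT_TIME))
        stats["served"] += 1

def arrivals(env, room, stats):
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_INTERARRIVAL))
        env.process(customer(env, room, stats))

env = simpy.Environment()
fitting_rooms = simpy.Resource(env, capacity=3)      # hypothetical: 3 rooms
stats = {"served": 0, "balked": 0}
env.process(arrivals(env, fitting_rooms, stats))
env.run(until=480)                                   # one 8-hour trading day
print(stats)

The same balking rule could equally be expressed as part of a customer agent's decision step in an agent based model, which is why both approaches can capture this kind of reactive behaviour.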

Relevance:

30.00%

Abstract:

We estimate a dynamic model of mortgage default for a cohort of Colombian debtors between 1997 and 2004. We use the estimated model to study the effects on default of a class of policies that affected the evolution of mortgage balances in Colombia during the 1990s. We propose a framework for estimating dynamic behavioral models that accounts for the presence of unobserved state variables that are correlated across individuals and across time periods. We extend the standard literature on the structural estimation of dynamic models by incorporating an unobserved common correlated shock that affects all individuals' static payoffs and the dynamic continuation payoffs associated with different decisions. Given a standard parametric specification of the dynamic problem, we show that the aggregate shocks are identified from the variation in the observed aggregate behavior. The shocks and their transition are separately identified, provided there is enough cross-sectional variation of the observed states.
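In illustrative notation (not the paper's), the extension amounts to adding a common shock \eta_t to a standard dynamic discrete choice payoff:

u_{it}(d) = \bar u(x_{it}, d; \theta) + \gamma_d \, \eta_t + \varepsilon_{itd}, \qquad \eta_{t+1} \sim F(\,\cdot \mid \eta_t\,),

where x_{it} are observed individual states, \varepsilon_{itd} are idiosyncratic shocks, and \eta_t is unobserved but common to all individuals, so it also shifts the continuation values of the dynamic problem; identification then comes from how aggregate choice frequencies move over time.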