802 results for empirical shell model


Relevance: 40.00%

Publisher:

Abstract:

A customer is presumed to gravitate to a facility according to its distance and its attractiveness. However, when locating the facility, the presumption is that the customer opts for the shortest route to the nearest facility. This paradox was recently resolved by the introduction of the gravity p-median model. The model had not yet been implemented and tested empirically. We implemented the model in an empirical problem of locating locksmiths, vehicle inspections, and retail stores of vehicle spare parts, and we compared the solutions with those of the p-median model. We found the gravity p-median model to be of limited use for the problem of locating facilities, as it either gives solutions similar to those of the p-median model or gives unstable solutions due to a non-concave objective function.
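
As an illustration of the two objectives compared in this abstract, the sketch below contrasts the classical p-median objective (nearest-facility assignment) with a gravity-style objective under a Huff-type allocation. The exponential distance decay, the decay parameter, the attraction values, and the data are hypothetical choices for illustration, not the specification used in the study.

```python
import numpy as np

def p_median_objective(d, w, facilities):
    """Classical p-median: each customer uses the nearest open facility."""
    return float(np.sum(w * d[:, facilities].min(axis=1)))

def gravity_p_median_objective(d, w, facilities, attraction, beta=0.1):
    """Gravity-style objective: demand is split among open facilities with
    Huff-type probabilities proportional to attraction * exp(-beta * distance)."""
    sub = d[:, facilities]                          # distances to open facilities
    util = attraction[facilities] * np.exp(-beta * sub)
    prob = util / util.sum(axis=1, keepdims=True)   # patronage probabilities
    return float(np.sum(w[:, None] * prob * sub))   # expected weighted distance

# Hypothetical data: 5 customers, 4 candidate sites, sites {0, 2} open.
rng = np.random.default_rng(0)
d = rng.uniform(1, 10, size=(5, 4))   # customer-to-site distances
w = rng.uniform(1, 5, size=5)         # customer demand weights
a = np.ones(4)                        # equal attractiveness
open_sites = [0, 2]
print(p_median_objective(d, w, open_sites))
print(gravity_p_median_objective(d, w, open_sites, a))
```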

Relevance: 40.00%

Publisher:

Abstract:

We study the effects of population size in the Peck-Shell analysis of bank runs. We find that a contract featuring equal treatment for almost all depositors of the same type approximates the optimum. Because the approximation also satisfies Green-Lin incentive constraints when the planner discloses positions in the queue, welfare in these alternative specifications is sandwiched. Disclosure, however, is not needed, since our approximating contract is not subject to runs.

Relevance: 40.00%

Publisher:

Abstract:

In savannah and tropical grasslands, which account for 60% of grasslands worldwide, a large share of ecosystem carbon is located below ground due to high root:shoot ratios. Temporal variations in soil CO2 efflux (R_S) were investigated in a grassland of coastal Congo over two years. The objectives were (1) to identify the main factors controlling seasonal variations in R_S and (2) to develop a semi-empirical model describing R_S and including a heterotrophic component (R_H) and an autotrophic component (R_A). Plant above-ground activity was found to exert strong control over soil respiration, since 71% of seasonal R_S variability was explained by the quantity of photosynthetically active radiation absorbed (APAR) by the grass canopy. We tested an additive model including a parameter enabling R_S partitioning into R_A and R_H. Assumptions underlying this model were that R_A mainly depended on the amount of photosynthates allocated below ground and that microbial and root activity was mostly controlled by soil temperature and soil moisture. The model provided a reasonably good prediction of seasonal variations in R_S (R² = 0.85), which varied between 5.4 µmol m⁻² s⁻¹ in the wet season and 0.9 µmol m⁻² s⁻¹ at the end of the dry season. The model was subsequently used to obtain annual estimates of R_S, R_A and R_H. In accordance with results reported for other tropical grasslands, we estimated that R_H accounted for 44% of R_S, which represented a flux similar to the amount of carbon brought annually to the soil from below-ground litter production. Overall, this study opens up prospects for simulating the carbon budget of tropical grasslands on a large scale using remotely sensed data. © 2012 Elsevier B.V. All rights reserved.
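
A minimal sketch of an additive R_S = R_A + R_H model of the kind described above is given below, assuming R_A proportional to APAR and R_H driven by a Q10 temperature response and a saturating moisture term; the functional forms and all parameter values are hypothetical and are not those fitted in the study.

```python
def soil_respiration(apar, t_soil, w_soil, a=0.004, r0=1.0, q10=2.0, k=1.5):
    """Additive semi-empirical sketch: R_S = R_A + R_H.

    R_A is taken proportional to absorbed PAR (a proxy for photosynthate
    supply below ground); R_H follows a Q10 temperature response scaled by
    a saturating moisture term. Forms and parameters are illustrative only.
    """
    r_a = a * apar                               # autotrophic component
    f_t = q10 ** ((t_soil - 25.0) / 10.0)        # temperature response
    f_w = w_soil / (w_soil + k)                  # moisture limitation (0-1)
    r_h = r0 * f_t * f_w                         # heterotrophic component
    return r_a + r_h

# Hypothetical wet-season vs. end-of-dry-season conditions (illustrative units only).
print(soil_respiration(apar=900.0, t_soil=27.0, w_soil=12.0))
print(soil_respiration(apar=100.0, t_soil=30.0, w_soil=2.0))
```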

Relevance: 40.00%

Publisher:

Abstract:

A detailed characterization of an X-ray Si(Li) detector was performed to obtain the energy dependence of its efficiency in the photon energy range of 6.4-59.5 keV, which was measured and reproduced by Monte Carlo (MC) simulations. Significant discrepancies between MC and experimental values were found when the manufacturer's parameters for the detector were used in the simulation. A complete Computerized Tomography (CT) scan of the detector made it possible to determine the correct crystal dimensions and position inside the capsule. The efficiencies computed with the resulting detector model differed from the measured values by no more than 10% over most of the energy range.

Relevance: 40.00%

Publisher:

Abstract:

Background: The reduction in the amount of food available for European avian scavengers as a consequence of restrictive public health policies is a concern for managers and conservationists. Since 2002, the application of several sanitary regulations has limited the availability of feeding resources provided by domestic carcasses, but theoretical studies assessing whether the food resources provided by wild ungulates are enough to cover energetic requirements are lacking. Methodology/Findings: We assessed the food provided by a wild ungulate population in two areas of NE Spain inhabited by three vulture species and developed a P System computational model to assess the effects of the carrion resources provided on their population dynamics. We compared the real population trend with a hypothetical scenario in which only food provided by wild ungulates was available. Simulation testing of the model suggests that wild ungulates constitute an important food resource in the Pyrenees, and that the vulture population inhabiting this area could grow even if only the food provided by wild ungulates were available. In contrast, in the Pre-Pyrenees there is insufficient food to cover the energy requirements of the avian scavenger guild, which would decline sharply if biomass from domestic animals were not available. Conclusions/Significance: Our results suggest that public health legislation can modify scavenger population trends if a large number of domestic ungulate carcasses disappear from the mountains. In this case, food provided by wild ungulates might not be enough, and supplementary feeding could be necessary if other alternative food resources are not available (i.e. the reintroduction of wild ungulates), preferably in European Mediterranean areas sharing similar socio-economic conditions where densities of wild ungulates are low. Managers should anticipate the conservation actions required by assessing food availability under the possible scenarios in order to make the most suitable decisions.

Relevance: 40.00%

Publisher:

Abstract:

Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a 'Simple Committee' technique that used averaged predictions from a set of 10 pre-selected input spaces chosen using the training data, and a 'Minimum Variance Committee' technique in which the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. This latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (Best Combination Technique), the Simple Committee Technique and the Minimum Variance Committee Technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or the calibration of the underlying GT-Power model.
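
To make the 'Simple Committee' idea concrete, here is a minimal sketch in which a single empirical method (a k-nearest neighbor regressor standing in for any of the three methods) is fitted in several transformed input spaces and its predictions are averaged. The toy data, the number of input spaces, and the choice of regressor are assumptions for illustration, not the GT-Power-derived spaces used in the study.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def simple_committee_predict(model, input_spaces, x_new_per_space):
    """'Simple Committee' sketch: average one method's predictions over several
    pre-selected transformed input spaces.

    input_spaces    : list of (X_train, y_train) pairs, one per transformation
    x_new_per_space : list of query arrays, one per transformation
    """
    preds = []
    for (X_tr, y_tr), x_new in zip(input_spaces, x_new_per_space):
        fitted = model.fit(X_tr, y_tr)       # refit in each input space
        preds.append(fitted.predict(x_new))
    return np.mean(preds, axis=0)            # committee average

# Hypothetical example with two toy input spaces and a k-NN regressor.
rng = np.random.default_rng(1)
spaces, queries = [], []
for _ in range(2):
    X = rng.normal(size=(50, 3))
    y = X[:, 0] * 2.0 + rng.normal(scale=0.1, size=50)
    spaces.append((X, y))
    queries.append(rng.normal(size=(5, 3)))
print(simple_committee_predict(KNeighborsRegressor(n_neighbors=5), spaces, queries))
```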

Relevance: 40.00%

Publisher:

Abstract:

Stemmatology, or the reconstruction of the transmission history of texts, is a field that stands particularly to gain from digital methods. Many scholars already take stemmatic approaches that rely heavily on computational analysis of the collated text (e.g. Robinson and O'Hara 1996; Salemans 2000; Heikkilä 2005; Windram et al. 2008 among many others). Although there is great value in computationally assisted stemmatology, providing as it does a reproducible result and allowing access to the relevant methodological process in related fields such as evolutionary biology, computational stemmatics is not without its critics. The current state of the art effectively forces scholars to choose between making a preconceived judgment of the significance of textual differences (the Lachmannian or neo-Lachmannian approach, and the weighted phylogenetic approach) and making no judgment at all (the unweighted phylogenetic approach). Some basis for judgment of the significance of variation is sorely needed for medieval text criticism in particular. By this, we mean that there is a need for a statistical empirical profile of the text-genealogical significance of the different sorts of variation in different sorts of medieval texts. The rules that apply to copies of Greek and Latin classics may not apply to copies of medieval Dutch story collections; the practices of copying authoritative texts such as the Bible will most likely have been different from the practices of copying the Lives of local saints and other commonly adapted texts. It is nevertheless imperative that we have a consistent, flexible, and analytically tractable model for capturing these phenomena of transmission. In this article, we present a computational model that captures most of the phenomena of text variation, and a method for analysis of one or more stemma hypotheses against the variation model. We apply this method to three 'artificial traditions' (i.e. texts copied under laboratory conditions by scholars to study the properties of text variation) and four genuine medieval traditions whose transmission history is known or deduced in varying degrees. Although our findings are necessarily limited by the small number of texts at our disposal, we demonstrate here some of the wide variety of calculations that can be made using our model. Certain of our results call sharply into question the utility of excluding 'trivial' variation such as orthographic and spelling changes from stemmatic analysis.

Relevance: 40.00%

Publisher:

Abstract:

The use of biomarkers to infer drug response in patients is being actively pursued, yet significant challenges with this approach, including the complicated interconnection of pathways, have limited its application. Direct empirical testing of tumor sensitivity would arguably provide a more reliable predictive value, although it has garnered little attention largely due to the technical difficulties associated with this approach. We hypothesize that the application of recently developed microtechnologies, coupled to more complex 3-dimensional cell cultures, could provide a model to address some of these issues. As a proof of concept, we developed a microfluidic device where spheroids of the serous epithelial ovarian cancer cell line TOV112D are entrapped and assayed for their chemoresponse to carboplatin and paclitaxel, two therapeutic agents routinely used for the treatment of ovarian cancer. In order to index the chemoresponse, we analyzed the spatiotemporal evolution of the mortality fraction, as judged by vital dyes and confocal microscopy, within spheroids subjected to different drug concentrations and treatment durations inside the microfluidic device. To reflect microenvironment effects, we tested the effect of exogenous extracellular matrix and serum supplementation during spheroid formation on their chemotherapeutic response. Spheroids displayed augmented chemoresistance in comparison to monolayer culturing. This resistance was further increased by the simultaneous presence of both extracellular matrix and high serum concentration during spheroid formation. Following exposure to chemotherapeutics, cell death profiles were not uniform throughout the spheroid. The highest cell death fraction was found at the center of the spheroid and the lowest at the periphery. Collectively, the results demonstrate the validity of the approach, and provide the basis for further investigation of chemotherapeutic responses in ovarian cancer using microfluidics technology. In the future, such microdevices could provide the framework to assay drug sensitivity in a timeframe suitable for clinical decision making.

Relevance: 40.00%

Publisher:

Abstract:

Objective. To measure the demand for primary care and its associated factors by building and estimating a demand model of primary care in urban settings. Data source. Secondary data from the 2005 California Health Interview Survey (CHIS 2005), a population-based random-digit-dial telephone survey conducted by the UCLA Center for Health Policy Research in collaboration with the California Department of Health Services and the Public Health Institute between July 2005 and April 2006. Study design. A literature review was done to specify the demand model by identifying relevant predictors and indicators. CHIS 2005 data were utilized for demand estimation. Analytical methods. Probit regression was used to estimate the use/non-use equation, and negative binomial regression was applied to the utilization equation with its non-negative integer dependent variable. Results. The model included two equations, in which the use/non-use equation explained the probability of making a doctor visit in the past twelve months and the utilization equation estimated the demand for primary care conditional on at least one visit. Among the independent variables, wage rate and income did not affect primary care demand, whereas age had a negative effect on demand. People with college and graduate educational levels were associated with 1.03 (p < 0.05) and 1.58 (p < 0.01) more visits, respectively, compared to those with no formal education. Insurance was significantly and positively related to the demand for primary care (p < 0.01). Need-for-care variables exhibited positive effects on demand (p < 0.01): existence of a chronic disease was associated with 0.63 more visits, disability status was associated with 1.05 more visits, and people with poor health status had 4.24 more visits than those with excellent health status. Conclusions. The average probability of visiting a doctor in the past twelve months was 85% and the average number of visits was 3.45. The study emphasized the importance of need variables in explaining healthcare utilization, as well as the impact of insurance, employment and education on demand. The two-equation model of decision-making, estimated with probit and negative binomial regression methods, was a useful approach to demand estimation for primary care in urban settings.
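
The two-equation structure described above can be sketched as follows, with a probit use/non-use equation and a negative binomial utilization equation fitted on simulated data; the variable names, coefficients, and data-generating process are hypothetical stand-ins for the CHIS 2005 variables rather than the study's actual specification.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical micro-data standing in for CHIS 2005: age, education (years),
# insurance status, a chronic-condition flag, and observed doctor visits.
rng = np.random.default_rng(42)
n = 2000
age = rng.uniform(18, 85, n)
educ = rng.integers(0, 20, n)
insured = rng.integers(0, 2, n)
chronic = rng.integers(0, 2, n)
X = sm.add_constant(np.column_stack([age, educ, insured, chronic]))
visits = rng.poisson(lam=np.exp(0.2 + 0.03 * educ + 0.4 * insured
                                + 0.5 * chronic - 0.005 * age))

# Part 1: probit for any use vs. no use in the past twelve months.
any_use = (visits > 0).astype(int)
probit_res = sm.Probit(any_use, X).fit(disp=0)

# Part 2: negative binomial for the number of visits, conditional on at least one.
users = visits > 0
nb_res = sm.NegativeBinomial(visits[users], X[users]).fit(disp=0)

print(probit_res.params)
print(nb_res.params)
```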

Relevance: 40.00%

Publisher:

Abstract:

EOT11a is a global Empirical Ocean Tide model derived in 2011 by residual analysis of multi-mission satellite altimeter data. EOT11a includes amplitudes and phases of the main astronomical tides M2, S2, N2, K2, 2N2, O1, K1, P1, and Q1, the non-linear constituent M4, the long-period tides Mm and Mf, and the radiational tide S1. Ocean tides as well as loading tides are provided. EOT11a was computed by means of residual tidal analysis of multi-mission altimeter data from TOPEX/Poseidon, ERS-2, ENVISAT, and Jason-1/2 acquired between September 1992 and April 2010. The 7.5' x 7.5' resolution is identical to that of FES2004, which was used as the reference model for the residual tide analysis. The development of EOT11a was funded by the Deutsche Forschungsgemeinschaft (DFG) under grant BO1228/6-2.
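
Gridded constituent models such as EOT11a are typically applied by interpolating amplitude and phase for each constituent at a location and synthesizing the tide as a sum of cosines. The sketch below shows only that synthesis step, using standard constituent angular speeds but placeholder amplitudes and phases, and omitting nodal corrections and astronomical arguments.

```python
import numpy as np

# Angular speeds of a few constituents in degrees per hour (standard values);
# the amplitudes (m) and phase lags (deg) below are placeholders, not EOT11a values.
constituents = {
    #        speed (deg/h)  amplitude (m)  phase (deg)
    "M2": (28.9841042, 0.50, 120.0),
    "S2": (30.0000000, 0.20, 150.0),
    "K1": (15.0410686, 0.10,  60.0),
    "O1": (13.9430356, 0.08,  40.0),
}

def tide_height(hours):
    """Harmonic synthesis h(t) = sum_k A_k * cos(omega_k * t - phi_k),
    with t in hours since an arbitrary reference epoch."""
    t = np.asarray(hours, dtype=float)
    h = np.zeros_like(t)
    for speed, amp, phase in constituents.values():
        h += amp * np.cos(np.radians(speed * t - phase))
    return h

# Predicted height over one day at hourly steps.
print(tide_height(np.arange(0, 25)))
```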

Relevance: 40.00%

Publisher:

Abstract:

Research on the impact that work instability has on workers has the limitation of assessing the relations among different variables separately, without examining the possible mediation relationships that may exist between them. The aim of this article is to test a conceptual model of the mediating relations between uneasiness due to work instability and its psychological impact, within the framework of interactive stress theory, by conducting a path analysis. A total of 191 workers participated in the study, with a mean age of 31 years (SD = 11). Results showed that the proposed model did not fit the data. Alternative models were explored, consistent with the original conceptual model and the empirical evidence. A new causal model is proposed, with Uneasiness due to Work Instability as the independent variable, Personal Strain and Personal Resources as intervening variables, and Anger, Hopelessness, and Satisfaction as dependent variables. The theoretical and empirical importance of the resulting model is discussed.
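
A single mediation path of the kind tested in such a model can be sketched with two regressions whose coefficients give the indirect (a*b) and direct (c') effects. The simulated data and the subset of variables used below (Uneasiness -> Personal Strain -> Anger) are illustrative assumptions, not the authors' full path model.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data illustrating one mediation path:
# Uneasiness due to Work Instability -> Personal Strain -> Anger.
rng = np.random.default_rng(7)
n = 191
uneasiness = rng.normal(size=n)
strain = 0.6 * uneasiness + rng.normal(scale=0.8, size=n)
anger = 0.5 * strain + 0.1 * uneasiness + rng.normal(scale=0.8, size=n)

# Path a: independent variable -> mediator.
a_path = sm.OLS(strain, sm.add_constant(uneasiness)).fit()
# Paths b and c': mediator and independent variable -> outcome.
b_path = sm.OLS(anger, sm.add_constant(np.column_stack([strain, uneasiness]))).fit()

indirect = a_path.params[1] * b_path.params[1]   # a * b (mediated effect)
direct = b_path.params[2]                        # c' (direct effect)
print("indirect effect:", round(indirect, 3), "direct effect:", round(direct, 3))
```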