963 results for integer disaggregation
Abstract:
We calculate the anomalous dimensions of operators with large global charge J in certain strongly coupled conformal field theories in three dimensions, such as the O(2) model and the supersymmetric fixed point with a single chiral superfield and a W = Φ³ superpotential. Working in a 1/J expansion, we find that the large-J sector of both examples is controlled by a conformally invariant effective Lagrangian for a Goldstone boson of the global symmetry. For both these theories, we find that the lowest state with charge J is always a scalar operator whose dimension Δ_J satisfies the sum rule J²Δ_J − (J²/2 + J/4 + 3/16)Δ_{J−1} − (J²/2 − J/4 + 3/16)Δ_{J+1} = 0.04067, up to corrections that vanish at large J. The spectrum of low-lying excited states is also calculable explicitly: for example, the second-lowest primary operator has spin two and dimension Δ_J + √3. In the supersymmetric case, the dimensions of all half-integer-spin operators lie above the dimensions of the integer-spin operators by a gap of order J^{1/2}. The propagation speeds of the Goldstone waves and heavy fermions are 1/√2 and ±1/2 times the speed of light, respectively. These values, including the negative one, are necessary for the consistent realization of the superconformal symmetry at large J.
Abstract:
Libraries of learning objects may serve as a basis for deriving course offerings that are customized to the needs of different learning communities or even individuals. Several ways of organizing this course composition process are discussed. Course composition needs a clear understanding of the dependencies between the learning objects. We therefore discuss the metadata for object relationships proposed in different standardization projects, especially those suggested in the Dublin Core Metadata Initiative. Based on these metadata we construct adjacency matrices and graphs, and we show how Gozinto-type computations can be used to determine direct and indirect prerequisites for certain learning objects. The metadata may also be used to define integer programming models which can support the instructor in formulating specifications for selecting objects, or which allow a computer agent to select learning objects automatically. Such decision models could also be helpful for a learner navigating through a library of learning objects. We also sketch a graph-based procedure for manual or automatic sequencing of the learning objects.
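As a minimal sketch of the Gozinto-type computation described above (assuming an acyclic "requires" relation encoded in a 0/1 adjacency matrix; the learning objects and matrix below are hypothetical), direct and indirect prerequisites can be read off (I − A)^(-1):

```python
import numpy as np

# Hypothetical library of 5 learning objects; A[i, j] = 1 means object j
# is a direct prerequisite of object i (a "Requires" relationship).
A = np.array([
    [0, 0, 0, 0, 0],   # LO0: introduction, no prerequisites
    [1, 0, 0, 0, 0],   # LO1 requires LO0
    [1, 0, 0, 0, 0],   # LO2 requires LO0
    [0, 1, 1, 0, 0],   # LO3 requires LO1 and LO2
    [0, 0, 0, 1, 0],   # LO4 requires LO3
], dtype=float)

# Gozinto-type computation: for an acyclic prerequisite graph the series
# I + A + A^2 + ... converges to (I - A)^{-1}; entry (i, j) counts the
# number of distinct prerequisite paths leading from object i back to j.
n = A.shape[0]
total = np.linalg.inv(np.eye(n) - A) - np.eye(n)

# Any positive entry marks a direct or indirect prerequisite.
prereq = total > 0
for i in range(n):
    deps = [f"LO{j}" for j in range(n) if prereq[i, j]]
    print(f"LO{i} requires: {deps or 'nothing'}")
```

The positive entries of this total-requirements matrix are exactly the prerequisites an instructor (or a selection agent) would have to include when composing a course around a chosen object.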
Abstract:
Music by Carl Maria von Weber. Poem by Friedrich Kind. Versified by A. Wendt after Friedrich Kind's Jubel Cantate.
Abstract:
Public preferences for policy are formed in a little-understood process that is not adequately described by traditional economic theory of choice. In this paper I suggest that U.S. aggregate support for health reform can be modeled as tradeoffs among a small number of behavioral values and the stage of policy development. The theory underlying the model is based on Samuelson et al.'s (1986) work and Wilke's (1991) elaboration of it as the Greed/Efficiency/Fairness (GEF) hypothesis of motivation in the management of resource dilemmas, and on behavioral economics informed by Kahneman and Thaler's prospect theory. The model developed in this paper employs ordered probit econometric techniques applied to data derived from U.S. polls taken from 1990 to mid-2003 that measured support for health reform proposals. Outcome data are four-tiered Likert counts; independent variables are dummies representing the presence or absence of operationalizations of each behavioral variable, along with an integer representing the policy process stage. Marginal effects of each independent variable predict how support levels change when that variable is triggered. Model estimation results indicate a vanishingly small likelihood that all coefficients are zero, and all variables have the signs expected from model theory. Three hypotheses were tested: support will drain from health reform policy as it becomes increasingly well-articulated and approaches enactment; reforms appealing to fairness through universal health coverage will enjoy a higher degree of support than those targeted more narrowly; and health reforms calling for government operation of the health finance system will achieve lower support than those that do not. Model results support the first and last hypotheses. Contrary to expectations, universal health care proposals did not provide incremental support beyond those targeted to "deserving" populations (children, the elderly, working families). In addition, loss of autonomy (e.g. restrictions on choice of caregiver) is found to be the "third rail" of health reform, with significantly reduced support. When applied to a hypothetical health reform in which an employer-mandated Medical Savings Account policy is the centerpiece, the model predicts support that may be insufficient for enactment. These results indicate that the method developed in the paper may prove valuable to health policy designers.
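A hedged sketch of the kind of ordered probit described above, using statsmodels' OrderedModel on simulated data; the variable names, data, and the toggled-dummy comparison below are illustrative stand-ins for the paper's poll data and marginal-effect calculations, not a reproduction of them:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical poll-level data: a four-tier Likert support score, dummies for
# behavioral triggers, and an integer policy-process stage (names illustrative).
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "fairness_universal": rng.integers(0, 2, n),   # appeals to universal coverage
    "gov_operation": rng.integers(0, 2, n),        # government-run finance system
    "autonomy_loss": rng.integers(0, 2, n),        # restricts choice of caregiver
    "policy_stage": rng.integers(1, 5, n),         # 1 = idea ... 4 = near enactment
})
latent = (0.4 * df.fairness_universal - 0.5 * df.gov_operation
          - 0.8 * df.autonomy_loss - 0.3 * df.policy_stage + rng.normal(size=n))
df["support"] = pd.Categorical(
    pd.cut(latent, bins=[-np.inf, -1.5, -0.5, 0.5, np.inf], labels=False),
    ordered=True)                                  # 0..3 Likert tiers

exog = df[["fairness_universal", "gov_operation", "autonomy_loss", "policy_stage"]]
res = OrderedModel(df["support"], exog, distr="probit").fit(method="bfgs", disp=False)
print(res.summary())

# Stand-in for marginal effects: change in predicted tier probabilities when the
# autonomy-loss trigger is switched on, other regressors held at their means.
base = exog.mean().to_frame().T
on, off = base.copy(), base.copy()
on["autonomy_loss"], off["autonomy_loss"] = 1, 0
delta = np.asarray(res.predict(on)) - np.asarray(res.predict(off))
print("Change in P(support tier 0..3):", delta.round(3))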
Abstract:
Digital terrain models (DTMs) typically contain large numbers of postings, from hundreds of thousands to billions. Many algorithms that run on DTMs require topological knowledge of the postings, such as finding nearest neighbors or finding the posting closest to a chosen location. If the postings are arranged irregularly, topological information is costly to compute and to store. This paper offers a practical approach to organizing and searching irregularly-spaced data sets by presenting a collection of efficient algorithms (O(N), O(lg N)) that compute important topological relationships with only a simple supporting data structure. These relationships include finding the postings within a window, locating the posting nearest a point of interest, finding the neighborhood of postings nearest a point of interest, and ordering the neighborhood counter-clockwise. These algorithms depend only on two sorted arrays of two-element tuples, each holding a planimetric coordinate and an integer identification number indicating which posting the coordinate belongs to. There is one array for each planimetric coordinate (eastings and northings). These two arrays cost minimal overhead to create and store but permit the data to remain arranged irregularly.
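A minimal sketch of the supporting data structure described above, namely two sorted arrays of (coordinate, id) tuples, with a window query and an expanding-window nearest-posting search; the postings and thresholds are hypothetical and the code is only illustrative of the idea, not the paper's algorithms:

```python
import bisect
import math

# Hypothetical irregular postings: (easting, northing), indexed by integer id.
postings = [(102.3, 5.1), (100.7, 7.9), (104.9, 6.2), (101.2, 9.4), (103.8, 8.8)]

# One sorted array of (coordinate, id) tuples per planimetric axis.
by_e = sorted((e, i) for i, (e, n) in enumerate(postings))
by_n = sorted((n, i) for i, (e, n) in enumerate(postings))

def ids_in_range(sorted_arr, lo, hi):
    """Ids whose coordinate lies in [lo, hi], via O(lg N) binary search."""
    left = bisect.bisect_left(sorted_arr, (lo, -1))
    right = bisect.bisect_right(sorted_arr, (hi, len(postings)))
    return {i for _, i in sorted_arr[left:right]}

def window(e_min, e_max, n_min, n_max):
    """Postings inside a rectangular window: intersect the two axis ranges."""
    return ids_in_range(by_e, e_min, e_max) & ids_in_range(by_n, n_min, n_max)

def nearest(e, n, half=2.0):
    """Posting nearest a point of interest, via an expanding window search."""
    while True:
        cand = window(e - half, e + half, n - half, n + half)
        if cand:
            best = min(cand, key=lambda i: math.dist(postings[i], (e, n)))
            # Accept only if the best candidate is within the window half-width;
            # otherwise a closer posting could still lie just outside the window.
            if math.dist(postings[best], (e, n)) <= half:
                return best
        half *= 2.0

print(window(100.5, 104.0, 6.0, 9.5))   # {1, 3, 4}: ids inside the window
print(nearest(102.0, 8.0))              # 1: id of the closest posting
```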
Abstract:
A characterization of a property of binary relations is of finite type if it is stated in terms of ordered T-tuples of alternatives for some positive integer T. A characterization of finite type can be used to determine in polynomial time whether a binary relation over a finite set has the property characterized. Unfortunately, Pareto representability in R^2 has no characterization of finite type (Knoblauch, 2002). This result is generalized below to R^l for l larger than 2. The method of proof is applied to other properties of binary relations.
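As an illustration of why a characterization of finite type yields a polynomial-time test (using transitivity as a stand-in T = 3 property, not the paper's), a brute-force check over all ordered T-tuples costs O(|X|^T) relation lookups:

```python
from itertools import product

def has_finite_type_property(X, R, condition, T):
    """True iff the condition holds for every ordered T-tuple over X.
    For a finite set X this is an O(|X|**T) polynomial-time check."""
    return all(condition(R, t) for t in product(X, repeat=T))

# Illustrative T = 3 condition (not the paper's): transitivity of R.
def transitive_condition(R, t):
    a, b, c = t
    return not ((a, b) in R and (b, c) in R) or (a, c) in R

X = {1, 2, 3}
R = {(1, 2), (2, 3), (1, 3)}                                       # transitive
print(has_finite_type_property(X, R, transitive_condition, T=3))             # True
print(has_finite_type_property(X, R - {(1, 3)}, transitive_condition, T=3))  # False
```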
Abstract:
Objective. To measure the demand for primary care and its associated factors by building and estimating a demand model of primary care in urban settings. Data source. Secondary data from the 2005 California Health Interview Survey (CHIS 2005), a population-based random-digit-dial telephone survey conducted by the UCLA Center for Health Policy Research in collaboration with the California Department of Health Services and the Public Health Institute between July 2005 and April 2006. Study design. A literature review was done to specify the demand model by identifying relevant predictors and indicators. CHIS 2005 data were utilized for demand estimation. Analytical methods. Probit regression was used to estimate the use/non-use equation, and negative binomial regression was applied to the utilization equation with its non-negative integer dependent variable. Results. The model included two equations: the use/non-use equation explained the probability of making a doctor visit in the past twelve months, and the utilization equation estimated the demand for primary care conditional on at least one visit. Among the independent variables, wage rate and income did not affect primary care demand, whereas age had a negative effect on demand. People with college and graduate educational levels were associated with 1.03 (p < 0.05) and 1.58 (p < 0.01) more visits, respectively, compared to those with no formal education. Insurance was significantly and positively related to the demand for primary care (p < 0.01). Need-for-care variables exhibited positive effects on demand (p < 0.01): existence of chronic disease was associated with 0.63 more visits, disability status was associated with 1.05 more visits, and people with poor health status had 4.24 more visits than those with excellent health status. Conclusions. The average probability of visiting doctors in the past twelve months was 85% and the average number of visits was 3.45. The study emphasized the importance of need variables in explaining healthcare utilization, as well as the impact of insurance, employment and education on demand. The two-equation model of decision-making, estimated with probit and negative binomial regressions, was a useful approach to demand estimation for primary care in urban settings.
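A hedged sketch of the two-equation approach described above, with a probit for use/non-use and a negative binomial for visit counts among users, using statsmodels on simulated data; variable names, coefficients and data are illustrative only, not the CHIS 2005 estimates:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey extract standing in for CHIS 2005 (names illustrative).
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(18, 85, n),
    "college": rng.integers(0, 2, n),
    "insured": rng.integers(0, 2, n),
    "chronic": rng.integers(0, 2, n),
})
# Simulated behavior: insurance and chronic disease raise use and visit counts.
p_use = 1 / (1 + np.exp(-(0.2 + 0.8 * df.insured + 1.0 * df.chronic - 0.01 * df.age)))
use = rng.binomial(1, p_use)
mu = np.exp(0.5 + 0.3 * df.insured + 0.6 * df.chronic)
df["any_visit"] = use
df["visits"] = use * rng.poisson(mu)          # zero visits for non-users

X = sm.add_constant(df[["age", "college", "insured", "chronic"]])

# Equation 1: probit for the probability of at least one visit in the year.
probit_res = sm.Probit(df["any_visit"], X).fit(disp=False)

# Equation 2: negative binomial for visit counts, conditional on use.
users = df["any_visit"] == 1
nb_res = sm.NegativeBinomial(df.loc[users, "visits"], X.loc[users]).fit(disp=False)

print(probit_res.summary())
print(nb_res.summary())
```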
Abstract:
IBAMar (http://www.ba.ieo.es/ibamar) is a regional database that brings together all physical and biochemical data obtained by multiparametric probes (CTDs equipped with different sensors) during the cruises managed by the Balearic Center of the Spanish Institute of Oceanography (COB-IEO). It has recently been extended to include data obtained with classical hydro casts using oceanographic Niskin or Nansen bottles. The result is a database that includes a main core of hydrographic data: temperature (T), salinity (S), dissolved oxygen (DO), fluorescence and turbidity; complemented by biochemical data: dissolved inorganic nutrients (phosphate, nitrate, nitrite and silicate) and chlorophyll-a. Different technologies and methodologies were used by different teams over the four decades of data sampling at the COB-IEO. Despite this, the data have been reprocessed using the same protocols, and a standard QC has been applied to each variable, so the result is a regional database of homogeneous, good-quality data. Data acquisition and quality control (QC): 94% of the data come from Sbe911 and Sbe25 CTDs. S and DO were calibrated on board using water samples whenever a rosette sampler was available (70% of the cases). All data from SeaBird CTDs were reviewed and post-processed with the software provided by Sea-Bird Electronics, and averaged to 1 dbar vertical resolution. The general sampling methodology and pre-processing are described at https://ibamardatabase.wordpress.com/home/. Manual QC includes visual checks of metadata, duplicate data and outliers. Automatic QC includes a range check of variables by area (north of the Balearic Islands, south of the BI, and the Alboran Sea) and depth (27 standard levels), a check for spikes and a check for density inversions. Nutrient QC includes a preliminary control and a range check on the observed level of the data to detect outliers around objectively analyzed data fields. A quality flag is assigned as an integer number, depending on the result of the QC check.
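A minimal sketch of the kind of automatic QC steps described above (range check, spike check, density-inversion check) assigning integer quality flags per level; the flag codes, thresholds and toy profile are assumptions for illustration, not IBAMar's actual conventions:

```python
import numpy as np

# Illustrative integer flag convention (assumed, not IBAMar's actual scheme):
# 1 = good, 3 = suspect (spike), 4 = bad (out of range or density inversion).
GOOD, SUSPECT, BAD = 1, 3, 4

def qc_profile(values, density, valid_range, spike_threshold):
    """Assign one integer QC flag per level of a vertical profile."""
    values = np.asarray(values, dtype=float)
    flags = np.full(values.shape, GOOD, dtype=int)

    # Range check against limits defined per region and standard depth level.
    lo, hi = valid_range
    flags[(values < lo) | (values > hi)] = BAD

    # Spike check: a point deviating strongly from the mean of its neighbours.
    mid = flags[1:-1]
    spike = np.abs(values[1:-1] - 0.5 * (values[:-2] + values[2:]))
    mask = spike > spike_threshold
    mid[mask] = np.maximum(mid[mask], SUSPECT)

    # Density-inversion check: potential density should not decrease with depth.
    inversion = np.diff(np.asarray(density, dtype=float)) < 0
    flags[1:][inversion] = BAD
    return flags

# Toy temperature profile (degC) with a spike, plus a density inversion.
temp = [14.2, 14.1, 19.5, 13.8, 13.7, 13.6]
dens = [27.1, 27.2, 27.3, 27.25, 27.4, 27.5]
print(qc_profile(temp, dens, valid_range=(5.0, 30.0), spike_threshold=2.0))
```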
Abstract:
Ocean Drilling Program (ODP) Leg 210 is one of very few deep-sea legs drilled along the eastern Canadian continental margin. Most other drilling on this margin has been carried out by the petroleum industry on the shallow-water regions of the Scotian shelf and the Grand Banks (see Doeven, 1983, for nannofossil studies). Deep Sea Drilling Project (DSDP) Leg 12 Site 111 and ODP Leg 105 Site 647 were drilled in the general vicinity of Leg 210 but recovered no appreciable Lower Cretaceous (Albian-Cenomanian) sediments. Site 111 yielded indurated limestones dated tentatively as late Albian-early Cenomanian, whereas Site 647 encountered no Albian-Cenomanian sediments. Two sites (Sites 1276 and 1277) were drilled during Leg 210 in the Newfoundland Basin with the primary objective of recovering basement rocks to elucidate the rifting history of the North Atlantic Basin. The location for Leg 210 was selected because it is conjugate to the Iberia margin, which was drilled extensively during DSDP/ODP Legs 47B, 103, 149, and 173. A secondary but equally important objective was to recover the overlying sediments with the purpose of studying the postrift sedimentation history of this margin.
Abstract:
A new technique for the harmonic analysis of current observations is described. It consists of applying a linear band-pass filter which separates the various species and removes the contribution of non-tidal effects at intertidal frequencies. The tidal constituents are then evaluated through the method of least squares. In spite of the narrowness of the filter, only three days of data are lost through the filtering procedure, and the only requirement on the data is that the time interval between samples be an integer fraction of one day. This technique is illustrated through the analysis of a few French current observations from the English Channel within the framework of INOUT. The characteristics of the main tidal constituents are given.
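A minimal sketch of the least-squares step described above, fitting cosine/sine pairs at known constituent frequencies to a current series sampled hourly (an integer fraction of a day); the band-pass filtering stage is omitted, the series is synthetic, and the constituent list is merely illustrative:

```python
import numpy as np

# Tidal constituent frequencies in cycles per hour (standard periods in hours).
FREQS = {"M2": 1 / 12.4206012, "S2": 1 / 12.0, "O1": 1 / 25.8193417}

def harmonic_fit(t_hours, series, freqs):
    """Least-squares amplitude and phase of each tidal constituent."""
    cols = [np.ones_like(t_hours)]
    for f in freqs.values():
        w = 2 * np.pi * f
        cols += [np.cos(w * t_hours), np.sin(w * t_hours)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, series, rcond=None)
    out = {}
    for k, name in enumerate(freqs):
        a, b = coef[1 + 2 * k], coef[2 + 2 * k]
        out[name] = (np.hypot(a, b), np.degrees(np.arctan2(b, a)))  # amp, phase
    return out

# Synthetic current component sampled every hour over 29 days.
t = np.arange(0, 29 * 24, 1.0)
u = (0.50 * np.cos(2 * np.pi * FREQS["M2"] * t - np.radians(40))
     + 0.15 * np.cos(2 * np.pi * FREQS["S2"] * t - np.radians(10)))
print(harmonic_fit(t, u, FREQS))   # recovers ~0.50 at ~40 deg for M2, etc.
```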
Abstract:
We present the first high-resolution (500 m × 500 m) gridded methane (CH4) emission inventory for Switzerland, which integrates the national emission totals reported to the United Nations Framework Convention on Climate Change (UNFCCC) and recent CH4 flux studies conducted by research groups across Switzerland. In addition to anthropogenic emissions, we also include natural and semi-natural CH4 fluxes, i.e., emissions from lakes and reservoirs, wetlands, wild animals as well as uptake by forest soils. National CH4 emissions were disaggregated using detailed geostatistical information on source locations and their spatial extent and process- or area-specific emission factors. In Switzerland, the highest CH4 emissions in 2011 originated from the agricultural sector (150 Gg CH4/yr), mainly produced by ruminants and manure management, followed by emissions from waste management (15 Gg CH4/yr) mainly from landfills and the energy sector (12 Gg CH4/yr), which was dominated by emissions from natural gas distribution. Compared to the anthropogenic sources, emissions from natural and semi-natural sources were relatively small (6 Gg CH4/yr), making up only 3 % of the total emissions in Switzerland. CH4 fluxes from agricultural soils were estimated to be not significantly different from zero (between -1.5 and 0 Gg CH4/yr), while forest soils are a CH4 sink (approx. -2.8 Gg CH4/yr), partially offsetting other natural emissions. Estimates of uncertainties are provided for the different sources, including an estimate of spatial disaggregation errors deduced from a comparison with a global (EDGAR v4.2) and a European CH4 inventory (TNO/MACC). This new spatially-explicit emission inventory for Switzerland will provide valuable input for regional scale atmospheric modeling and inverse source estimation.
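A minimal sketch of the proportional disaggregation step described above: a national sector total is distributed over grid cells in proportion to a spatial proxy (for agriculture, something like livestock counts per cell); the proxy map and cell counts below are hypothetical:

```python
import numpy as np

def disaggregate(national_total_gg, proxy_grid):
    """Distribute a national emission total over grid cells in proportion
    to a spatial proxy (e.g. livestock numbers per 500 m x 500 m cell)."""
    proxy = np.asarray(proxy_grid, dtype=float)
    weights = proxy / proxy.sum()
    return national_total_gg * weights

# Toy 4 x 4 proxy map standing in for cattle counts per cell (hypothetical).
proxy = np.array([
    [0, 10, 25, 5],
    [0, 40, 60, 0],
    [5, 20, 15, 0],
    [0,  0,  5, 0],
])
grid = disaggregate(150.0, proxy)   # agricultural CH4 total from the abstract
print(grid.round(2))
print("check:", grid.sum())         # sums back to 150 Gg CH4/yr
```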
Abstract:
Copper porphyrins have been recognized as natural constituents of marine sediments only within the past 5 years (Palmer and Baker, 1978, Science 201, 49-51). In that report it was suggested that these pigments may derive from, and be markers for, oxidized terrestrial organic matter redeposited in the marine environment. In the present study we describe the distribution of copper porphyrins in sediments from several north Pacific and Gulf of California DSDP/IPOD sites (Legs 56, 63, 64). These allochthonous pigments have now been found to be accompanied by identical arrays of highly dealkylated nickel etioporphyrins. Evaluation of data from this and past studies clearly reveals a strong carbon-number distribution similarity between coincident Cu and Ni etioporphyrins. This homology match is taken as reflecting a common source for the tetrapyrrole ligands of this population of Cu and Ni chelates. Predepositional generation of these highly dealkylated etioporphyrins is concluded from the occurrence of these pigments in sediments containing essentially all stages of in situ chlorophyll diagenesis (cf. Baker and Louda, 1983); that is, their presence is not regulated by the in situ diagenetic continuum. Thus, the highly dealkylated Cu and Ni etioporphyrins represent an 'allochthonous' background over which 'autochthonous' (viz. marine-produced) chlorophyll derivatives are deposited and are undergoing in situ diagenesis.
Abstract:
We have studied the chemical zoning of plagioclase phenocrysts from the slow-spreading Mid-Atlantic Ridge and the intermediate-spreading rate Costa Rica Rift to obtain the time scales of magmatic processes beneath these ridges. The anorthite content, Mg, and Sr in plagioclase phenocrysts from the Mid-Atlantic Ridge can be interpreted as recording initial crystallisation from a primitive magma (~11 wt% MgO) in an open system. This was followed by crystal accumulation in a mush zone and later entrainment of crystals into the erupted magma. The initial magma crystallised plagioclase more anorthitic than those in equilibrium with any erupted basalt. Evidence that the crystals accumulated in a mush zone comes from both: (1) plagioclase rims that were in equilibrium with a Sr-poor melt requiring extreme differentiation; and (2) different crystals found in the same thin section having different histories. Diffusion modelling shows that crystal residence times in the mush were <140 years, whereas the interval between mush disaggregation and eruption was ≤1.5 years. Zoning of anorthite content and Mg in plagioclase phenocrysts from the Costa Rica Rift shows that they partially or completely equilibrated with a MgO-rich melt (>11 wt%). Partial equilibration in some crystals can be modelled as starting <1 year prior to eruption, but for others longer times are required for complete equilibration. This variety of times is most readily explained if the mixing occurred in a mush zone. None of the plagioclase phenocrysts from the Costa Rica Rift that we studied have Mg contents in equilibrium with their host basalt even at their rims, requiring mixing into a much more evolved magma within days of eruption. In combination these observations suggest that at both intermediate- and slow-spreading ridges: (1) the chemical environment to which crystals are exposed changes on annual to decadal time scales; (2) plagioclase crystals record the existence of melts unlike those erupted; and (3) disaggregation of crystal mush zones appears to precede eruption, providing an efficient mechanism by which evolved interstitial melt can be mixed into erupted basalts.
Abstract:
Late Jurassic-early Cretaceous black shales and an overlying sequence of Albian-Campanian zeolitic claystones from the Falkland Plateau (DSDP/IPOD Leg 71, Site 511) were analyzed for tetrapyrrole pigment type and abundance. The "black shale" sequence was found to be rich in DPEP-series-dominated free-base, nickel (Ni) and, to a lesser extent, vanadyl (V=O) porphyrins. A low level of organic maturity (i.e. precatagenesis) is indicated for these strata, as nickel chelation by free-base porphyrins is only 50-75% complete, proceeding down-hole to 627 meters sub-bottom. Electronic and mass spectral data reveal that the proposed benzo-DPEP (BD) and tetrahydrobenzo-DPEP (THBD) series are present in the free-base and Ni species, as well as in the more usual V=O porphyrin arrays. Highly reducing conditions are suggested by an abundance of the PAH perylene, substantial amounts of the THBD/BD series, and a redox equilibrium between the free-base DPEP and 7,8-dihydro-DPEP series, which exist in a 7:1 molar ratio. The Albian-Campanian claystone strata were found to be tetrapyrrole-poor, and those pigments present were typed as Cu/Ni highly dealkylated (C26 max.) etioporphyrins, thought to be derived via redeposition and oxidation of terrestrial organic matter (OM). Results from the present study are correlated with our past analyses of Jurassic-Cretaceous sediments from Atlantic margins in an effort to relate tetrapyrrole quality and quantity to basin evolution and OM sources in the proto-Atlantic.