916 results for Compactification and String Models
Abstract:
Environmental data are spatial, temporal, and often come with many zeros. In this paper, we included space–time random effects in zero-inflated Poisson (ZIP) and ‘hurdle’ models to investigate haulout patterns of harbor seals on glacial ice. The data consisted of counts, for 18 dates on a lattice grid of samples, of harbor seals hauled out on glacial ice in Disenchantment Bay, near Yakutat, Alaska. A hurdle model is similar to a ZIP model except it does not mix zeros from the binary and count processes. Both models can be used for zero-inflated data, and we compared space–time ZIP and hurdle models in a Bayesian hierarchical model. Space–time ZIP and hurdle models were constructed by using spatial conditional autoregressive (CAR) models and temporal first-order autoregressive (AR(1)) models as random effects in ZIP and hurdle regression models. We created maps of smoothed predictions for harbor seal counts based on ice density, other covariates, and spatio-temporal random effects. For both models predictions around the edges appeared to be positively biased. The linex loss function is an asymmetric loss function that penalizes overprediction more than underprediction, and we used it to correct for prediction bias to get the best map for space–time ZIP and hurdle models.
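The abstract does not write out the loss function; a commonly used LINEX form is L(Δ) = b(e^{aΔ} − aΔ − 1) with Δ = prediction − truth, for which the Bayes-optimal point prediction is −(1/a) log E[e^{−aθ}]. The following is a minimal sketch (not the authors' code) of applying that correction to posterior predictive draws for a single grid cell, with an arbitrary Poisson stand-in for the MCMC output:

```python
import numpy as np

def linex_loss(pred, truth, a=1.0, b=1.0):
    """LINEX loss b*(exp(a*d) - a*d - 1) with d = pred - truth.
    For a > 0, overprediction is penalized exponentially,
    underprediction only roughly linearly."""
    d = pred - truth
    return b * (np.exp(a * d) - a * d - 1.0)

def linex_point_prediction(posterior_samples, a=1.0):
    """Bayes-optimal point prediction under LINEX loss:
    -(1/a) * log E[exp(-a * theta)], estimated from posterior samples,
    computed in a log-sum-exp style for numerical stability."""
    s = np.asarray(posterior_samples, dtype=float)
    m = (-a * s).max()
    return -(np.log(np.exp(-a * s - m).mean()) + m) / a

# Toy usage: posterior predictive draws of the seal count in one grid cell.
rng = np.random.default_rng(0)
draws = rng.poisson(lam=4.0, size=5000)        # stand-in for MCMC output
print(draws.mean())                             # squared-error-optimal prediction
print(linex_point_prediction(draws, a=0.5))     # LINEX-optimal, shrunk downward
```

With a > 0 the LINEX-optimal prediction sits below the posterior mean, which is the direction needed to counter the positive edge bias described above.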
Abstract:
As the field of nanotechnology continues to grow, the development of new nanomaterials with interesting physical and electronic properties, together with improved characterization techniques, will remain vital for the continued improvement of devices and the understanding of nanoscale phenomena. In this dissertation, the chemical vapor deposition synthesis of rare earth (RE) compounds is described in detail. In general, the procedure involves the vaporization of a REClx precursor (RE = Y, La, Ce, Pr, Nd, Sm, Gd, Tb, Dy, Ho) in the presence of hydride-phase precursors such as decaborane and ammonia at high temperatures and low pressures. The vapor-liquid-solid mechanism was used in combination with the chemical vapor deposition process to synthesize single-crystalline rare earth hexaboride nanostructures. The crystallographic orientation of the as-synthesized rare earth hexaboride nanostructures and gadolinium nitride thin films was controlled by judicious choice of growth substrates and modeled by analyzing X-ray powder diffraction patterns and crystallographic models. The rare earth hexaboride nanostructures were then incorporated into two existing technologies to enhance their characterization capabilities. First, rare earth hexaboride nanowires were used as a test material for the development of a TEM-based local electrode atom probe tomography (LEAP) technique, which provided some of the first quantitative compositional information on the rare earth hexaboride systems. Second, owing to the rigidity and excellent conductivity of the rare earth hexaborides, nanostructures were grown onto tungsten wires to develop robust, oxidation-resistant nanomanipulator electronic probes for semiconductor device failure analysis.
Abstract:
Lessons from around the world. Why does early childhood education matter? Why the controversy about public support for early childhood education? What process or system should be used to determine what works in early education? Can the same process be used to improve services? What is the role of government? Alternatives:
1. Consumers should determine… What happens when private choices drive the market for early childhood services? Observed quality of care in four Midwestern states; parent data: "All things considered, how would you grade the quality of the care your child is receiving from his/her current caregiver?" The role of government: What is a Quality Rating System? Ten states have implemented statewide systems (e.g., Colorado, Kentucky, Oklahoma, North Carolina). Findings.
2. Objective science should determine… Firm findings from empirical research.
3. Something else is needed: some differences between Italian and American models; teacher action research (and documentation) from a Reggio-inspired preschool in South Korea by Misuk Kim; teacher action research at the Ruth Staples CDL.
Can we now answer our opening questions? What process or system should be used to determine what is best for young children? Can the same process be used to improve the quality of services? Conclusions: the free market does not work well to determine quality in early education and care; licensing, accreditation, and quality rating systems can help improve the market; empirical research is useful for measuring what works; teacher action research (reflective practice) is necessary for fostering continuous quality improvement. The tower of quality.
Abstract:
Background: Lynch syndrome (LS) is the most common form of inherited predisposition to colorectal cancer (CRC), accounting for 2-5% of all CRC. LS is an autosomal dominant disease characterized by mutations in the mismatch repair genes mutL homolog 1 (MLH1), mutS homolog 2 (MSH2), postmeiotic segregation increased 1 (PMS1), postmeiotic segregation increased 2 (PMS2), and mutS homolog 6 (MSH6). Mutation risk prediction models can be incorporated into clinical practice, facilitating the decision-making process and identifying individuals for molecular investigation. This is especially important in countries with limited economic resources. This study aims to evaluate the sensitivity and specificity of five predictive models for germline mutations in mismatch repair genes in a sample of individuals with suspected Lynch syndrome. Methods: Blood samples from 88 patients were analyzed by sequencing the MLH1, MSH2, and MSH6 genes. The probability of detecting a mutation was calculated using the PREMM, Barnetson, MMRpro, Wijnen, and Myriad models. To evaluate the sensitivity and specificity of the models, receiver operating characteristic (ROC) curves were constructed. Results: Among the 88 patients included in this analysis, 31 mutations were identified: 16 in the MSH2 gene, 15 in the MLH1 gene, and no pathogenic mutations in the MSH6 gene. The AUC for the PREMM (0.846), Barnetson (0.850), MMRpro (0.821), and Wijnen (0.807) models did not differ significantly. The Myriad model presented a lower AUC (0.704) than the four other models evaluated. At a threshold of >= 5%, the models' sensitivity varied between 1 (Myriad) and 0.87 (Wijnen) and specificity ranged from 0 (Myriad) to 0.38 (Barnetson). Conclusions: The Barnetson, PREMM, MMRpro, and Wijnen models present similar AUC. The AUC of the Myriad model is statistically inferior to the four other models.
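As a hedged illustration (not the study's code or data), the reported AUC, sensitivity, and specificity can be computed for any such risk model from predicted mutation probabilities and observed carrier status, e.g. with scikit-learn; the inputs below are simulated stand-ins:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical inputs: predicted mutation probability from one model (e.g. PREMM)
# and observed carrier status (1 = pathogenic mutation found, 0 = none).
rng = np.random.default_rng(1)
carrier = rng.integers(0, 2, size=88)
predicted_risk = np.clip(0.3 * carrier + rng.beta(2, 5, size=88), 0, 1)

auc = roc_auc_score(carrier, predicted_risk)
fpr, tpr, thresholds = roc_curve(carrier, predicted_risk)

# Sensitivity and specificity at a fixed referral threshold (here >= 5%),
# mirroring the abstract's comparison of the five models.
threshold = 0.05
pred_pos = predicted_risk >= threshold
sensitivity = (pred_pos & (carrier == 1)).sum() / (carrier == 1).sum()
specificity = (~pred_pos & (carrier == 0)).sum() / (carrier == 0).sum()
print(f"AUC={auc:.3f} sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```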
Abstract:
We present two-dimensional (2D) two-particle angular correlations measured with the STAR detector on relative pseudorapidity η and azimuth φ for charged particles from Au-Au collisions at √s_NN = 62 and 200 GeV with transverse momentum p_t ≥ 0.15 GeV/c, |η| ≤ 1, and 2π in azimuth. Observed correlations include a same-side (relative azimuth < π/2) 2D peak, a closely related away-side azimuth dipole, and an azimuth quadrupole conventionally associated with elliptic flow. The same-side 2D peak and away-side dipole are explained by semihard parton scattering and fragmentation (minijets) in proton-proton and peripheral nucleus-nucleus collisions. Those structures follow N-N binary-collision scaling in Au-Au collisions until midcentrality, where a transition to a qualitatively different centrality trend occurs within one 10% centrality bin. Above the transition point the number of same-side and away-side correlated pairs increases rapidly relative to binary-collision scaling, the η width of the same-side 2D peak also increases rapidly (η elongation), and the φ width actually decreases significantly. Those centrality trends are in marked contrast with conventional expectations for jet quenching in a dense medium. The observed centrality trends are compared to perturbative QCD predictions computed in HIJING, which serve as a theoretical baseline, and to the expected trends for semihard parton scattering and fragmentation in a thermalized opaque medium predicted by theoretical calculations and phenomenological models. We are unable to reconcile a semihard parton scattering and fragmentation origin for the observed correlation structure and centrality trends with heavy-ion collision scenarios that invoke rapid parton thermalization. If the collision system turns out to be effectively opaque to few-GeV partons, the present observations would be inconsistent with the minijet picture discussed here. DOI: 10.1103/PhysRevC.86.064902
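As a rough sketch of how the underlying observable is built (not STAR's analysis code; binning, acceptance, and the mixed-event normalization are only indicated), a 2D correlation on (Δη, Δφ) is obtained by histogramming pair differences within each event and comparing with a mixed-event reference:

```python
import numpy as np

def pair_histogram(eta, phi, bins_eta, bins_phi):
    """Histogram of (delta-eta, delta-phi) for all particle pairs in one event."""
    deta = eta[:, None] - eta[None, :]
    dphi = phi[:, None] - phi[None, :]
    dphi = (dphi + np.pi) % (2 * np.pi) - np.pi      # wrap to (-pi, pi]
    mask = ~np.eye(len(eta), dtype=bool)             # drop self-pairs
    h, _, _ = np.histogram2d(deta[mask], dphi[mask], bins=[bins_eta, bins_phi])
    return h

# Toy events: uncorrelated tracks within |eta| <= 1 and full 2*pi azimuth.
rng = np.random.default_rng(2)
bins_eta = np.linspace(-2, 2, 25)
bins_phi = np.linspace(-np.pi, np.pi, 25)
same = np.zeros((24, 24))
for _ in range(200):
    n = rng.poisson(50)
    eta = rng.uniform(-1, 1, n)
    phi = rng.uniform(-np.pi, np.pi, n)
    same += pair_histogram(eta, phi, bins_eta, bins_phi)

# In a real analysis the same-event histogram is divided by a mixed-event
# reference to remove acceptance effects before fitting peak, dipole, and
# quadrupole components of the kind described in the abstract.
```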
Abstract:
Effects of roads on wildlife and its habitat have been measured using metrics, such as the nearest road distance, road density, and effective mesh size. In this work we introduce two new indices: (1) Integral Road Effect (IRE), which measured the sum effects of points in a road at a fixed point in the forest; and (2) Average Value of the Infinitesimal Road Effect (AVIRE), which measured the average of the effects of roads at this point. IRE is formally defined as the line integral of a special function (the infinitesimal road effect) along the curves that model the roads, whereas AVIRE is the quotient of IRE by the length of the roads. Combining tools of ArcGIS software with a numerical algorithm, we calculated these and other road and habitat cover indices in a sample of points in a human-modified landscape in the Brazilian Atlantic Forest, where data on the abundance of two groups of small mammals (forest specialists and habitat generalists) were collected in the field. We then compared through the Akaike Information Criterion (AIC) a set of candidate regression models to explain the variation in small mammal abundance, including models with our two new road indices (AVIRE and IRE) or models with other road effect indices (nearest road distance, mesh size, and road density), and reference models (containing only habitat indices, or only the intercept without the effect of any variable). Compared to other road effect indices, AVIRE showed the best performance to explain abundance of forest specialist species, whereas the nearest road distance obtained the best performance to generalist species. AVIRE and habitat together were included in the best model for both small mammal groups, that is, higher abundance of specialist and generalist small mammals occurred where there is lower average road effect (less AVIRE) and more habitat. Moreover, AVIRE was not significantly correlated with habitat cover of specialists and generalists differing from the other road effect indices, except mesh size, which allows for separating the effect of roads from the effect of habitat on small mammal communities. We suggest that the proposed indices and GIS procedures could also be useful to describe other spatial ecological phenomena, such as edge effect in habitat fragments. (C) 2012 Elsevier B.V. All rights reserved.
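The abstract defines IRE as the line integral of an "infinitesimal road effect" function along the curves that model the roads, and AVIRE as IRE divided by total road length. Below is a minimal numerical sketch of that definition; the distance-decay kernel and coordinates are assumed placeholders, not the authors' fitted effect function or GIS workflow:

```python
import numpy as np

def road_effect_indices(point, road_vertices, effect, n_sub=10):
    """Approximate IRE = line integral of `effect` along a polyline road,
    and AVIRE = IRE / road length, at a fixed forest point.
    `road_vertices`: (k, 2) array of polyline vertices (projected coordinates).
    `effect`: function of the distance between a road point and `point`."""
    ire, length = 0.0, 0.0
    for a, b in zip(road_vertices[:-1], road_vertices[1:]):
        seg = b - a
        seg_len = np.hypot(*seg)
        # midpoint rule on n_sub sub-segments of this polyline segment
        t = (np.arange(n_sub) + 0.5) / n_sub
        pts = a + t[:, None] * seg
        d = np.hypot(*(pts - point).T)
        ire += effect(d).sum() * (seg_len / n_sub)
        length += seg_len
    return ire, ire / length

# Assumed example kernel: effect decaying exponentially with distance (meters).
decay = lambda d: np.exp(-d / 500.0)
road = np.array([[0.0, 0.0], [1000.0, 0.0], [2000.0, 500.0]])
ire, avire = road_effect_indices(np.array([500.0, 300.0]), road, decay)
print(ire, avire)
```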
Abstract:
Dispersion of photoluminescent rare earth metal complexes in polymer matrices is of great interest due to the possibility of avoiding saturation of the photoluminescent signal. The possibility of using a natural ion-conducting polymer matrix was investigated in this study. Samples of agar-based electrolytes containing europium picrate were prepared and characterized by physical and chemical analyses. The FTIR spectra indicated strong interaction of agar O-H and 3,6-anhydro-galactose C-O groups with glycerol and europium picrate. The DSC analyses revealed no glass transition temperature of the samples in the −60 to 250 °C range. Thermogravimetry (TG) showed that the samples are thermally stable up to 180 °C. The membranes were subjected to ionic conductivity measurements, which gave values of 2.6 × 10⁻⁶ S/cm for the samples with acetic acid and 1.6 × 10⁻⁵ S/cm for the samples without acetic acid. Moreover, the temperature-dependent ionic conductivity measurements revealed either Arrhenius or VTF behavior of the conductivity, depending on the sample. Surface visualization through scanning electron microscopy (SEM) demonstrated good uniformity. The samples were also applied in small electrochromic devices and showed good electrochemical stability. The present work confirmed that these materials may perform as satisfactory multifunctional component layers in the field of electrochemical devices. (C) 2012 Elsevier B.V. All rights reserved.
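For reference, the two conductivity models named above are usually written as follows (standard textbook forms with the usual symbol meanings; no parameter values are given in the abstract):

```latex
% Arrhenius-type temperature dependence of ionic conductivity
\sigma(T) = \sigma_0 \exp\!\left(-\frac{E_a}{k_B T}\right)

% Vogel--Tammann--Fulcher (VTF) dependence, with pseudo-activation energy B
% and ideal glass transition temperature T_0
\sigma(T) = \sigma_0\, T^{-1/2} \exp\!\left(-\frac{B}{T - T_0}\right)
```

Arrhenius behavior implies a temperature-independent activation energy E_a, whereas the VTF form reflects ion transport coupled to polymer segmental motion.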
Abstract:
Objective: To validate the 2000 Bernstein-Parsonnet (2000BP) and additive EuroSCORE (ES) models to predict mortality in patients who underwent coronary bypass surgery and/or heart valve surgery at the Heart Institute, University of Sao Paulo (InCor/HC-FMUSP). Methods: A prospective observational design. We analyzed 3000 consecutive patients who underwent coronary bypass surgery and/or heart valve surgery between May 2007 and July 2009 at the InCor/HC-FMUSP. Mortality was calculated with the 2000BP and ES models. The agreement between estimated and observed mortality was assessed with calibration and discrimination tests. Results: There were significant differences in the prevalence of risk factors between the study population and the 2000BP and ES reference populations. Patients were stratified into five risk groups for 2000BP and three for the ES. In the validation of the models, the ES showed good calibration (P = 0.396), whereas the 2000BP proved inadequate (P = 0.047). In discrimination, the area under the ROC curve was good for both models: ES (0.79) and 2000BP (0.80). Conclusion: In this validation, the 2000BP proved questionable and the ES appropriate to predict mortality in patients who underwent coronary bypass surgery and/or heart valve surgery at the InCor/HC-FMUSP.
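A hedged sketch of the calibration step described above (grouping patients by predicted risk and comparing expected with observed deaths, in the spirit of a Hosmer-Lemeshow test); the data and grouping below are invented, not the study's:

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(predicted_risk, died, n_groups=10):
    """Group patients by deciles of predicted mortality and compare
    observed vs. expected deaths with a chi-square statistic."""
    order = np.argsort(predicted_risk)
    groups = np.array_split(order, n_groups)
    stat = 0.0
    for g in groups:
        expected = predicted_risk[g].sum()
        observed = died[g].sum()
        n = len(g)
        variance = expected * (1 - expected / n)
        stat += (observed - expected) ** 2 / max(variance, 1e-9)
    p_value = chi2.sf(stat, df=n_groups - 2)
    return stat, p_value

# Hypothetical data: EuroSCORE-style predicted risk and observed mortality.
rng = np.random.default_rng(3)
risk = rng.beta(1.5, 20, size=3000)          # predicted probabilities
outcome = (rng.random(3000) < risk).astype(int)
print(hosmer_lemeshow(risk, outcome))
```

A non-significant p-value (as with the ES above) indicates no detectable lack of calibration; discrimination is then judged separately with the ROC AUC.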
Abstract:
Isoprene is emitted from many terrestrial plants at high rates, accounting for an estimated one-third of annual global volatile organic compound emissions from all anthropogenic and biogenic sources combined. Through rapid photooxidation reactions in the atmosphere, isoprene is converted to a variety of oxidized hydrocarbons, providing higher-order reactants for the production of organic nitrates and tropospheric ozone, reducing the availability of oxidants for the breakdown of radiatively active trace gases such as methane, and potentially producing hygroscopic particles that act as effective cloud condensation nuclei. However, the functional basis for plant production of isoprene remains elusive. It has been hypothesized that, within the cell, isoprene mitigates oxidative damage during the stress-induced accumulation of reactive oxygen species (ROS), but the products of isoprene-ROS reactions in plants have not been detected. Using pyruvate-2-13C leaf and branch feeding and individual branch and whole-mesocosm flux studies, we present evidence that isoprene (i) is oxidized to methyl vinyl ketone and methacrolein (iox) in leaves and that iox/i emission ratios increase with temperature, possibly due to an increase in ROS production under high temperature and light stress. In a primary rainforest in Amazonia, we inferred significant within-plant isoprene oxidation (despite the strong masking effect of simultaneous atmospheric oxidation) from its influence on the vertical distribution of iox uptake fluxes, which were shifted to low isoprene-emitting regions of the canopy. These observations suggest that carbon investment in isoprene production is larger than that inferred from emissions alone and that models of tropospheric chemistry and biota-chemistry-climate interactions should incorporate isoprene oxidation within both the biosphere and the atmosphere, with potential implications for better understanding both the oxidizing power of the troposphere and forest responses to climate change.
Abstract:
Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.
Abstract:
During the last three decades, several predictive models have been developed to estimate the somatic production of macroinvertebrates. Although the models have been evaluated for their ability to assess the production of macrobenthos in different marine ecosystems, these approaches have not been applied specifically to sandy beach macrofauna and may not be directly applicable to this transitional environment. Hence, in this study, a broad literature review of sandy beach macrofauna production was conducted and estimates obtained with cohort-based and size-based methods were collected. The performance of nine models in estimating the production of individual populations from the sandy beach environment, evaluated for all taxonomic groups combined and for individual groups separately, was assessed, comparing the production predicted by the models to the estimates obtained from the literature (observed production). Most of the models overestimated population production compared to observed production estimates, whether for all populations combined or more specific taxonomic groups. However, estimates by two models developed by Cusson and Bourget provided best fits to measured production, and thus represent the best alternatives to the cohort-based and size-based methods in this habitat. The consistent performance of one of these Cusson and Bourget models, which was developed for the macrobenthos of sandy substrate habitats (C&B-SS), shows that the performance of a model does not depend on whether it was developed for a specific taxonomic group. Moreover, since some widely used models (e.g., the Robertson model) show very different responses when applied to the macrofauna of different marine environments (e.g., sandy beaches and estuaries), prior evaluation of these models is essential.
Abstract:
In this Letter we report the first results on π±, K±, p, and p̄ production at midrapidity (|y| < 0.5) in central Pb-Pb collisions at √s_NN = 2.76 TeV, measured by the ALICE experiment at the LHC. The p_T distributions and yields are compared to previous results at √s_NN = 200 GeV and expectations from hydrodynamic and thermal models. The spectral shapes indicate a strong increase of the radial flow velocity with √s_NN, which in hydrodynamic models is expected as a consequence of the increasing particle density. While the K/π ratio is in line with predictions from the thermal model, the p/π ratio is found to be lower by a factor of about 1.5. This deviation from thermal model expectations is still to be understood.
Abstract:
The objective of this paper was to model variations in test-day milk yields of first lactations of Holstein cows by random regression (RR) using B-spline functions and Bayesian inference, in order to fit adequate and parsimonious models for the estimation of genetic parameters. The authors used 152,145 test-day milk yield records from 7317 first lactations of Holstein cows. The model included additive genetic, permanent environmental, and residual random effects. In addition, contemporary group and linear and quadratic effects of the age of cow at calving were included as fixed effects. The authors modeled the average lactation curve of the population with a fourth-order orthogonal Legendre polynomial. They concluded that a cubic B-spline with seven random regression coefficients for both the additive genetic and permanent environmental effects was the best model according to residual mean square and residual variance estimates. Moreover, they suggested that a lower-order model (a quadratic B-spline with seven random regression coefficients for both random effects) could be adopted because it yielded practically the same genetic parameter estimates with greater parsimony. (C) 2012 Elsevier B.V. All rights reserved.
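A minimal sketch of the basis construction described above (a fourth-order Legendre basis for the fixed mean lactation curve and a cubic B-spline basis with seven coefficients for the random regressions); the test-day range, standardization, and knot placement are illustrative assumptions rather than the paper's exact setup:

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.interpolate import BSpline

# Days in milk standardized to [-1, 1] for the Legendre basis.
dim = np.arange(5, 306)                      # assumed test days 5..305
x = 2 * (dim - dim.min()) / (dim.max() - dim.min()) - 1

# Fixed mean curve: fourth-order orthogonal Legendre polynomials.
legendre_basis = legendre.legvander(x, 4)    # shape (n_days, 5)

def bspline_basis(x, n_coef, degree):
    """Design matrix of a clamped B-spline basis with `n_coef` coefficients,
    built by evaluating one unit coefficient vector at a time."""
    n_knots = n_coef + degree + 1
    inner = np.linspace(x.min(), x.max(), n_knots - 2 * degree)
    knots = np.concatenate([[x.min()] * degree, inner, [x.max()] * degree])
    cols = []
    for i in range(n_coef):
        c = np.zeros(n_coef)
        c[i] = 1.0
        cols.append(BSpline(knots, c, degree, extrapolate=False)(x))
    return np.nan_to_num(np.column_stack(cols))

# Cubic B-spline with seven coefficients for the additive-genetic and
# permanent-environment random regressions, as in the abstract.
random_basis = bspline_basis(x, n_coef=7, degree=3)
print(legendre_basis.shape, random_basis.shape)   # (301, 5) (301, 7)
```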
Abstract:
Purpose: Refractory frontal lobe epilepsy (FLE) remains one of the most challenging surgically remediable epilepsy syndromes. Nevertheless, independent predictors and predictive models of postsurgical seizure outcome remain poorly explored in FLE. Methods: We retrospectively analyzed data from 70 consecutive patients with refractory FLE submitted to surgical treatment at our center from July 1994 to December 2006. Univariate results were submitted to logistic regression models and Cox proportional hazards regression to identify isolated risk factors for poor surgical results and to construct predictive models for surgical outcome in FLE. Results: Of the 70 patients submitted to surgery, 45 patients (64%) had a favorable outcome and 37 (47%) became seizure free. Isolated risk factors for poor surgical outcome, expressed as hazard ratios (H.R.), were duration of epilepsy (H.R. = 4.2; 95% C.I. = 1.5-11.7; p = 0.006), ictal EEG recruiting rhythm (H.R. = 2.9; 95% C.I. = 1.1-7.7; p = 0.033), normal MRI (H.R. = 4.8; 95% C.I. = 1.4-16.6; p = 0.012), and MRI lesion involving eloquent cortex (H.R. = 3.8; 95% C.I. = 1.2-12.0; p = 0.021). Based on these variables and using a logistic regression model, we constructed a model that correctly predicted long-term surgical outcome in up to 80% of patients. Conclusion: Among independent risk factors for postsurgical seizure outcome, epilepsy duration is a potentially modifiable factor that could impact surgical outcome in FLE. Early diagnosis, presence of an MRI lesion not involving eloquent cortex, and ictal EEG without recruiting rhythm independently predicted favorable outcome in this series. (C) 2011 Elsevier B.V. All rights reserved.
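A hedged sketch of the survival-modeling step (Cox proportional hazards producing hazard ratios like those quoted above); the variable names and data are invented stand-ins, and lifelines is simply one convenient implementation, not necessarily the software used in the study:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical dataset: one row per patient, follow-up in months,
# event = seizure recurrence after surgery.
rng = np.random.default_rng(4)
n = 70
df = pd.DataFrame({
    "epilepsy_duration_yrs": rng.integers(1, 40, n),
    "ictal_recruiting_rhythm": rng.integers(0, 2, n),
    "normal_mri": rng.integers(0, 2, n),
    "lesion_in_eloquent_cortex": rng.integers(0, 2, n),
    "followup_months": rng.exponential(60, n),
    "recurrence": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_months", event_col="recurrence")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs, as reported above
```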
Abstract:
System thinking allows companies to use indicators of subjective constructs, such as recursiveness, cause-effect relationships, and autonomy, in performance evaluation. Thus, the question that motivates this paper is: are Brazilian companies searching for new performance measurement and evaluation models based on system thinking? The study investigates existing models, looking for system-thinking roots in their frameworks. It was both exploratory and descriptive, based on a multiple-case strategy covering four case studies in the chemical sector. The findings showed that the organizational models have some characteristics that can be related to system thinking, such as system control and communication. Complexity and autonomy are poorly formalized by the companies. Overall, the data suggest, within their context, that system thinking is adequate for organizational performance evaluation but remains distant from management practice.