717 results for: China, Capital structure, Dynamic panel data models, Listed property company


Abstract:

To find the pathologic cause of dental fluorosis in children in southwestern China, the diet structure before the age of 6 and the prevalence rate of dental fluorosis (DF) of 405 children were investigated, and the fluorine and arsenic contents of several materials were determined. The DF prevalence rate of children living on roasted corn before the age of 6 is 100%, with nearly 95% having mild to severe DF, while that of children living on non-roasted corn or rice is less than 5%, with all cases being very mild. The average fluorine and arsenic concentrations in roasted corn are 20.26 mg/kg and 0.249 mg/kg, about 16 times and 35 times higher than in non-roasted corn, respectively. The average fluorine concentration is 78 mg/kg in coal, 1116 mg/kg in binder clay and 313 mg/kg in briquette (coal mixed with clay). The average arsenic concentration is 5.83 mg/kg in coal, 20.94 mg/kg in binder clay and 8.52 mg/kg in briquette. Living on roasted corn and chili is the main pathologic cause of endemic fluorosis in southwestern China, and the main source of the fluorine and arsenic contamination of the roasted corn and chili is the briquette of coal and binder clay.

Abstract:

The contents and isotopic compositions of different sulphur species in pore water and solid phases have been examined in five sediment cores taken from muddy sediment regions in the Yellow Sea and the East China Sea. Relationships among these data have been investigated, in combination with the morphology of mineral pyrite and the organic matter content, in order to work out the diagenetic behaviour of sulphur species at the early stage of diagenesis in modern marine sediments and the origin of pyrite formation.

Abstract:

The lithosphere behaves as an elastic thin plate overlying a weak fluid. When loaded by topography, it deflects to compensate the topography and isostasy is achieved. The response of the plate is often characterized by the flexural rigidity or, equivalently, by the effective elastic thickness (EET). The relationships between gravity and topography have commonly been employed to investigate quantitatively the isostatic compensation of the lithosphere. Cross-spectral techniques have been developed to estimate the admittance and coherence between gravity and topography. The observed admittance and coherence functions, when compared with theoretical admittance and coherence computed for various models and parameters, provide an estimate of the effective elastic thickness. For the coherence, the wavelength at which it drops from values approaching unity at long wavelengths to values approaching zero at short wavelengths is a measure of the rigidity of the lithosphere. This research takes advantage of the high-resolution gravity and topography data available for the Erdos region. Using the coherence technique, we have estimated the effective elastic thickness of this region; subsurface loads are also considered in the calculation. Once the admittance between topography and gravity is obtained, the topography can be filtered to give an estimate of the gravity anomaly, which is very useful where gravity data are scarce. Several other regions of China have also been selected to investigate the effective elastic thickness of the lithosphere. We compare those results with lithosphere thicknesses obtained through seismological techniques and with the heat flow of each region. We find that the effective elastic thickness is always smaller than the seismogenic thickness, because the EET reflects only the upper elastic part of the lithosphere. We also find some degree of correlation between effective elastic thickness and heat flow.
This suggests that the EET is probably controlled by the thermal state of the lithosphere and correlates with its tectonic age. Estimates of the effective elastic thickness of the lithosphere can therefore help to investigate geophysical features of active continental tectonics.
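The segment-averaged coherence estimate at the heart of this technique can be sketched in a few lines. This is an illustrative pure-Python version with a naive DFT and a synthetic series (all names and numbers are invented for the demo), not the processing chain used in the study:

```python
import cmath, math, random

def rdft(x):
    """Naive real-input DFT (first half of the spectrum); fine for the
    short demo segments used here."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * f * t / n) for t in range(n))
            for f in range(n // 2 + 1)]

def coherence(x, y, nseg=8):
    """Segment-averaged squared coherence between two equal-length series.
    Averaging over segments is essential: a single periodogram gives a
    coherence identically equal to one."""
    n = len(x) // nseg
    nf = n // 2 + 1
    pxx, pyy, pxy = [0.0] * nf, [0.0] * nf, [0j] * nf
    for k in range(nseg):
        xs = rdft(x[k * n:(k + 1) * n])
        ys = rdft(y[k * n:(k + 1) * n])
        for f in range(nf):
            pxx[f] += abs(xs[f]) ** 2
            pyy[f] += abs(ys[f]) ** 2
            pxy[f] += xs[f] * ys[f].conjugate()
    return [abs(pxy[f]) ** 2 / (pxx[f] * pyy[f]) for f in range(nf)]

# Synthetic example: "gravity" shares the long-wavelength part of
# "topography" but is noise-dominated at short wavelengths.
rng = random.Random(42)
topo = [rng.gauss(0, 1) for _ in range(1024)]
w = 16  # moving-average window keeps long wavelengths only
smooth = [sum(topo[max(0, i - w + 1):i + 1]) / (i + 1 - max(0, i - w + 1))
          for i in range(len(topo))]
grav = [s + rng.gauss(0, 0.1) for s in smooth]
coh = coherence(topo, grav)  # high at low frequencies, low at high ones
```

In a real EET study the wavelength of the roll-off from high to low coherence, not the absolute values, is what constrains the plate rigidity.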

Abstract:

With the growing development and refinement of reservoir description technology, its research achievements have played an increasingly important role in mature oilfields in recent years. Reservoir description quantitatively describes, characterizes and predicts all kinds of reservoir properties in 3D space. This thesis takes the Banbei block reservoir as its object, studies the reservoir properties and residual-oil distribution characteristics of a gravity-flow reservoir, and identifies the potential adjustment directions for reservoir development. The main achievements are as follows. Through fine stratigraphic correlation, the classification of layers and single sands of the main pay zones in the Banbei block is established, and classifying methods for sedimentary units in gravity-flow reservoirs, characterized by picked cyclical marker beds, are formed. On the basis of comprehensive logging evaluation, the depositional characteristics of the Banbei block are studied and classifying methods for sedimentary microfacies in gravity-flow reservoirs are described. The sedimentary background of the main oil layers in the Banbei block is an open lake with shallow water, belonging to a lacustrine underwater gravity-flow depositional system. The main microfacies types are underwater watercourse, watercourse side-wing, underwater floodplain, inter-watercourse, and lacustrine mud; the reservoir sands are mainly underwater watercourse sands. Influenced by the distribution characteristics of the gravity-flow underwater watercourses, the sand bodies in plan view are mainly stripe-, finger- and tongue-shaped. The sand distribution shows obvious splitting, with sands overlapping each other. Through comprehensive analysis of lithologic data, logging parameters and dynamic production data, the research threads and methods for reservoir heterogeneity are refined. The depositional characteristics of the gravity-flow underwater watercourses in the Banbei block determine its high reservoir heterogeneity.
Macroscopic heterogeneity is studied at many scales, including the layer scale, the single-sand scale, the in-situ scale, the distribution of interlayer types, the interlayer scale, and heterogeneity in plan view; thus the heterogeneous characteristics of the reservoir are thoroughly analyzed. Through microscopic study of the reservoir, the types of pore structure and related parameters are determined. From the analysis of dynamic production data, the response to reservoir heterogeneity during waterflood development and its internal controlling factors are further revealed. Starting from the concept and classification methods of flow units, a clustering classification that better meets production requirements is formed; the flow units of the Banbei block can be classified into four types, of which, according to comprehensive evaluation, the first and second types have better percolation and storage capability. A research thread combining 3D model building and reservoir numerical simulation as an integrated whole is adopted, and the types and characteristics of the residual-oil distribution are determined. Residual oil in the Banbei block is mainly distributed at sand boundaries, near faults, in areas with imperfect injection-production well patterns, in undeveloped sands, and in vertically poorly developed layers. On the basis of this comprehensive reservoir study, the threads and methods for improving the development of a reservoir with high water cut, high recovery percentage and serious heterogeneity are established. The overall waterflood development of the Banbei block reservoir is good; although its water cut and recovery percentage are relatively high, there is still potential to develop. According to the depositional characteristics of the gravity flow and the actual production situation, effective means of further improving the development level are as follows: drill new wells in areas rich in residual oil; where the injection-production well pattern is complete, implement comprehensive measures such as increasing liquid discharge, cyclic waterflooding and changing the fluid flow direction; improve water quality; and enhance the displacement efficiency of flooding.
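The flow-unit clustering step described above can be illustrated with a small k-means sketch. The (porosity, log-permeability) samples and the deterministic farthest-point seeding are assumptions made for this demo; the thesis does not specify its clustering inputs or algorithm:

```python
def dist2(a, b):
    """Squared Euclidean distance between two feature tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def farthest_point_init(points, k):
    """Deterministic seeding: start at the first sample, then repeatedly
    take the point farthest from all chosen centres."""
    centres = [points[0]]
    while len(centres) < k:
        centres.append(max(points, key=lambda p: min(dist2(p, c) for c in centres)))
    return centres

def kmeans(points, k, iters=20):
    """Plain Lloyd iterations; returns a label per point and the centres."""
    centres = farthest_point_init(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: dist2(p, centres[c])) for p in points]
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centres[c] = tuple(sum(col) / len(members) for col in zip(*members))
    return labels, centres

# Hypothetical (porosity %, log10 permeability) samples forming four
# quality classes, mimicking the four flow-unit types in the text.
samples = [(8, 0.1), (9, 0.2), (8.5, 0.15),
           (15, 1.0), (16, 1.1), (15.5, 0.9),
           (22, 2.0), (23, 2.1), (22.5, 1.9),
           (30, 3.0), (31, 3.1), (30.5, 2.9)]
labels, centres = kmeans(samples, 4)
```

With well-separated classes, each cluster corresponds to one flow-unit type; the clusters with the highest porosity/permeability centres play the role of the first and second types above.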

Abstract:

Spatial population data obtained through the pixeling method make many related studies more convenient. However, the limited methods of precision analysis have prevented the spread of spatial distribution methods and hampered the application of spatial population data. This paper systematically analyzes the different aspects of spatial population data precision and recalculates the data with a reformed method, which makes a breakthrough for the spread of the pixeling method and provides support and reference for the application of spatial population data. The paper consists of the following parts: (1) characteristics of the error; (2) origins of the error; (3) improvement of the calculating methods for spatial population data. First, based on the analysis of the error traits, two aspects of spatial population data precision are characterized and analyzed: the numerical character and the spatial distribution character. The latter, which receives greater emphasis in this paper, is depicted at two spatial scales: county and town. It is essential to this research that spatial distribution is treated as being as important as numerical value when analyzing the error of spatially distributed data. The results illustrate that the spatial population data error appears spatially in groups, although it is random in terms of data statistics; this shows that there is a spatial systematic error. Secondly, this paper establishes and validates the linear correlation between residential land area (taken from the 1:50000 map and treated as the real area) and population. It also analyzes in detail the relationship between the residential land area obtained from the land-use map and the population at three different spatial scales (village, town and county), gives a quantitative description of the variation of residential density in different topographic environments, and then analyzes the residential distribution traits and precision.
Considering the above, it reaches the conclusion that the error of the spatially distributed population is caused by a series of factors, such as the compactness of settlements, the loss of residential land, and urban population density. Finally, the paper improves the method of pixeling the population data with the help of the analysis of the error characteristics and causes. It performs a 2-class regionalization based on the 1-class regionalization of China and re-sorts the residential data from the land-use map. With the aid of GIS and the comprehensive analysis of various data sources, it constructs models in each 2-class district to calculate the spatial population data. LinYi Region is selected as the study area; the spatially distributed population is calculated there and its precision analyzed, showing that the new spatially distributed population data are much improved. This is fundamental research: it adopts large amounts of data of different types and contains many figures to support convincing and detailed conclusions.
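The core allocation step of such a pixeling method, distributing a county total over pixels in proportion to residential land area, can be sketched as follows (the county figures and the optional density weight are hypothetical, not the paper's model):

```python
def disaggregate(county_pop, pixel_res_area, density_weight=None):
    """Allocate a county population total to pixels in proportion to
    residential land area, optionally modulated by a per-pixel density
    weight (e.g. from a regional density model). The county total is
    preserved exactly, so aggregating the pixels back recovers it."""
    w = list(pixel_res_area)
    if density_weight:
        w = [a * d for a, d in zip(w, density_weight)]
    total = sum(w)
    return [county_pop * x / total for x in w]

# Hypothetical county of 120,000 people spread over four pixels
# (areas in km^2 of residential land per pixel; 0.0 = no settlement).
pop = disaggregate(120_000, [0.0, 2.5, 5.0, 2.5])
```

The 2-class district models described in the text would supply the per-district weights; the allocation mechanics stay the same.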

Abstract:

The aim of this research, which focused on the Irish adult population, was to generate information for policymakers by applying statistical analyses and current technologies to oral health administrative and survey databases. Objectives included identifying socio-demographic influences on oral health and the utilisation of dental services, comparing epidemiologically-estimated dental treatment need with treatment provided, and investigating the potential of a dental administrative database to provide information on the utilisation of services and the volume and types of treatment provided over time. Information was extracted from the claims databases for the Dental Treatment Benefit Scheme (DTBS) for employed adults and the Dental Treatment Services Scheme (DTSS) for less-well-off adults, the National Surveys of Adult Oral Health, and the 2007 Survey of Lifestyle Attitudes and Nutrition in Ireland. Factors associated with utilisation and retention of natural teeth were analysed using count data models and logistic regression. The chi-square test and Student's t-test were used to compare epidemiologically-estimated need in a representative sample of adults with treatment provided. Differences were found in dental care utilisation and tooth retention by socio-economic status. An analysis of the five-year utilisation behaviour of a 2003 cohort of DTBS dental attendees revealed that age and being female were positively associated with visiting annually and with the number of treatments. The number of adults using the DTBS increased, and the mean number of treatments per patient decreased, between 1997 and 2008. As a percentage of overall treatments, restorations, dentures and extractions decreased, while prophylaxis increased. Differences were found between epidemiologically-estimated treatment need and treatment provided for those using the DTBS and DTSS. This research confirms the utility of survey and administrative data for generating knowledge for policymakers.
Public administrative databases are not designed for research purposes, but they have the potential to provide a wealth of knowledge on treatments provided and utilisation patterns.
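As a hedged illustration of the need-versus-provided comparison mentioned above, a Welch t statistic can be computed by hand; the per-person counts below are invented for the demo and are not the survey data:

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples with possibly
    unequal variances; |t| beyond ~2 indicates a significant difference
    at the 5% level for reasonably large samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical per-person counts: epidemiologically estimated
# restorations needed vs. restorations actually provided.
need = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
given = [1, 2, 1, 0, 2, 1, 1, 2, 0, 1]
t = welch_t(need, given)
```

A full analysis would also report degrees of freedom and a p-value; the statistic alone is shown here to keep the sketch stdlib-only.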

Abstract:

In this article, we offer a new way of exploring relationships between three different dimensions of a business operation, namely the stage of business development, the methods of creativity, and the major cultural values. Although each of these has separately gained enormous attention from the management research community, evidenced by a large volume of research studies, there have not been many studies that attempt to describe the logic connecting these three important aspects of a business, let alone empirical evidence supporting any significant relationships among these variables. The paper also provides a data set and an empirical investigation of it, using categorical data analysis, and concludes that examination of these possible relationships is meaningful and possible even for seemingly unquantifiable information. The results also show that the most significant category among all creativity methods employed in Vietnamese enterprises is the "creative disciplines" rule in the "entrepreneurial phase," while in general creative disciplines play a critical role in explaining the structure of our data sample for both stages of development under consideration.
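A minimal sketch of the kind of categorical data analysis referred to above is a Pearson chi-squared statistic on a stage-by-method contingency table; the counts below are hypothetical and are not the paper's data:

```python
def chi_square_stat(table):
    """Pearson chi-squared statistic for a contingency table given as a
    list of rows; large values indicate the row and column variables
    (e.g. development stage and creativity method) are associated."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand  # expected under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2

# Rows: development stage (entrepreneurial, growth);
# columns: three creativity-method categories (all counts invented).
counts = [[30, 10, 5],
          [12, 18, 25]]
stat = chi_square_stat(counts)
```

For a 2x3 table the statistic has 2 degrees of freedom, so values above 5.99 are significant at the 5% level.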

Abstract:

The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we provide a review of the activities and outcomes from the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to efficiently implement semantic technologies in the life sciences, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups. The observations made, conclusions drawn, and software development projects that emerged from these activities are discussed.
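As a toy illustration of the RDF data model these sub-groups worked with, triples can be represented and pattern-matched even in plain Python. Real BioHackathon work used proper RDF stores and SPARQL; the compact-URI identifiers below are merely illustrative:

```python
def match(triples, s=None, p=None, o=None):
    """Minimal RDF-style triple pattern matching over a set of
    (subject, predicate, object) tuples; None acts as a wildcard,
    mimicking a single SPARQL basic graph pattern."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}

# Illustrative life-science facts in compact-URI form.
triples = {
    ("uniprot:P69905", "rdf:type", "up:Protein"),
    ("uniprot:P69905", "up:organism", "taxon:9606"),
    ("taxon:9606", "rdfs:label", "Homo sapiens"),
}
proteins = match(triples, p="rdf:type", o="up:Protein")
```

The point of the RDF modelling work described above is that once independent databases expose their records in this shared triple shape, queries can span all of them at once.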

Abstract:

We investigate the relationship between exposure to conflict and poverty dynamics over time, using original three-wave panel data for Burundi which tracked individuals and reported local-level violence exposure from 1998 to 2012. First, the data reveal that headcount poverty has not changed since 1998, while we observe multiple transitions into and out of poverty. Moreover, households exposed to the war exhibit a lower level of welfare than non-exposed households, with the difference between the two groups predicted to remain significant at least until 2017, i.e. twelve years after the end of the conflict. The correlation between violence exposure and deprivation over time is confirmed in a household-level panel setting. Second, our empirical investigation shows how violence exposure over different time spans interacts with households' subsequent welfare. Our analysis of the determinants of households' likelihood of switching poverty status (i.e. falling into or escaping poverty), combined with quantile regressions, suggests that (i) exposure during the first phase of the conflict affected the entire distribution, and (ii) exposure during the second phase mostly affected the upper tail of the distribution: initially non-poor households have a higher propensity to fall into poverty, while initially poor households see their propensity to pull through decrease only slightly with recent exposure to violence. Although not directly testable with the data at hand, these results are consistent with the changing nature of violence over the course of the Burundi civil war, from relatively more labour-destructive to relatively more capital-destructive.
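The poverty-transition bookkeeping underlying such an analysis reduces to simple conditional rates between waves; here is a sketch with invented statuses (not the Burundi data):

```python
def transition_rates(waves):
    """Poverty entry and exit rates between two waves, given two
    equal-length lists of household statuses (True = poor).
    Returns (P(fall into poverty | non-poor), P(escape | poor))."""
    t0, t1 = waves
    n_poor = sum(t0)
    n_nonpoor = len(t0) - n_poor
    fell = sum(1 for a, b in zip(t0, t1) if not a and b)
    escaped = sum(1 for a, b in zip(t0, t1) if a and not b)
    return fell / n_nonpoor, escaped / n_poor

# Eight hypothetical households tracked across two waves.
wave1 = [True, True, True, False, False, False, False, False]
wave2 = [True, False, True, True, False, False, True, False]
fall_rate, escape_rate = transition_rates([wave1, wave2])
```

In the study these conditional rates are then modelled as functions of violence exposure, which is where the asymmetry between falling in and escaping shows up.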

Abstract:

The electric current and the associated magnetic field in aluminium electrolysis cells create effects that limit cell productivity and can cause instabilities: surface waving, 'anode effects', erosion of the pot lining, feed material sedimentation, etc. An instructive analysis is presented via the step-by-step inclusion of the different physical coupling factors affecting the magnetic field, electric current, velocity and wave development in the electrolysis cells. The fully time-dependent model couples the nonlinear turbulent fluid dynamics and the extended electromagnetic field in the cell with the whole bus bar circuit, including ferromagnetic effects. Animated examples for high-amperage cells are presented. The theory and numerical model of the electrolysis cell are extended to the cases of a variable aluminium layer at the cell bottom and a variable electrolyte thickness arising from the non-uniform anode burn-out process and the presence of anode channels. The importance of the channels is well known for the stationary interface and velocity field (the Moreau-Evans model) and was validated against measurements in commercial cells, particularly with the recently published 'benchmark' test for MHD models of aluminium cells [1]. The presence of electrolyte channels also requires a reconsideration of previous magnetohydrodynamic instability theories and dynamic wave development models. The results indicate the importance of a 'sloshing', parametrically excited MHD wave development in aluminium production cells.

Abstract:

Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics, and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil fuel combustion and cement production (E_FF) are based on energy statistics and cement production data, respectively, while emissions from land-use change (E_LUC), mainly deforestation, are based on combined evidence from land-cover-change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (G_ATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (S_OCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in S_OCEAN is evaluated with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (S_LAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2, and land-cover change (some including nitrogen-carbon interactions). We compare the mean land and ocean fluxes and their variability to estimates from three atmospheric inverse methods for three broad latitude bands.
All uncertainties are reported as ±1 sigma, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. For the last decade available (2004-2013), E_FF was 8.9 ± 0.4 GtC yr^-1, E_LUC 0.9 ± 0.5 GtC yr^-1, G_ATM 4.3 ± 0.1 GtC yr^-1, S_OCEAN 2.6 ± 0.5 GtC yr^-1, and S_LAND 2.9 ± 0.8 GtC yr^-1. For the year 2013 alone, E_FF grew to 9.9 ± 0.5 GtC yr^-1, 2.3% above 2012, continuing the growth trend in these emissions; E_LUC was 0.9 ± 0.5 GtC yr^-1, G_ATM was 5.4 ± 0.2 GtC yr^-1, S_OCEAN was 2.9 ± 0.5 GtC yr^-1, and S_LAND was 2.5 ± 0.9 GtC yr^-1. G_ATM was high in 2013, reflecting a steady increase in E_FF and smaller and opposite changes between S_OCEAN and S_LAND compared to the past decade (2004-2013). The global atmospheric CO2 concentration reached 395.31 ± 0.10 ppm averaged over 2013. We estimate that E_FF will increase by 2.5% (1.3-3.5%) to 10.1 ± 0.6 GtC in 2014 (37.0 ± 2.2 GtCO2 yr^-1), 65% above emissions in 1990, based on projections of world gross domestic product and recent changes in the carbon intensity of the global economy. From this projection of E_FF and assumed constant E_LUC for 2014, cumulative emissions of CO2 will reach about 545 ± 55 GtC (2000 ± 200 GtCO2) for 1870-2014, about 75% from E_FF and 25% from E_LUC. This paper documents changes in the methods and data sets used in this new carbon budget compared with previous publications of this living data set (Le Quéré et al., 2013, 2014). All observations presented here can be downloaded from the Carbon Dioxide Information Analysis Center (doi:10.3334/CDIAC/GCP_2014).
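The residual land-sink estimate quoted above follows directly from the budget identity; a quick check against the decadal numbers in the abstract (adding uncertainties in quadrature, which assumes independent components, a simplification for this sketch):

```python
import math

# Decadal (2004-2013) means from the abstract, in GtC per year.
E_FF, E_LUC, G_ATM, S_OCEAN = 8.9, 0.9, 4.3, 2.6

# The land sink is defined as the budget residual:
# emissions minus atmospheric growth minus the ocean sink.
S_LAND = E_FF + E_LUC - G_ATM - S_OCEAN

# 1-sigma of the residual under an independence assumption,
# combining the component uncertainties in quadrature.
sigma_land = math.sqrt(0.4**2 + 0.5**2 + 0.1**2 + 0.5**2)
```

The residual comes out at 2.9 GtC yr^-1 with sigma close to 0.8, matching the values quoted for the decade.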

Abstract:

Ocean biogeochemistry (OBGC) models span a wide variety of complexities, including highly simplified nutrient-restoring schemes, nutrient–phytoplankton–zooplankton–detritus (NPZD) models that crudely represent the marine biota, models that represent a broader trophic structure by grouping organisms as plankton functional types (PFTs) based on their biogeochemical role (dynamic green ocean models) and ecosystem models that group organisms by ecological function and trait. OBGC models are now integral components of Earth system models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here we present an intercomparison of six OBGC models that were candidates for implementation within the next UK Earth system model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the ocean general circulation model Nucleus for European Modelling of the Ocean (NEMO) and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform all other models across all metrics. Nonetheless, the simpler models are broadly closer to observations across a number of fields and thus offer a high-efficiency option for ESMs that prioritise high-resolution climate dynamics. 
However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low-resolution climate dynamics and high-complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry–climate interactions.
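At the simple end of the complexity range discussed above sits the NPZD structure; a minimal sketch with illustrative parameter values (not those of any UKESM1 candidate) shows the key design property that the source terms conserve total nutrient:

```python
def npzd_derivs(N, P, Z, D, vmax=0.5, kN=0.5, g=0.4, beta=0.6,
                mP=0.05, mZ=0.05, r=0.1):
    """Right-hand side of a minimal NPZD model. All parameter values are
    illustrative. The four tendencies sum to zero by construction, so
    total nitrogen N + P + Z + D is conserved."""
    uptake = vmax * N / (N + kN) * P      # nutrient-limited phytoplankton growth
    graze = g * P * Z                     # zooplankton grazing on phytoplankton
    dP = uptake - graze - mP * P
    dZ = beta * graze - mZ * Z            # fraction beta of grazing is assimilated
    dD = (1 - beta) * graze + mP * P + mZ * Z - r * D
    dN = -uptake + r * D                  # remineralisation of detritus closes the loop
    return dN, dP, dZ, dD

# Forward-Euler integration from an arbitrary initial state; because the
# tendencies sum to zero, the total is conserved at every step.
state = [4.0, 0.5, 0.3, 0.2]   # N, P, Z, D
for _ in range(200):
    d = npzd_derivs(*state)
    state = [s + 0.05 * ds for s, ds in zip(state, d)]
```

The more complex PFT-based models in the intercomparison add many more such compartments and tracers (up to 57), which is exactly where their computational cost comes from.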

Abstract:

In this paper, we present collision strengths and Maxwellian-averaged effective collision strengths for the electron-impact excitation of Fe II. We consider specifically the optically allowed lines for transitions from the 3d^6 4s and 3d^7 even-parity configuration states to the 3d^6 4p odd-parity configuration levels. The parallel suite of Breit-Pauli codes is utilized to compute the collision cross-sections, with relativistic effects included explicitly in both the target and the scattering approximation. A total of 100 LS or 262 jj levels formed from the basis configurations 3d^6 4s, 3d^7 and 3d^6 4p were included in the wave-function representation of the target, including all doublet, quartet and sextet terms. The Maxwellian-averaged effective collision strengths are computed across a wide range of electron temperatures from 100 to 100,000 K, temperatures of importance in astrophysical and plasma applications. A detailed comparison is made with previous works, and significant differences were found for some of the transitions considered. We conclude that, in order to obtain converged collision strengths and effective collision strengths for these allowed transitions, it is necessary to include contributions from partial waves up to L = 50 explicitly in the calculation and, in addition, to account for contributions from even higher partial waves through a "top up" procedure.
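For reference, the Maxwellian-averaged effective collision strength is obtained from the collision strength Omega in the standard way:

```latex
\Upsilon_{ij}(T) = \int_{0}^{\infty} \Omega_{ij}(E_j)\, e^{-E_j/kT}\, d\!\left(\frac{E_j}{kT}\right)
```

where E_j is the energy of the scattered electron, k is the Boltzmann constant and T is the electron temperature; the slow energy dependence of Omega for allowed transitions is what makes the high partial-wave "top up" contributions significant.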

Abstract:

Web sites that rely on databases for their content are now ubiquitous. Query result pages are dynamically generated from these databases in response to user-submitted queries. Automatically extracting structured data from query result pages is a challenging problem, as the structure of the data is not explicitly represented. While humans have shown good intuition in visually understanding data records on a query result page as displayed by a web browser, no existing approach to data record extraction has made full use of this intuition. We propose a novel approach, in which we make use of the common sources of evidence that humans use to understand data records on a displayed query result page. These include structural regularity, and visual and content similarity between data records displayed on a query result page. Based on these observations we propose new techniques that can identify each data record individually, while ignoring noise items, such as navigation bars and adverts. We have implemented these techniques in a software prototype, rExtractor, and tested it using two datasets. Our experimental results show that our approach achieves significantly higher accuracy than previous approaches. Furthermore, it establishes the case for use of vision-based algorithms in the context of data extraction from web sites.
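The structural-regularity cue can be sketched as grouping sibling subtrees by shape. This is not the published rExtractor algorithm, just a minimal stdlib illustration of the core idea; text content, attributes and visual features are deliberately ignored:

```python
from collections import Counter
from html.parser import HTMLParser

class TreeBuilder(HTMLParser):
    """Builds a bare-bones element tree; text and attributes are dropped,
    since this sketch only examines structural regularity."""
    def __init__(self):
        super().__init__()
        self.root = {"tag": "root", "children": []}
        self._stack = [self.root]
    def handle_starttag(self, tag, attrs):
        node = {"tag": tag, "children": []}
        self._stack[-1]["children"].append(node)
        self._stack.append(node)
    def handle_endtag(self, tag):
        if len(self._stack) > 1:
            self._stack.pop()

def signature(node):
    # A subtree's shape: its tag plus the shapes of its children.
    return (node["tag"],) + tuple(signature(c) for c in node["children"])

def extract_records(html):
    """Return the largest group of sibling subtrees sharing one shape,
    so noise items with different structure (navigation, adverts) are
    left out of the record group."""
    builder = TreeBuilder()
    builder.feed(html)
    def walk(n):
        yield n
        for c in n["children"]:
            yield from walk(c)
    best = []
    for node in walk(builder.root):
        groups = Counter(signature(c) for c in node["children"])
        if groups:
            sig, count = groups.most_common(1)[0]
            if count >= 2 and count > len(best):
                best = [c for c in node["children"] if signature(c) == sig]
    return best

page = ("<div><ul><li>nav</li><li>bar</li></ul>"
        "<div><h3>t1</h3><p>a</p></div>"
        "<div><h3>t2</h3><p>b</p></div>"
        "<div><h3>t3</h3><p>c</p></div></div>")
records = extract_records(page)
```

A vision-based approach like the one described above additionally uses rendered positions and visual similarity, which lets it cope with records whose tag structure varies; this sketch captures only the structural-regularity signal.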